One of the most difficult issues in the history of communication technology is the secure transmission of images. On the internet, photos are used and shared by millions of individuals for both private and business purposes. Using encryption methods to transform the original image into an unintelligible, scrambled version is one way to achieve safe image transfer over the network. Cryptographic approaches based on the chaotic logistic map provide several new and promising options for developing secure image encryption methods. The main aim of this paper is to build a secure system for encrypting gray and color images. The proposed system consists of two stages: the first stage is the encryption process, in which the keys are generated from the chaotic logistic map together with the image density to encrypt the gray and color images, and the second stage is the decryption process, the inverse of encryption, which recovers the original image. The proposed method has been tested on publicly available standard gray and color images. The test results give a peak signal-to-noise ratio (PSNR) of 7.7268, a unified average changing intensity (UACI) of 50.2011, and a number of pixels change rate (NPCR) of 100, while the encryption and decryption times are up to 0.6319 and 0.5305 seconds, respectively.
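As a rough illustration of the idea, the sketch below encrypts a grayscale image by XOR-ing its pixels with a keystream drawn from the chaotic logistic map. It is a minimal sketch, not the paper's exact scheme: the map parameter r, the seed x0, and the byte quantization are illustrative assumptions, and the paper's image-density-dependent key generation is not reproduced here.

```python
import numpy as np

def logistic_keystream(length, r=3.99, x0=0.61):
    """Generate a byte keystream from the chaotic logistic map x -> r*x*(1-x)."""
    x = x0
    stream = np.empty(length, dtype=np.uint8)
    for i in range(length):
        x = r * x * (1.0 - x)           # chaotic iteration (r close to 4)
        stream[i] = int(x * 256) % 256  # quantize the map state to one byte
    return stream

def xor_encrypt(image, r=3.99, x0=0.61):
    """Encrypt (or decrypt) a uint8 grayscale image by XOR with the keystream."""
    flat = image.ravel()
    ks = logistic_keystream(flat.size, r, x0)
    return (flat ^ ks).reshape(image.shape)

# XOR is its own inverse, so decryption reuses the same key (r, x0).
img = np.random.randint(0, 256, (8, 8), dtype=np.uint8)
cipher = xor_encrypt(img)
assert np.array_equal(xor_encrypt(cipher), img)
```

Because the logistic map is extremely sensitive to r and x0, even a tiny change in either key parameter yields a completely different keystream, which is what the NPCR and UACI metrics quoted above are designed to measure.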
The settlement behavior of bored piles under vertical loading is the main factor that affects the design requirements of single piles or pile groups in soft soils. Estimating bored pile settlement is a complicated problem because it depends on many factors, including ground conditions, validation of the bored pile design method through testing, and validation of theoretical or numerical predictions of the settlement value. In this study, prototype models of a single bored pile and bored pile groups in 1×1, 1×2, and 2×2 arrangements, with a total length-to-diameter ratio (L/D) of 13.33 and a clear spacing of three diameters, were subjected to vertical axial loads. The bored pile model used for the test was 2000…
This study explores the challenges that Artificial Intelligence (AI) systems face in generating image captions, a task that requires effective integration of computer vision and natural language processing techniques. A comparative analysis is presented between traditional approaches (such as retrieval-based methods and linguistic templates) and modern deep learning approaches (such as encoder-decoder models, attention mechanisms, and transformers). Theoretical results show that modern models perform better in accuracy and in the ability to generate more complex descriptions, while traditional methods are superior in speed and simplicity. The paper proposes a hybrid framework that combines the advantages of both approaches, where conventional methods produce…
In this work, two metaheuristic algorithms were hybridized. The first is the Invasive Weed Optimization algorithm (IWO), a numerical stochastic optimization algorithm, and the second is the Whale Optimization Algorithm (WOA), an algorithm based on swarm and community intelligence. IWO is a nature-inspired algorithm, modeled specifically on the colonizing behavior of weeds and first proposed in 2006 by Mehrabian and Lucas. Due to their strength and adaptability, weeds pose a serious threat to cultivated plants and hence to the cultivation process. The behavior of these weeds has been simulated and used in the Invasive Weed Optimization…
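For concreteness, the following is a minimal sketch of the core IWO loop under common textbook assumptions: each weed spawns a number of seeds that scales linearly with its relative fitness between s_min and s_max, seeds disperse with normally distributed steps whose standard deviation decays over iterations, and competitive exclusion keeps only the best p_max weeds. All parameter names and values are illustrative, and the WOA hybridization step is not shown.

```python
import numpy as np

def iwo_minimize(f, dim, bounds, n_init=10, p_max=30, iters=100,
                 s_min=1, s_max=5, sigma_init=1.0, sigma_final=0.01, m=3):
    """Minimize f over a box using the core Invasive Weed Optimization loop."""
    lo, hi = bounds
    pop = np.random.uniform(lo, hi, (n_init, dim))
    for t in range(iters):
        fit = np.array([f(x) for x in pop])
        # Nonlinear decrease of the seed-dispersal standard deviation.
        sigma = sigma_final + (sigma_init - sigma_final) * ((iters - t) / iters) ** m
        best, worst = fit.min(), fit.max()
        seeds = []
        for x, fx in zip(pop, fit):
            # Fitter weeds spawn more seeds (linear map between s_min and s_max).
            ratio = (worst - fx) / (worst - best + 1e-12)
            n_seeds = int(s_min + (s_max - s_min) * ratio)
            for _ in range(n_seeds):
                seeds.append(np.clip(x + sigma * np.random.randn(dim), lo, hi))
        if seeds:
            pop = np.vstack([pop] + seeds)
        # Competitive exclusion: keep only the p_max best weeds.
        fit = np.array([f(x) for x in pop])
        pop = pop[np.argsort(fit)[:p_max]]
    return pop[0], f(pop[0])

x_best, f_best = iwo_minimize(lambda x: np.sum(x**2), dim=5, bounds=(-5, 5))
```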
The rapid and enormous growth of the Internet of Things, as well as its widespread adoption, has resulted in the production of massive quantities of data that must be processed and sent to the cloud. The delay in processing these data and the time it takes to send them to the cloud led to the emergence of fog computing, a new generation of cloud in which the fog serves as an extension of cloud services at the edge of the network, reducing latency and traffic. Distributing computational resources so as to minimize makespan and running costs is one of the main challenges of fog computing. This paper provides a new approach for improving the task scheduling problem in a Cloud-Fog environment…
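The paper's own scheduling approach is truncated above, so the sketch below shows only a baseline for the problem it targets: each task is assigned to the node (fog or cloud) that would complete it earliest, a classic greedy heuristic for reducing makespan on nodes of different speeds. The task lengths and node speeds are made-up illustrative inputs.

```python
def greedy_schedule(task_lengths, node_speeds):
    """Assign each task to the node that would finish it earliest.

    task_lengths: work units per task; node_speeds: units/second per node.
    Returns per-node task lists and the resulting makespan.
    """
    free_at = [0.0] * len(node_speeds)
    assign = [[] for _ in node_speeds]
    # Longest-processing-time order tightens the greedy makespan bound.
    for t, length in sorted(enumerate(task_lengths), key=lambda p: -p[1]):
        finish = [free_at[n] + length / s for n, s in enumerate(node_speeds)]
        node = min(range(len(node_speeds)), key=finish.__getitem__)
        free_at[node] = finish[node]
        assign[node].append(t)
    return assign, max(free_at)

# Three nodes: a fast cloud node (4.0) and two slower fog nodes.
assign, makespan = greedy_schedule([40, 10, 25, 5, 30], [2.0, 1.0, 4.0])
```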
String matching is seen as one of the essential problems in computer science, and a variety of computer applications provide string matching services to their end users. The remarkable growth in the volume of data created and stored by modern computational devices drives researchers to devise even more powerful methods for coping with this problem. In this research, the Quick Search string matching algorithm is adapted to run in a multi-core environment using OpenMP directives, which can be employed to reduce the overall execution time of the program. English text, protein, and DNA data types are used to examine the effect of parallelizing the Quick Search string matching algorithm on multi-core…
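A minimal sequential version of Quick Search (Sunday's algorithm) is sketched below: after each attempt, the shift is driven by the character immediately after the current window, allowing jumps of up to m+1 positions. The paper parallelizes this with OpenMP in a compiled language; a Python rendering is used here only to show the algorithm's logic, and the example strings are illustrative.

```python
def quick_search(text, pattern):
    """Sunday's Quick Search: return all start indices of pattern in text."""
    n, m = len(text), len(pattern)
    # Shift table: jump distance keyed by the character just AFTER the window;
    # the rightmost occurrence of each pattern character wins.
    shift = {c: m - i for i, c in enumerate(pattern)}
    hits, pos = [], 0
    while pos <= n - m:
        if text[pos:pos + m] == pattern:
            hits.append(pos)
        if pos + m >= n:
            break
        # Jump by m+1 if the next character never occurs in the pattern.
        pos += shift.get(text[pos + m], m + 1)
    return hits

print(quick_search("GCATCGCAGAGAGTATACAGTACG", "GCAGAGAG"))  # [5]
```

A common OpenMP-style parallelization splits the text into chunks that overlap by m−1 characters, so that each thread searches its chunk independently without losing matches that span a chunk boundary.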
In the last few years, the Internet of Things (IoT) has been gaining remarkable attention in both the academic and industrial worlds. The main goal of the IoT is to connect everyday objects with different capabilities to the Internet so that they can share resources and carry out their assigned tasks. Most IoT objects are heterogeneous in terms of available energy, processing ability, memory storage, etc. However, one of the most important challenges facing IoT networks is energy-efficient task allocation. An efficient task allocation protocol in an IoT network should ensure a fair and efficient distribution of resources so that all objects can collaborate dynamically despite limited energy. The canonical…
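The abstract states the goal (fair, energy-efficient allocation) without the protocol details, so the sketch below shows one simple heuristic consistent with that goal: each task goes to the capable node with the most residual energy, spreading consumption across the network. The data structures and the greedy rule are assumptions for illustration, not the paper's protocol.

```python
def energy_aware_allocate(tasks, nodes):
    """Assign each task to the capable node with the most residual energy.

    tasks: list of (task_id, energy_cost); nodes: dict node_id -> residual energy
    (mutated in place as tasks are assigned). A max-residual-energy heuristic
    that spreads load to keep the network alive longer, not an optimal allocation.
    """
    plan = {}
    for task_id, cost in sorted(tasks, key=lambda t: -t[1]):  # big tasks first
        # Candidates are nodes that still have enough energy for this task.
        candidates = [n for n, e in nodes.items() if e >= cost]
        if not candidates:
            plan[task_id] = None               # no node can afford the task
            continue
        node = max(candidates, key=nodes.get)  # most residual energy
        nodes[node] -= cost
        plan[task_id] = node
    return plan

plan = energy_aware_allocate([("t1", 5), ("t2", 3), ("t3", 8)],
                             {"a": 10.0, "b": 12.0, "c": 6.0})
```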
In this research, the focus was on estimating the parameters of the min-Gumbel distribution using the maximum likelihood method and the Bayes method. A genetic algorithm was employed to estimate the parameters for both the maximum likelihood method and the Bayes method. The comparison was made using the mean squared error (MSE), where the best estimator is the one with the smallest mean squared error. It was noted that the best estimator was (BLG_GE).
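As a small illustration of the likelihood side of this comparison, the sketch below fits the min-Gumbel (left-skewed Gumbel) distribution by maximum likelihood and scores the estimates with MSE against known true parameters. SciPy's numerical optimizer stands in for the paper's genetic algorithm, and the sample size and true parameter values are made up.

```python
import numpy as np
from scipy import stats

# gumbel_l is SciPy's left-skewed Gumbel, i.e. the minimum-Gumbel distribution.
rng = np.random.default_rng(0)
true_loc, true_scale = 2.0, 0.5
sample = stats.gumbel_l.rvs(loc=true_loc, scale=true_scale,
                            size=1000, random_state=rng)

# Maximum likelihood estimates of (location, scale); SciPy maximizes the
# log-likelihood numerically, where the paper uses a genetic algorithm instead.
loc_hat, scale_hat = stats.gumbel_l.fit(sample)

# Mean squared error of the estimates against the true parameters,
# the same criterion the paper uses to compare estimators.
mse = np.mean([(loc_hat - true_loc) ** 2, (scale_hat - true_scale) ** 2])
print(loc_hat, scale_hat, mse)
```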
This paper discusses the solution of the equivalent circuit of a solar cell, where a single-diode model is presented. For the nonlinear equation of this model, an iterative algorithm is suggested and analyzed that works well given a suitable initial value for the iteration. The convergence of the proposed method is discussed, and it is established that the algorithm has convergence of order six. The proposed algorithm is evaluated for various values of load resistance. All computations on the equivalent-circuit equation of the solar cell are carried out in Matlab at ambient temperature. The obtained results of this new method are given, and the absolute errors are demonstrated.
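The single-diode model leads to an implicit equation for the cell current, commonly written as I = Iph − I0(exp((V + I·Rs)/(n·Vt)) − 1) − (V + I·Rs)/Rsh. The paper's sixth-order iteration is not detailed in the abstract, so the sketch below solves the same equation with plain Newton iteration; all circuit parameter values are illustrative assumptions.

```python
import math

def diode_current(V, Iph=5.0, I0=1e-9, Rs=0.02, Rsh=50.0, n=1.3, Vt=0.02585,
                  tol=1e-12, max_iter=50):
    """Solve the implicit single-diode equation
        f(I) = Iph - I0*(exp((V + I*Rs)/(n*Vt)) - 1) - (V + I*Rs)/Rsh - I = 0
    for the cell current I at terminal voltage V, using Newton's method."""
    I = Iph  # the photocurrent is a reasonable starting guess
    for _ in range(max_iter):
        e = math.exp((V + I * Rs) / (n * Vt))
        f = Iph - I0 * (e - 1.0) - (V + I * Rs) / Rsh - I
        df = -I0 * e * Rs / (n * Vt) - Rs / Rsh - 1.0  # derivative f'(I)
        step = f / df
        I -= step
        if abs(step) < tol:
            break
    return I

print(diode_current(V=0.5))  # current at 0.5 V for the assumed parameters
```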
This paper presents the theoretical and experimental results of drilling a 1 mm thick high-density polyethylene sheet using a millisecond Nd:YAG pulsed laser. The effects of laser parameters, including laser energy, pulse duration, and peak power, were investigated. To describe and understand the mechanism of the drilling process, the COMSOL Multiphysics package (version 4.3b) was used to simulate it. Both the computational and experimental results indicate that the drilling process was carried out successfully and that two phases are involved: vaporization and melting. The proportion of each phase depends on the laser parameters used in the drilling process.
In this paper, three techniques for image compression are implemented. The proposed techniques are a three-dimensional (3-D) two-level discrete wavelet transform (DWT), a 3-D two-level discrete multi-wavelet transform (DMWT), and a 3-D two-level hybrid (wavelet-multiwavelet) transform. Daubechies and Haar wavelets are used in the discrete wavelet transform, and critically sampled preprocessing is used in the discrete multi-wavelet transform. The aim is to increase the compression ratio (CR) as the level of the 3-D transformation increases, so the compression ratio is measured for each level. To obtain good compression, image data properties were measured, such as image entropy (He) and percent root-…
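To make the wavelet side concrete, the sketch below runs a two-level 3-D DWT with PyWavelets, keeps only the largest coefficients, and reports a crude compression ratio from the count of retained coefficients. The wavelet choice (db2 for Daubechies, or haar), the keep fraction, and the random volume are illustrative assumptions; the multi-wavelet and hybrid variants are not shown.

```python
import numpy as np
import pywt

def dwt3d_compress(volume, wavelet="db2", level=2, keep=0.05):
    """Two-level 3-D DWT compression sketch: transform, keep the largest
    coefficients, reconstruct, and report an approximate compression ratio."""
    coeffs = pywt.wavedecn(volume, wavelet=wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    # Keep only the largest `keep` fraction of coefficients by magnitude.
    thresh = np.quantile(np.abs(arr), 1.0 - keep)
    arr_c = np.where(np.abs(arr) >= thresh, arr, 0.0)
    cr = arr.size / max(np.count_nonzero(arr_c), 1)  # crude compression ratio
    recon = pywt.waverecn(
        pywt.array_to_coeffs(arr_c, slices, output_format="wavedecn"),
        wavelet=wavelet)
    # Trim any boundary padding so the output matches the input shape.
    recon = recon[tuple(slice(s) for s in volume.shape)]
    return recon, cr

vol = np.random.rand(16, 16, 16)  # stand-in for 3-D image data
recon, cr = dwt3d_compress(vol)
print("CR ≈", cr)
```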