Protecting information sent through insecure internet channels is a significant challenge facing researchers. In this paper, we present a novel method for image data encryption that combines chaotic maps with linear feedback shift registers in two stages. In the first stage, the image is divided into two parts, and the pixel locations of each part are redistributed using a random-number key generated by linear feedback shift registers. In the second stage, the image is separated into the three primary color channels, red, green, and blue (RGB); the data of each channel is then encrypted with one of three keys generated by three-dimensional chaotic maps. Several statistical tests (entropy, peak signal-to-noise ratio (PSNR), mean square error (MSE), and correlation) were conducted on a set of images to determine the strength and efficiency of the proposed method, and the results show that it provides a good level of security. The obtained results were compared with those of other methods, and the comparison confirms the superiority of the proposed method.
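The two stages described above can be illustrated with a minimal sketch. This is not the paper's implementation: the LFSR taps, seeds, and map parameters are illustrative assumptions, and a 1-D logistic map stands in for the paper's three-dimensional chaotic maps; a single channel of pixel values stands in for a full RGB image.

```python
def lfsr_stream(seed, taps, nbits, count):
    """Pseudo-random numbers from a Fibonacci LFSR (illustrative taps/seed)."""
    state = seed
    out = []
    for _ in range(count):
        fb = 0
        for t in taps:              # XOR the tapped bits to form the feedback bit
            fb ^= (state >> t) & 1
        state = ((state << 1) | fb) & ((1 << nbits) - 1)
        out.append(state)
    return out

def permute(pixels, key_stream):
    """Stage 1: redistribute pixel positions with a key-driven Fisher-Yates pass."""
    px = list(pixels)
    for i in range(len(px) - 1, 0, -1):
        j = key_stream[i] % (i + 1)
        px[i], px[j] = px[j], px[i]
    return px

def logistic_keystream(x0, r, count):
    """Keystream bytes from the logistic map x -> r*x*(1-x)
    (a 1-D stand-in for the paper's 3-D chaotic maps)."""
    x, ks = x0, []
    for _ in range(count):
        x = r * x * (1.0 - x)
        ks.append(int(x * 256) & 0xFF)
    return ks

# Stage 1: permute positions; Stage 2: XOR the channel with a chaotic keystream.
pixels = [10, 20, 30, 40, 50, 60, 70, 80]
stream = lfsr_stream(seed=0b1011, taps=[3, 2], nbits=4, count=len(pixels))
shuffled = permute(pixels, stream)
cipher = [p ^ k for p, k in zip(shuffled, logistic_keystream(0.41, 3.99, len(shuffled)))]
```

Because both stages are invertible (the permutation can be undone and XOR is its own inverse), a receiver holding the same seeds recovers the original pixels exactly.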
Chaotic features of the nuclear energy spectrum of the 68Ge nucleus are investigated with the nuclear shell model. The energies are calculated by performing shell-model calculations with the OXBASH computer code using the F5PVH effective interaction. The 68Ge nucleus is assumed to have an inert 56Ni core, with 12 valence nucleons (4 protons and 8 neutrons) moving in the f5p model space (the 0f5/2, 1p3/2, and 1p1/2 orbitals). The nuclear level density of the considered classes of states is found to have a Gaussian form, in accord with the predictions of other theoretical studies. The statistical fluctuations of the energy spectrum (the nearest-neighbor level-spacing distribution P(s) and the Dyson-Mehta Δ3 statistic) are well described by the Gaussian orthogonal ensemble (GOE).
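The level-spacing analysis named above can be sketched as follows. The spectrum below is an invented toy example, not the shell-model energies of the paper, and a real analysis would first unfold the spectrum against the smooth level density; here the spacings are simply rescaled to unit mean before comparison with the GOE Wigner surmise.

```python
import math

def spacing_distribution(levels):
    """Nearest-neighbor spacings of a sorted spectrum, normalized to unit mean."""
    e = sorted(levels)
    s = [b - a for a, b in zip(e, e[1:])]
    mean = sum(s) / len(s)
    return [x / mean for x in s]

def wigner(s):
    """GOE Wigner surmise P(s) = (pi/2) * s * exp(-pi * s**2 / 4)."""
    return (math.pi / 2.0) * s * math.exp(-math.pi * s * s / 4.0)

# Toy spectrum for illustration only (arbitrary units).
levels = [0.0, 0.9, 2.1, 2.9, 4.2, 5.0, 6.1, 7.0]
spacings = spacing_distribution(levels)
```

The Wigner surmise vanishes at s = 0 (level repulsion, the signature of chaotic dynamics), in contrast to the Poisson distribution of a regular spectrum, which peaks at s = 0.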
In this article, we present a definition of the k-generalized map that is independent of the non-expansive map, and we give new iterative algorithms for infinite families of non-expansive and k-generalized maps. These algorithms are also studied in Hilbert spaces with respect to the possible existence of an asymptotic common fixed point.
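The abstract does not specify its algorithms, but the classical Krasnoselskii-Mann scheme for non-expansive maps illustrates the kind of iteration involved. This is a generic textbook example under assumed parameters, not the paper's method.

```python
import math

def mann_iteration(T, x0, lam=0.5, steps=200):
    """Krasnoselskii-Mann iteration x_{n+1} = (1 - lam)*x_n + lam*T(x_n),
    which approximates a fixed point of a non-expansive map T."""
    x = x0
    for _ in range(steps):
        x = (1.0 - lam) * x + lam * T(x)
    return x

# cos is non-expansive on the reals (|cos'| = |sin| <= 1); its unique
# fixed point is the Dottie number, approximately 0.739085.
fp = mann_iteration(math.cos, x0=1.0)
```

The averaging parameter lam in (0, 1) is what guarantees convergence for merely non-expansive (rather than contractive) maps.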
Clinical keratoconus (KCN) detection is a challenging and time-consuming task. In the diagnosis process, ophthalmologists must review demographic data and clinical ophthalmic examinations; the latter include slit-lamp examination, corneal topographic maps, and Pentacam indices (PI). We propose an Ensemble of Deep Transfer Learning (EDTL) based on corneal topographic maps. We consider four pretrained networks, SqueezeNet (SqN), AlexNet (AN), ShuffleNet (SfN), and MobileNet-v2 (MN), and fine-tune them on a dataset of KCN and normal cases, each including four topographic maps. We also consider a PI classifier. Then, our EDTL method combines the output probabilities of each of the five classifiers to obtain a decision …
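The combination step can be sketched with simple soft voting over class probabilities. The abstract does not state the exact combination rule, so equal-weight averaging is an assumption here, and the probability values below are made up for illustration.

```python
def ensemble_decision(prob_lists, weights=None):
    """Average the per-class output probabilities of several classifiers
    (soft voting) and return the winning class index plus the combined vector."""
    n = len(prob_lists)
    weights = weights or [1.0 / n] * n
    n_classes = len(prob_lists[0])
    combined = [sum(w * p[c] for w, p in zip(weights, prob_lists))
                for c in range(n_classes)]
    return max(range(n_classes), key=lambda c: combined[c]), combined

# Five classifiers (SqN, AN, SfN, MN, and the PI classifier in the paper),
# two classes: index 0 = normal, index 1 = KCN. Values are illustrative.
probs = [[0.3, 0.7], [0.4, 0.6], [0.2, 0.8], [0.45, 0.55], [0.35, 0.65]]
label, combined = ensemble_decision(probs)
```

Soft voting tends to outperform majority (hard) voting when the individual classifiers output well-calibrated probabilities, since confident votes carry more weight.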
Building Information Modeling (BIM) is becoming a well-known, established collaborative process in the Architecture, Engineering, and Construction (AEC) industry. Potential benefits and competitive advantages have been reported in many countries and in a variety of cases. However, despite the potential and benefits of BIM technologies, BIM is not yet applied in the construction sector in Iraq, as in many other countries of the world. The purpose of this research is to understand the uses and benefits of BIM for construction projects in Iraq. This has been achieved by establishing a fr…
The theory of probabilistic programming may be conceived in several different ways. As a method of programming, it analyses the implications of probabilistic variations in the parameter space of a linear or nonlinear programming model. The generating mechanism of such probabilistic variations in economic models may be incomplete information about changes in demand, production, and technology; specification errors in the econometric relations presumed for different economic agents; uncertainty of various sorts; and the consequences of imperfect aggregation or disaggregation of economic variables. In this research, we discuss the probabilistic programming problem in which the coefficients b_i are random variables.
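When a right-hand-side coefficient b_i is random, a standard treatment is the chance-constrained reformulation, which the following sketch illustrates. This is a generic textbook device under an assumed normal distribution, not necessarily the formulation used in the paper.

```python
from statistics import NormalDist

def deterministic_rhs(mu, sigma, alpha):
    """Deterministic equivalent of the chance constraint P(a.x <= b) >= alpha
    when b ~ N(mu, sigma^2): the random b is replaced by its
    (1 - alpha)-quantile, so a.x <= F_b^{-1}(1 - alpha) guarantees the
    constraint holds with probability at least alpha."""
    return NormalDist(mu, sigma).inv_cdf(1.0 - alpha)

# Example: b ~ N(100, 10^2), and the constraint must hold with probability 0.95.
b_det = deterministic_rhs(100.0, 10.0, 0.95)
```

The reformulated constraint is tighter than using the mean alone (here b_det is roughly 100 - 1.645 * 10), which is the price paid for the probabilistic guarantee.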
A new algorithm is proposed to compress speech signals using the wavelet transform and linear predictive coding. Signal compression is based on selecting a small number of approximation coefficients produced by wavelet decomposition (Haar and db4) at a suitably chosen level, while ignoring the detail coefficients; the approximation coefficients are then windowed by a rectangular window and fed to the linear predictor. The Levinson-Durbin algorithm is used to compute the LP coefficients, reflection coefficients, and prediction error. The compressed files contain the LP coefficients and the previous sample, and are very small compared to the original signals. The compression ratio is calculated from the size of the …
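The pipeline above (wavelet approximation, then Levinson-Durbin) can be sketched as follows. This is a minimal single-level Haar illustration with a toy signal and an assumed LP order of 2, not the paper's multi-level configuration.

```python
def haar_step(signal):
    """One level of Haar decomposition: pairwise averages (approximation)
    and pairwise half-differences (detail)."""
    approx = [(signal[i] + signal[i + 1]) / 2.0 for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / 2.0 for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def autocorrelation(x, order):
    n = len(x)
    return [sum(x[i] * x[i + k] for i in range(n - k)) for k in range(order + 1)]

def levinson_durbin(r, order):
    """Solve the Yule-Walker equations recursively, returning the LP
    coefficients, reflection coefficients, and prediction-error power."""
    a = [0.0] * (order + 1)
    k = [0.0] * (order + 1)
    err = r[0]
    for m in range(1, order + 1):
        acc = r[m] - sum(a[j] * r[m - j] for j in range(1, m))
        k[m] = acc / err
        a_new = a[:]
        a_new[m] = k[m]
        for j in range(1, m):
            a_new[j] = a[j] - k[m] * a[m - j]
        a = a_new
        err *= (1.0 - k[m] * k[m])
    return a[1:], k[1:], err

# Toy signal: keep the approximation coefficients, discard the details,
# then fit a 2nd-order linear predictor to what remains.
signal = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5]
approx, detail = haar_step(signal)
lp, refl, pred_err = levinson_durbin(autocorrelation(approx, 2), 2)
```

Keeping only the approximation coefficients halves the sample count per level, and the LP stage then replaces those samples with a handful of coefficients, which is where the large compression ratio comes from.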