This paper proposes a new encryption method that combines two cipher algorithms, DES and AES, to generate hybrid keys. This combination strengthens the proposed W-method by generating highly randomized keys. The reliability of any encryption technique rests on two points. The first is key generation: our approach merges 64 bits of DES with 64 bits of AES to produce a 128-bit root key for the remaining 15 keys, and this added complexity raises the level of the ciphering process. Moreover, each derivation shifts the key only one bit to the right. The second is the nature of the encryption process itself: it uses two keys and mixes one round of DES with one round of AES to reduce processing time. The W-method handles Arabic and English texts with the same efficiency. The results showed that the proposed method performs faster and more securely than the standard DES and AES algorithms.
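As a hedged illustration of the key-derivation idea described above (hypothetical function names; the W-method's actual mixing of DES and AES subkey material is not reproduced here), the following sketch concatenates two 64-bit halves into a 128-bit root key and derives the 15 remaining keys by rotating one bit to the right per step:

```python
# Illustrative sketch, not the paper's exact W-method: build a 128-bit root
# key from two 64-bit halves, then derive 15 round keys by rotating the
# previous key one bit to the right, as the abstract describes.

def make_root_key(des_half: int, aes_half: int) -> int:
    """Concatenate a 64-bit DES-derived half and a 64-bit AES-derived half."""
    return (des_half << 64) | aes_half  # 128-bit root key

def rotate_right_128(key: int, n: int = 1) -> int:
    """Rotate a 128-bit integer right by n bits."""
    mask = (1 << 128) - 1
    return ((key >> n) | (key << (128 - n))) & mask

def derive_round_keys(root: int, rounds: int = 15) -> list:
    """Derive `rounds` keys, each a 1-bit right rotation of the previous."""
    keys, k = [], root
    for _ in range(rounds):
        k = rotate_right_128(k)
        keys.append(k)
    return keys
```

Since the rotation is a bijection on 128-bit values, 128 successive rotations return the root key, which makes the derivation easy to sanity-check.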
In this paper, an approach to object tracking inspired by the human oculomotor system is proposed and verified experimentally. The developed approach is divided into two phases: a fast-tracking (saccadic) phase and a smooth-pursuit phase. In the first phase, the field of view is segmented into four regions that are analogous to the retinal periphery in the oculomotor system. When the object of interest enters these regions, the developed vision system responds by changing the values of the pan and tilt angles so that the object falls in the fovea area, after which the second phase is activated. A fuzzy logic method is implemented in the saccadic phase as an intelligent decision maker to select the values of the pan and tilt angle based
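The saccade-versus-pursuit switching can be sketched as follows (a simplified proportional stand-in with hypothetical frame sizes, thresholds, and gains; the paper's fuzzy-logic decision maker and four-region segmentation are not reproduced):

```python
# Minimal sketch of the two-phase idea (hypothetical names/thresholds):
# if the target lies outside a central "fovea" window, make a large saccadic
# pan/tilt jump toward it; otherwise make a small smooth-pursuit correction.

def track_step(x, y, frame_w=640, frame_h=480, fovea=0.2,
               saccade_gain=1.0, pursuit_gain=0.2):
    """Return (pan_delta, tilt_delta, phase) for a target at pixel (x, y)."""
    ex = (x - frame_w / 2) / (frame_w / 2)   # normalized horizontal error
    ey = (y - frame_h / 2) / (frame_h / 2)   # normalized vertical error
    if abs(ex) > fovea or abs(ey) > fovea:   # target in the periphery
        return saccade_gain * ex, saccade_gain * ey, "saccade"
    return pursuit_gain * ex, pursuit_gain * ey, "pursuit"
```

A fuzzy controller would replace the hard `fovea` threshold and fixed gains with membership functions over the error, which is what gives the saccadic phase its graded response.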
The parameters of the two-parameter Gamma distribution in the case of missing data have been estimated using two important methods: the maximum likelihood method and the shrinkage method. The former comprises three methods for solving the non-linear MLE equation to obtain the maximum likelihood estimators: the Newton-Raphson, Thom, and Sinha methods. The Thom and Sinha methods were developed by the researcher to suit the missing-data case. Furthermore, the Bowman, Shenton, and Lam method, which depends on the three-parameter Gamma distribution to obtain the maximum likelihood estimators, has been developed. A comparison has been made between the methods in the experimental aspect to find the best meth
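A minimal complete-data sketch of the Newton-Raphson route to the Gamma MLE (the missing-data adjustments of the Thom and Sinha variants are not reproduced; the digamma and trigamma functions are approximated by differencing `math.lgamma`):

```python
# Newton-Raphson for the two-parameter Gamma MLE on complete data.
# The shape k solves: log(k) - digamma(k) = log(mean(x)) - mean(log(x)),
# and the scale is then mean(x) / k.
import math

def digamma(k, h=1e-5):
    return (math.lgamma(k + h) - math.lgamma(k - h)) / (2 * h)

def trigamma(k, h=1e-4):
    return (math.lgamma(k + h) - 2 * math.lgamma(k) + math.lgamma(k - h)) / h**2

def gamma_mle(xs, tol=1e-10, max_iter=100):
    """Return (shape, scale) maximizing the Gamma likelihood for xs."""
    mean = sum(xs) / len(xs)
    s = math.log(mean) - sum(math.log(x) for x in xs) / len(xs)
    k = (3 - s + math.sqrt((s - 3) ** 2 + 24 * s)) / (12 * s)  # starting value
    for _ in range(max_iter):
        f = math.log(k) - digamma(k) - s          # score residual
        step = f / (1 / k - trigamma(k))          # Newton step
        k -= step
        if abs(step) < tol:
            break
    return k, mean / k  # shape, scale
```

The closed-form starting value is a standard approximation that already lands close to the root, so the iteration typically converges in a handful of steps.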
The collected premiums and the compensations paid are among the main variables that play a prominent role in determining the level of financial solvency of insurance companies: the higher the financial solvency of an insurance company, the more attractive it is to the target audience acquiring the company's insurance services.
Hence the importance of the solvency of insurance companies, as it is one of the critical matters on which the effectiveness of an insurance company and its continuation in the market depend.
In this research, we try to clarify the role of collected premiums and compensations paid in determining the level of operational solvency of t
The goal of this research is to develop a sustainable rating system for roadway projects in Iraq covering all life-cycle stages of a project: planning, design, construction, and operation and maintenance. This paper investigates the criteria, and their weightings, of the suggested roadway rating system based on sustainable planning activities. The methodology starts by suggesting a group of sustainable criteria for the planning stage and then assigning a weight of 1-5 points to each of them. Data were then collected using a closed questionnaire directed to a group of roadway experts in order to verify the criteria weightings based on the relative importance of the roadway-related impacts
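The weighting scheme can be illustrated with a small hypothetical scoring function (criterion names, scores, and the normalization are invented for illustration; the actual criteria and weights come from the expert questionnaire):

```python
# Hypothetical illustration of weighted criteria scoring: each sustainability
# criterion carries a 1-5 point weight, and a project's rating is its
# weighted score normalized by the total possible points.

def project_rating(scores, weights):
    """scores, weights: dicts keyed by criterion name; scores in [0, 1]."""
    total = sum(weights[c] * scores[c] for c in weights)
    return total / sum(weights.values())
```

With this normalization a project scoring full marks on every criterion rates 1.0, and heavily weighted criteria dominate the result, which is the point of the 1-5 scale.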
A sensitive turbidimetric method at (0-180°) was used for the determination of mebeverine in drugs using two solar cells and six sources with C.F.I.A. The method was based on the formation of an ion pair, a pinkish banana-colored precipitate, by the reaction of mebeverine hydrochloride with phosphotungstic acid. Turbidity was measured via the reflection of incident light colliding with the surface particles of the precipitate at 0-180°. All variables were optimized. The linear range of mebeverine hydrochloride was 0.05-12.5 mmol L-1, and the L.D. (S/N = 3) (3SB) was 521.92 ng/sample, depending on dilution for the minimum concentration, with correlation coefficient r = 0.9966, while the R.S.D.%
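The reported linear range and correlation coefficient come from a calibration fit; a generic least-squares sketch (the data in the test are made up, not the paper's measurements) looks like:

```python
# Ordinary least-squares calibration: fit signal (turbidity) against
# concentration and report slope, intercept, and correlation coefficient r,
# as is done for the 0.05-12.5 mmol/L linear range in the abstract.

def linear_fit(conc, signal):
    """Return (slope, intercept, r) for signal regressed on conc."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(signal) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, signal))
    sxx = sum((x - mx) ** 2 for x in conc)
    syy = sum((y - my) ** 2 for y in signal)
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / (sxx * syy) ** 0.5
    return slope, intercept, r
```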
Carbonate reservoirs are an essential source of hydrocarbons worldwide, and their petrophysical properties play a crucial role in hydrocarbon production. The most critical petrophysical properties of carbonate reservoirs are porosity, permeability, and water saturation. A tight reservoir is one with low porosity and permeability, meaning it is difficult for fluids to move through it. The primary goal of this study is to evaluate the reservoir properties and lithology of the Sadi Formation in the Halfaya oil field, one of Iraq's most significant oil fields, 35 km south of Amarah. The Sadi Formation consists of four units: A, B1, B2, and B3. Sadi A was excluded as it was not filled with h
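As a hedged aside on how water saturation is commonly obtained from logs, the standard Archie relation can be sketched (generic constants a, m, n; these are not the study's fitted parameters, and carbonate evaluation often requires modified forms):

```python
# Standard Archie equation: water saturation from formation-water
# resistivity rw, true resistivity rt, and porosity phi (fraction).
# a, m, n are generic defaults, not calibrated to the Sadi Formation.

def archie_sw(rw, rt, phi, a=1.0, m=2.0, n=2.0):
    """Water saturation Sw = ((a * rw) / (phi**m * rt)) ** (1/n)."""
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)
```

Higher true resistivity at a given porosity implies lower water saturation, which is the log signature of a hydrocarbon-bearing interval.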
A remarkable correlation between chaotic systems and cryptography has been established through sensitivity to initial states, unpredictability, and complex behavior. In one line of development, the stages of a chaotic stream cipher are applied to a discrete chaotic dynamic system to generate pseudorandom bits. Some of these generators are based on 1D chaotic maps and others on 2D ones. In the current study, a pseudorandom bit generator (PRBG) based on a new 2D chaotic logistic map is proposed that runs side-by-side and commences from random independent initial states. The structure of the proposed model consists of three components: a mouse input device, the proposed 2D chaotic system, and an initial permutation (IP) table. Statist
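A simplified stand-in for the generator idea (two side-by-side 1D logistic maps rather than the paper's new 2D map, which is not reproduced here) shows how chaotic trajectories started from independent initial states can be compared to emit bits:

```python
# PRBG sketch: iterate two logistic maps x' = 4x(1-x) side by side from
# independent seeds in (0, 1) and emit one bit per step by comparing them.
# This is an illustrative construction, not the paper's 2D chaotic map.

def prbg_bits(x, y, n):
    """Generate n pseudorandom bits from initial states x, y in (0, 1)."""
    bits = []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)   # logistic map, fully chaotic regime
        y = 4.0 * y * (1.0 - y)
        bits.append(1 if x > y else 0)
    return bits
```

The comparison step keeps each output bit roughly balanced because the two orbits are statistically symmetric; a real design would add whitening (such as the IP table mentioned above) before any statistical testing.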
To expedite the learning process, a group of algorithms known as parallel machine learning algorithms can be executed simultaneously on several computers or processors. As data grow in both size and complexity, and as businesses seek efficient ways to mine those data for insights, such algorithms will become increasingly crucial. Data parallelism, model parallelism, and hybrid techniques are just some of the methods described in this article for speeding up machine learning algorithms. We also cover the benefits and threats associated with parallel machine learning, such as data splitting, communication, and scalability. We compare how well various methods perform on a variety of machine learning tasks and datasets, and we talk abo
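The data-parallelism pattern can be sketched as follows (a thread-based toy for portability; CPU-bound training would normally use processes or an accelerator framework, and the loss and partitioning here are invented for illustration):

```python
# Data parallelism in miniature: split the dataset across workers, compute
# partial gradients of a least-squares loss independently, then sum them
# (the "all-reduce" step that follows each parallel pass).
from concurrent.futures import ThreadPoolExecutor

def partial_gradient(chunk, w):
    """Gradient of sum((w*x - y)^2) over this chunk, w.r.t. scalar w."""
    return sum(2.0 * (w * x - y) * x for x, y in chunk)

def data_parallel_gradient(data, w, workers=4):
    """Partition data, map partial gradients across workers, reduce by sum."""
    chunks = [data[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = pool.map(partial_gradient, chunks, [w] * workers)
    return sum(parts)
```

Because the loss is a sum over examples, the gradient decomposes exactly across chunks, so the parallel result matches the serial one; communication cost is one scalar per worker in the reduce step.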