The technological development in the field of information and communication has been accompanied by the emergence of security challenges related to the transmission of information, and encryption is a well-established solution. Encryption is one of the traditional methods of protecting plain text by converting it into an unintelligible form; it can be implemented using substitution techniques, shifting techniques, or mathematical operations. This paper proposes a method with two branches for encrypting text. The first branch is a new mathematical model for creating and exchanging keys; the proposed key-exchange method is a development of Diffie-Hellman, a new mathematical model for exchanging keys based on prime numbers with the possibility of using ordinary integers. The second branch of the proposal is a multi-key encryption algorithm. The algorithm provides the ability to use more than two keys; the keys can be any integers (at least the last key must be a prime number) and need not be of the same length. The encryption process converts the text characters into suggested integer numbers, and these numbers are converted into other numbers by applying a multilevel mathematical model repeatedly (a multilevel process whose depth depends on the number of keys used), while the decryption process is a one-level process using just one key as the main key, with the other keys acting as secondary keys. Messages are encoded before encryption (using ASCII or any suggested coding system). The algorithm can use an unlimited number of keys of very large size (more than 7500 bytes), at least one of which is a prime number. Exponentiation of the keys is also used to increase complexity. The experiments demonstrated the robustness and security of the key-exchange protocol and the encryption algorithm. Comparing the suggested method with other methods shows that it is more secure, more flexible, and easy to implement.
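The abstract does not detail the proposed key-exchange equations, so the following is a minimal sketch of the classic Diffie-Hellman exchange that the proposal extends; the prime, the generator, and the function name are illustrative choices, not the paper's.

```python
# Minimal sketch of the classic Diffie-Hellman exchange that the proposed
# protocol builds on; parameters below are illustrative, not the paper's.
import secrets

def diffie_hellman_demo(p: int, g: int) -> None:
    # Each party picks a private exponent and publishes g^x mod p.
    a = secrets.randbelow(p - 2) + 1          # first party's private key
    b = secrets.randbelow(p - 2) + 1          # second party's private key
    A = pow(g, a, p)                          # first public value
    B = pow(g, b, p)                          # second public value
    # Both sides derive the same shared secret without transmitting it.
    shared_a = pow(B, a, p)
    shared_b = pow(A, b, p)
    assert shared_a == shared_b

# Toy prime for illustration only; real use requires a much larger prime.
diffie_hellman_demo(p=0xFFFFFFFB, g=5)
```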
In this paper, new methods based on the differencing technique are presented: the difference-based modified jackknifed generalized ridge regression estimator (DMJGR) and the difference-based generalized jackknifed ridge regression estimator (DGJR), for estimating the parameters of the linear part of the partially linear model. The nonlinear part, represented by the nonparametric function, is estimated using the Nadaraya-Watson smoother. The partially linear model fitted with these proposed methods is compared with other difference-based estimators through the MSE criterion in a simulation study.
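As a point of reference for the nonparametric part, here is a minimal sketch of a Gaussian-kernel Nadaraya-Watson smoother; the bandwidth and the toy data are illustrative assumptions, not the simulation design of the paper.

```python
# Hedged sketch of a Nadaraya-Watson kernel smoother with a Gaussian kernel.
import numpy as np

def nadaraya_watson(x_grid, x, y, h=0.3):
    """Estimate E[y | x] on x_grid; h is an illustrative bandwidth."""
    u = (x_grid[:, None] - x[None, :]) / h    # scaled distances grid vs. sample
    w = np.exp(-0.5 * u**2)                   # Gaussian kernel weights
    return (w @ y) / w.sum(axis=1)            # weighted average per grid point

# Toy data: noisy sine curve.
rng = np.random.default_rng(0)
x = rng.uniform(0, 2 * np.pi, 200)
y = np.sin(x) + rng.normal(scale=0.2, size=x.size)
grid = np.linspace(0, 2 * np.pi, 50)
m_hat = nadaraya_watson(grid, x, y)
```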
This research develops a new method based on spectral indices and a random forest classifier to detect paddy rice areas and then assess their distribution relative to urban areas. The classification is conducted on Landsat OLI images and on combined Landsat OLI/Sentinel-1 SAR data. A new spectral index is then developed by analyzing the relative importance of the Landsat bands as computed by the random forest. The new index is built from the three most important bands, and two additional indices, the normalized difference vegetation index (NDVI) and the normalized difference built-up index (NDBI), are used to extract paddy rice fields from the data. Several experiments being
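The two auxiliary indices are standard band ratios, and band importance can be ranked with a random forest as sketched below; the band names, the synthetic training frame, and the classifier settings are placeholders, not the paper's actual Landsat OLI inputs.

```python
# Standard NDVI/NDBI band ratios and a random-forest importance ranking.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def ndvi(nir, red):
    return (nir - red) / (nir + red + 1e-10)

def ndbi(swir, nir):
    return (swir - nir) / (swir + nir + 1e-10)

# Hypothetical per-pixel samples: columns stand in for OLI bands 2-7.
X = np.random.rand(1000, 6)
y = np.random.randint(0, 2, 1000)             # 1 = paddy rice, 0 = other
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
# Relative importance of each band, used to pick the three most important.
print(dict(zip(["B2", "B3", "B4", "B5", "B6", "B7"], rf.feature_importances_)))
```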
Libraries, information centers, and everything related to organizing and preparing information need to be re-evaluated periodically in order to gauge their level of quality, which means improving the general state of these institutions to ensure sufficient satisfaction among the beneficiaries of the services provided. This is what this research addressed: one of the most important quality standards in libraries and information centers, LibQUAL+®, was applied in one of the most important and oldest central university libraries, namely the Central Library of the University of Baghdad at its two locations, Al-Jadriya and Al-Waziriya. The sample of beneficiaries to whom the questionnaire was distributed reached 75 beneficiaries distrib
In the present work, the z-scan technique was used to study the nonlinear optical properties, represented by the nonlinear refractive index and the nonlinear absorption coefficient, of a cadmium sulfide nanoparticle thin film. The sample was prepared by the chemical bath deposition method. Several tests were carried out, including X-ray, transmission, and thin-film thickness measurements. The z-scan experiment was performed at two wavelengths (1064 nm and 532 nm) and at different energies. The results showed the effect of self-focusing in the material at higher intensities, with n2 evaluated to be (0.11-0.16) cm2/GW. The effect of two-photon absorption was also studied, with β evaluated to be (24-106) cm/GW. In addition, the optical limiting behavior was studied.
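For context, the standard closed- and open-aperture z-scan relations (Sheik-Bahae formalism) that connect the measured transmittance to n2 and β are summarized below; the paper's exact working equations are not given in the abstract, so this is the conventional form only.

```latex
% Conventional z-scan relations: peak-valley transmittance, nonlinear
% refractive index, two-photon absorption coefficient, effective length.
\Delta T_{p\text{-}v} = 0.406\,(1-S)^{0.25}\,|\Delta\Phi_0|, \qquad
n_2 = \frac{\Delta\Phi_0}{k\,I_0\,L_{\mathrm{eff}}}, \qquad
\beta \simeq \frac{2\sqrt{2}\,\Delta T}{I_0\,L_{\mathrm{eff}}}, \qquad
L_{\mathrm{eff}} = \frac{1-e^{-\alpha L}}{\alpha}, \quad k = \frac{2\pi}{\lambda}.
```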
This study examines experimentally the performance of a horizontal triple concentric tube heat exchanger (TCTHE) made of copper, using water as the cooling fluid and oil-40 as the hot fluid. The hot fluid enters the inner annular tube of the TCTHE in one direction at temperatures of 50, 60, and 70 °C and a flow rate of 20 l/hr. The cooling fluid, on the other hand, enters the inner tube and the outer annular tube in the reverse direction (counter-current flow) at a temperature of 25 °C and flow rates of 10, 15, 20, 25, 30, and 35 l/hr. The TCTHE is composed of three copper tubes with outer diameters of 34.925 mm, 22.25 mm, and 9.525 mm, and thicknesses of 1.27 mm, 1.143 mm, and 0.762 mm, respectively. The TCTHE tube length was 670
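A minimal sketch of the usual counter-current performance calculations (heat duty and log-mean temperature difference) that such a rig supports is given below; the oil properties, outlet temperatures, and flow rate in the example are assumed values, not the measured data.

```python
# Sketch of standard counter-current heat-exchanger calculations.
import math

def heat_duty(m_dot_kg_s, cp_j_kgK, t_in, t_out):
    """Q = m_dot * cp * |dT|, in watts."""
    return m_dot_kg_s * cp_j_kgK * abs(t_out - t_in)

def lmtd_counter(t_hot_in, t_hot_out, t_cold_in, t_cold_out):
    """Log-mean temperature difference for counter-current flow."""
    dt1 = t_hot_in - t_cold_out
    dt2 = t_hot_out - t_cold_in
    return dt1 if dt1 == dt2 else (dt1 - dt2) / math.log(dt1 / dt2)

# Example with assumed oil density ~0.85 kg/l and cp ~1900 J/(kg K):
# 20 l/hr of oil cooled from 70 to 45 °C against water entering at 25 °C.
m_dot_oil = 20 / 3600 * 0.85
q = heat_duty(m_dot_oil, 1900, 70, 45)
print(q, lmtd_counter(70, 45, 25, 35))
```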
The research dealt with measuring the competitive performance of the National Insurance Company and some of its branches (Basra, Ninwa, Kirkuk, and Babil), depending on the Revenue Growth Index at the activity level and the Revealed Comparative Advantage Index (RCAI) at the branch level, in order to measure the competitiveness of the company and some of its branches. The research problem lies in the fact that some companies in the insurance service sector do not rely on scientific indicators to measure their competitive performance. The aim of the research is to measure the competitiveness of the National Insurance Company, as well as the competitiveness of its branches, according to a scientific method. One of the main conclusions of the re
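The abstract does not spell out how the Revealed Comparative Advantage Index is adapted to insurance branches; the general Balassa form, with X_ij read here as branch i's revenue from activity j, would be:

```latex
% General Balassa form of the Revealed Comparative Advantage index; the
% branch-level adaptation used in the paper is an assumption of this sketch.
RCA_{ij} = \frac{X_{ij} \big/ \sum_{j} X_{ij}}
                {\sum_{i} X_{ij} \big/ \sum_{i}\sum_{j} X_{ij}}
```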
The research discusses the need to find innovative structures and methodologies for developing Human Capital (HC) in Iraqi universities. One of the most important of these structures is Communities of Practice (CoPs), which contribute to developing HC through learning, teaching, and training by accelerating the conversion of knowledge and creativity into practice. This research used the comparative approach, employing the Data Envelopment Analysis (DEA) methodology (with Excel 2010 - Solver) as field evidence to prove the role of CoPs in developing HC. In light of this, the researcher relied on archived preliminary data about (23) colleges at Mosul University as a deliberate sample for t
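As an illustration of the DEA step, a minimal input-oriented CCR model can be solved as a linear program; the college input/output figures below are invented for the example and do not come from the Mosul University data.

```python
# Sketch of an input-oriented CCR DEA efficiency score via linear programming.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Efficiency of DMU o. X: (m inputs x n DMUs), Y: (s outputs x n DMUs)."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                   # minimize theta
    # Inputs:  sum_j lam_j * x_ij <= theta * x_io
    A_in = np.hstack([-X[:, [o]], X])
    b_in = np.zeros(m)
    # Outputs: sum_j lam_j * y_rj >= y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    b_out = -Y[:, o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out])
    return res.fun

# Toy example: 4 decision-making units, 2 inputs, 1 output.
X = np.array([[20., 30., 40., 20.], [150., 200., 100., 120.]])
Y = np.array([[100., 120., 110., 90.]])
print([round(ccr_efficiency(X, Y, j), 3) for j in range(4)])
```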
Proposing nonlinear models is one of the most important approaches in time series analysis; it has wide potential for predicting various phenomena, including physical, engineering, and economic ones, by studying the characteristics of random disturbances in order to arrive at accurate predictions.
In this work, the autoregressive model with an exogenous variable was built using a threshold as the first method, with two proposed approaches used to determine the best cut point [for forward predictability (forecasting) and predictability within the time series (prediction), through the threshold point indicator]. Box-Jenkins seasonal models are used as a second method based on the principle of the two proposed approaches in dete
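A hedged sketch of the basic idea behind a threshold ARX fit is shown below: observations are split by a candidate cut point on a lagged value, each regime is fitted by least squares, and the cut point with the smallest pooled error is kept; the paper's own cut-point criteria (forecasting versus within-series prediction) are not reproduced here.

```python
# Illustrative threshold ARX fit by grid search over candidate cut points.
import numpy as np

def tarx_sse(y, x, r):
    """Pooled SSE of a two-regime AR(1)-X model with threshold r on y_{t-1}."""
    z = y[:-1]                                # threshold variable
    Y, L, X = y[1:], y[:-1], x[1:]
    sse = 0.0
    for mask in (z <= r, z > r):
        if mask.sum() < 3:
            return np.inf                     # regime too small to fit
        A = np.column_stack([np.ones(mask.sum()), L[mask], X[mask]])
        beta, *_ = np.linalg.lstsq(A, Y[mask], rcond=None)
        sse += np.sum((Y[mask] - A @ beta) ** 2)
    return sse

rng = np.random.default_rng(1)
x = rng.normal(size=300)
y = np.zeros(300)
for t in range(1, 300):                        # simulated two-regime series
    phi = 0.8 if y[t - 1] <= 0 else -0.4
    y[t] = phi * y[t - 1] + 0.5 * x[t] + rng.normal(scale=0.3)
candidates = np.quantile(y, np.linspace(0.15, 0.85, 30))
best_r = min(candidates, key=lambda r: tarx_sse(y, x, r))
```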
Support vector machines (SVMs) are supervised learning models that analyze data for classification or regression. For classification, SVM is widely used, selecting an optimal hyperplane that separates two classes. SVM has very good accuracy and is extremely robust compared with some other classification methods such as logistic regression, random forest, k-nearest neighbors, and the naïve model. However, working with large datasets can cause many problems, such as long computation times and inefficient results. In this paper, the SVM has been modified by using a stochastic gradient descent process. The modified method, stochastic gradient descent SVM (SGD-SVM), was checked using two simulated datasets. Since the classification of different ca
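A minimal sketch of a linear SVM trained by stochastic gradient descent on the regularized hinge loss (a Pegasos-style update) is given below; the hyperparameters and the simulated two-class data are illustrative, not the paper's settings.

```python
# Linear SVM trained with stochastic gradient descent on the hinge loss.
import numpy as np

def sgd_svm(X, y, lam=0.01, epochs=20, seed=0):
    """y must be in {-1, +1}. Returns weight vector w and bias b."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b, t = np.zeros(d), 0.0, 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)              # decaying learning rate
            margin = y[i] * (X[i] @ w + b)
            w *= (1 - eta * lam)               # shrinkage from the L2 term
            if margin < 1:                     # hinge loss is active
                w += eta * y[i] * X[i]
                b += eta * y[i]
    return w, b

# Simulated two-class data, echoing the paper's check on simulated datasets.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(-1, 1, (200, 2)), rng.normal(1, 1, (200, 2))])
y = np.r_[-np.ones(200), np.ones(200)]
w, b = sgd_svm(X, y)
print("accuracy:", np.mean(np.sign(X @ w + b) == y))
```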
The question of estimation has attracted great interest in engineering and statistical applications and in various applied and human sciences; the methods it provides have helped to identify many random processes accurately.
In this paper, methods were used to estimate the reliability function, the hazard (risk) function, and the distribution parameters; the methods are the method of moments and the maximum likelihood method. An experimental study was conducted using simulation in order to compare the methods and show which of them is suitable in practical application, based on observations generated from the Rayleigh logarithmic (RL) distribution with sample sizes
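For illustration only, the closed-form maximum likelihood estimator together with the reliability and hazard functions of the plain Rayleigh distribution are sketched below; the compound Rayleigh logarithmic (RL) distribution studied in the paper is not specified in the abstract, so its pdf is not reproduced here.

```python
# Maximum likelihood, reliability, and hazard for the plain Rayleigh
# distribution (not the compound RL distribution of the paper).
import numpy as np

def rayleigh_mle(t):
    """Closed-form MLE of the Rayleigh scale: sigma_hat^2 = sum(t_i^2) / (2n)."""
    return np.sqrt(np.sum(t**2) / (2 * len(t)))

def reliability(t, sigma):
    return np.exp(-t**2 / (2 * sigma**2))      # R(t) = exp(-t^2 / 2 sigma^2)

def hazard(t, sigma):
    return t / sigma**2                        # h(t) = f(t) / R(t)

rng = np.random.default_rng(3)
sample = rng.rayleigh(scale=2.0, size=100)     # simulated lifetimes
s_hat = rayleigh_mle(sample)
print(s_hat, reliability(1.5, s_hat), hazard(1.5, s_hat))
```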