This paper proposes a new encryption method, the W-method, which combines two cipher algorithms, DES and AES, to generate hybrid keys. This combination strengthens the proposed method by generating highly randomized keys. Two points determine the reliability of any encryption technique. The first is key generation: our approach merges 64 bits of DES with 64 bits of AES to produce a 128-bit root key from which the remaining 15 keys are derived, each obtained by shifting the previous key one bit to the right. This complexity raises the level of the ciphering process. The second is the nature of the encryption process itself: it uses two keys and mixes one round of DES with one round of AES to reduce execution time. The W-method handles Arabic and English texts with the same efficiency. The results showed that the proposed method is faster and more secure than the standard DES and AES algorithms.
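The key-schedule idea described above can be sketched in a few lines. This is an illustrative sketch only, not the authors' actual W-method: the exact bit sources and rotation details are assumptions, and the function names are hypothetical.

```python
# Hedged sketch of the described key schedule: merge 64 DES-derived bits with
# 64 AES-derived bits into a 128-bit root key, then derive each remaining key
# by rotating the previous one a single bit to the right.

def rotate_right_128(key: int, bits: int = 1) -> int:
    """Rotate a 128-bit integer right by `bits` positions."""
    mask = (1 << 128) - 1
    return ((key >> bits) | (key << (128 - bits))) & mask

def w_method_keys(des_half: int, aes_half: int, extra_keys: int = 15):
    """Build the 128-bit root key and the 15 derived round keys."""
    root = ((des_half & ((1 << 64) - 1)) << 64) | (aes_half & ((1 << 64) - 1))
    keys = [root]
    for _ in range(extra_keys):
        keys.append(rotate_right_128(keys[-1]))
    return keys  # root key + 15 derived keys = 16 keys in total

keys = w_method_keys(0x0123456789ABCDEF, 0xFEDCBA9876543210)
```

A one-bit rotation (rather than a shift that discards bits) keeps all 128 bits of entropy in every derived key, which seems consistent with the paper's emphasis on highly randomized keys.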
Great scientific progress has led to the widespread availability of information. As information accumulates in large databases, it becomes important to review and organize this vast amount of data in order to extract hidden information, or to classify data according to the relations among items, so that it can be exploited for technical purposes.
Data mining (DM) is well suited to this area. This research applies the K-Means algorithm for clustering data and observes the effect on the results of changing the sample size (n) and the number of clusters (K).
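The experiment described above can be sketched with a minimal K-Means implementation. This is a toy one-dimensional version for illustration only; the study's actual data and implementation are not given here, and a real analysis would use a library implementation on multi-dimensional data.

```python
import random

def kmeans_1d(data, k, iters=20, seed=0):
    """Minimal 1-D K-Means: pick k initial centers from the sample, then
    alternate assignment and center-update steps for a fixed number of
    iterations. Changing len(data) (n) or k changes the clustering."""
    rng = random.Random(seed)
    centers = rng.sample(data, k)
    for _ in range(iters):
        # Assignment step: each point joins its nearest center's cluster.
        clusters = [[] for _ in range(k)]
        for x in data:
            i = min(range(k), key=lambda c: abs(x - centers[c]))
            clusters[i].append(x)
        # Update step: each center moves to the mean of its cluster
        # (an empty cluster keeps its previous center).
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# Hypothetical sample (n = 8) clustered with K = 3.
data = [1.0, 1.2, 0.9, 5.0, 5.2, 4.8, 9.1, 9.0]
centers, clusters = kmeans_1d(data, k=3)
```

Rerunning with a different `k` or a different sample size shows directly how those two variables shape the resulting clusters, which is the effect the study examines.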
This research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The Downhill Simplex algorithm was found to be the better method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
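The estimation approach can be illustrated with a simpler stand-in. The paper's four-parameter compound exponential Weibull-Poisson likelihood is not reproduced here, so this hedged sketch fits a plain two-parameter Weibull by maximum likelihood, using SciPy's Nelder-Mead optimizer (the Downhill Simplex method) to minimize the negative log-likelihood; the data are simulated, not the study's.

```python
import math
import random
from scipy.optimize import minimize

# Simulate a hypothetical Weibull sample (shape 1.5, scale 2.0).
rng = random.Random(42)
true_shape, true_scale = 1.5, 2.0
data = [rng.weibullvariate(true_scale, true_shape) for _ in range(500)]

def neg_log_lik(params):
    """Negative Weibull log-likelihood; Nelder-Mead needs no gradients,
    which is why the simplex method suits awkward compound likelihoods."""
    k, lam = params
    if k <= 0 or lam <= 0:
        return float("inf")  # keep the simplex inside the valid region
    return -sum(math.log(k / lam) + (k - 1) * math.log(x / lam)
                - (x / lam) ** k for x in data)

res = minimize(neg_log_lik, x0=[1.0, 1.0], method="Nelder-Mead")
k_hat, lam_hat = res.x
```

Because the simplex method uses only function values, it tolerates the kind of rough, derivative-unfriendly likelihood surfaces that contamination produces, which may explain its advantage in the study.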
This study aimed to investigate the role of Big Data in forecasting corporate bankruptcy through a field analysis in the Saudi business environment. The study found that Big Data is a recently adopted variable in the business context with multiple accounting effects and benefits, among them forecasting and disclosing corporate financial failures and bankruptcies. This reporting and disclosure rests on three main elements: the firms' internal control system, external auditing, and financial analysts' forecasts. The study recommends that, since the greatest risk of Big Data is the slow adaptation of accountants and auditors to these technologies, wh
The Atmospheric Infrared Sounder (AIRS) on the EOS/Aqua satellite provides diverse measurements of methane (CH4) distribution at different pressure levels in the Earth's atmosphere. The focus of this research is to analyze the vertical variations of CH4 volume mixing ratio (VMR) time-series data at four standard pressure levels (SPL: 925, 850, 600, and 300 hPa) in the troposphere above six cities in Iraq from January 2003 to September 2016. The analysis of the monthly average CH4 VMR time series shows a significant increase between 2003 and 2016, especially from 2009 to 2016; the minimum values of CH4 occurred in 2003 and the maximum values in 2016. The vertical distribution of CH4
This paper considers a new double integral transform called the Double Sumudu-Elzaki Transform (DSET). The DSET is combined with a semi-analytical method, namely the variational iteration method, to form the DSETVIM, which yields numerical solutions of nonlinear PDEs with fractional-order derivatives. The proposed dual method reduces the number of calculations required, so combining the two methods speeds up computation of the solution. The suggested technique is tested on four problems. The results demonstrate that solving these types of equations with the DSETVIM is more advantageous and efficient.
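For orientation, the standard single-variable Sumudu and Elzaki transforms are commonly written as follows; the paper's exact definition of the double transform is not reproduced here, so the combined form below is one common convention and should be treated as an assumption.

```latex
% Single-variable Sumudu and Elzaki transforms (standard conventions):
S[f](u) = \int_0^\infty f(ut)\, e^{-t}\, dt, \qquad
E[f](v) = v \int_0^\infty f(t)\, e^{-t/v}\, dt .

% One natural double transform of f(x,t): Sumudu in x, Elzaki in t
% (assumed form, the paper may use a different convention):
\mathcal{S}_x \mathcal{E}_t [f](u,v)
  = v \int_0^\infty \!\! \int_0^\infty f(ux, t)\, e^{-x}\, e^{-t/v}\, dx\, dt .
```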
In data transmission, a change in a single bit of the received data may lead to misunderstanding or even disaster. Each bit of the transmitted information has high priority, especially bits carrying information such as the address of the receiver. Detecting every single-bit change is therefore a key issue in the data transmission field.
The ordinary single-parity detection method detects an odd number of errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, show better results but still fail to cope with an increasing number of errors.
Two novel methods are suggested to detect bit-change errors when transmitting data over a noisy medium. Those methods are: 2D-Checksum me
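The failure mode described above is easy to demonstrate. The sketch below shows the textbook single-parity and two-dimensional-parity schemes, not the paper's novel 2D-Checksum method: flipping one bit changes the parity (detected), flipping two bits restores it (missed), while row-and-column parity still catches the two-bit error through the column checks.

```python
def parity_bit(bits):
    """Even-parity bit for a sequence of 0/1 values."""
    return sum(bits) % 2

# Single parity over one 8-bit word.
word = [1, 0, 1, 1, 0, 1, 0, 0]
p = parity_bit(word)

one_err = word[:]
one_err[3] ^= 1                     # odd number of flips: parity changes
two_err = word[:]
two_err[3] ^= 1
two_err[5] ^= 1                     # even number of flips: parity unchanged

assert parity_bit(one_err) != p     # detected
assert parity_bit(two_err) == p     # missed by single parity

def parity_2d(block):
    """Row and column parities of a 2-D block (list of equal-length rows)."""
    rows = [parity_bit(r) for r in block]
    cols = [parity_bit(c) for c in zip(*block)]
    return rows, cols

# Two errors in the same row: row parity misses them,
# but the two affected column parities both change.
block = [[1, 0, 1, 1],
         [0, 1, 1, 0]]
rows, cols = parity_2d(block)
corrupt = [r[:] for r in block]
corrupt[0][1] ^= 1
corrupt[0][3] ^= 1
r2, c2 = parity_2d(corrupt)
assert r2 == rows and c2 != cols    # caught by the column parities
```

Two-dimensional parity detects all one-, two-, and three-bit errors but can miss certain four-bit rectangular patterns, which is the gap stronger schemes aim to close.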
Information security is a crucial factor when communicating sensitive information between two parties, and steganography is one of the most widely used techniques for this purpose. This paper aims to enhance the capacity and robustness of information hiding by compressing image data to a small size while maintaining high quality, so that the secret information remains invisible and only the sender and recipient can recognize the transmission. Three techniques are employed to conceal color and gray images: the Wavelet Color Process Technique (WCPT), the Wavelet Gray Process Technique (WGPT), and the Hybrid Gray Process Technique (HGPT). A comparison between the first and second techniques according to quality metrics, Root-Mean-Square Error (RMSE), Compression-
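The RMSE quality metric mentioned above is straightforward to compute. This is a generic sketch on a tiny hypothetical pixel grid, not the paper's evaluation code.

```python
import math

def rmse(original, reconstructed):
    """Root-Mean-Square Error between two equal-sized pixel grids
    (lists of rows). Lower values mean the reconstructed (compressed)
    image is closer to the original."""
    diffs = [(a - b) ** 2
             for row_o, row_r in zip(original, reconstructed)
             for a, b in zip(row_o, row_r)]
    return math.sqrt(sum(diffs) / len(diffs))

# Hypothetical 2x2 grayscale patch before and after lossy compression.
orig  = [[120, 130], [140, 150]]
recon = [[121, 129], [140, 148]]
print(round(rmse(orig, recon), 3))  # → 1.225
```

In the paper's setting a low RMSE between the cover image before and after embedding is what keeps the hidden payload visually undetectable.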
This study aims to measure and analyze the direct and indirect effects of the financial variables (public spending, public revenues, internal debt, and external debt) on the non-oil productive sectors, with and without bank credit as an intermediate variable, using quarterly data for the period 2004Q1–2021Q4 converted using Eviews 12. To meet the study's objective, the path analysis method was applied using IBM SPSS-AMOS. The study concluded that the direct and indirect effects of the financial variables play only a weak role (0.18) in directing bank credit toward the productive sectors in Iraq, as a result of market risks and unstable expectations in the economy, in addition to the weak credit ratings of borr
The purpose of this research is to identify the effect of project-based learning on the development of intensive reading skills among middle-school students. A one-group experimental design was chosen to suit the nature of the research and its objectives. The research group consisted of 35 students. For the purposes of the research, the following materials and tools were prepared: a list of intensive reading skills, an intensive reading skills test, a teacher's guide, and a student book. The results showed statistically significant differences at the (0.05) level in favor of post-test performance in intensive reading skills. The statistical analysis also showed that the project-based learning approach has a high