The Internet provides vital communication between millions of individuals and is increasingly used as a commerce tool; security is therefore essential for protecting communications and sensitive information, and cryptographic algorithms are central to that goal. Brute-force attacks are the principal threat to the Data Encryption Standard (DES), which motivates the need for an improved DES structure. This paper proposes a new, improved structure for DES to make it more secure and resistant to attack. The improvement builds on standard DES with a new two-key generation scheme: the key generation system produces two keys, one simple and one encrypted using an improved Caesar algorithm. The encryption algorithm uses the simple key 1 in the first 8 rounds and the encrypted key 2 from round 9 to round 16. The results show that the improved DES structure increases encryption security, performance, and key-search complexity compared with standard DES, so that differential cryptanalysis cannot be performed on the ciphertext.
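A minimal sketch of the two-key idea described above, assuming a standard DES key schedule and Feistel round are available elsewhere: it only illustrates deriving a second key from the first with a Caesar-style shift and switching keys at round 9. The function names, the shift value, and the example key are illustrative assumptions, not the paper's exact construction.

```python
# Hypothetical sketch: derive key 2 from key 1 with a byte-wise Caesar shift,
# use key 1 for rounds 1-8 and key 2 for rounds 9-16.

def caesar_shift_key(key_bytes: bytes, shift: int = 3) -> bytes:
    """Apply a byte-wise Caesar shift to produce the second ('encrypted') key."""
    return bytes((b + shift) % 256 for b in key_bytes)

def round_key_for(round_no: int, key1: bytes, key2: bytes) -> bytes:
    """Rounds 1-8 use the simple key 1, rounds 9-16 use the Caesar-encrypted key 2."""
    return key1 if round_no <= 8 else key2

key1 = bytes.fromhex("133457799BBCDFF1")   # example 64-bit DES key
key2 = caesar_shift_key(key1)

for r in range(1, 17):
    k = round_key_for(r, key1, key2)
    # ... feed k into the DES key schedule / Feistel round here ...
```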
The current research aims to determine the impact of the components of the financing structure, especially debt financing, as well as earnings per share, on the value of the shares of companies listed on the Iraq Stock Exchange (the research sample), and to identify the strength of the combined effect of the debt-financing ratio and earnings per share in maximizing the firm's market value and real value, as well as the variation between these relationships according to the real-value and market-value models of the research sample companies. The research community is represented by the Iraq Stock Exchange, while a conditional deliberate sample
The novel severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) caused a pandemic of coronavirus disease 2019 (COVID-19), which represents a global public health crisis. Based on recently published studies, this review discusses current evidence related to the transmission, clinical characteristics, diagnosis, management, and prevention of COVID-19. It is hoped that this review will help the public understand and deal with this new virus and serve as a reference for future research.
One of the significant but challenging recent research problems in computational biology and bioinformatics is to unveil protein complexes from protein-protein interaction networks (PPINs). However, the development of a reliable algorithm that detects more complexes with high quality is still an open problem in many studies. The main contribution of this paper is to improve the effectiveness of the well-known modularity density model when used as a single-objective optimization function within the framework of a canonical evolutionary algorithm (EA). To this end, the design of the EA is modified with a gene ontology-based mutation operator, where the aim is to create a positive collaboration between the modularity density model and the proposed
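A minimal sketch of a modularity density fitness function of the kind referred to above, assuming the common formulation that sums (2 x internal edges - external edges) / |community| over all communities and an undirected PPIN stored in a networkx graph. The paper's exact formulation and its integration with the gene ontology-based EA are not shown; this only illustrates how such a function can score a candidate partition.

```python
# Score a candidate community partition of a graph with a modularity-density-style measure.
import networkx as nx

def modularity_density(graph: nx.Graph, communities: list[set]) -> float:
    score = 0.0
    for nodes in communities:
        if not nodes:
            continue
        internal = graph.subgraph(nodes).number_of_edges()
        # edges with exactly one endpoint inside the community
        external = sum(1 for u, v in graph.edges(nodes) if (u in nodes) != (v in nodes))
        score += (2 * internal - external) / len(nodes)
    return score

g = nx.karate_club_graph()                        # stand-in for a real PPIN
partition = [set(range(17)), set(range(17, 34))]  # a candidate solution from the EA
print(modularity_density(g, partition))
```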
The gas-lift technique plays an important role in sustaining oil production, especially in a mature field where the reservoir's natural energy becomes insufficient. However, optimally allocating the gas injection rate across a large field's gas-lift network system to maximize the oil production rate is a challenging task. Conventional gas-lift optimization approaches may become inefficient and incapable of modelling gas-lift optimization in a large network system with multiple objectives, multiple constraints, and a limited gas injection rate. The key objective of this study is to assess the feasibility of utilizing the Genetic Algorithm (GA) technique to optimize t
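A minimal GA sketch for the allocation problem described above: distributing a limited total lift-gas rate across wells to maximize total oil rate. The quadratic gas-lift performance curves, the penalty-based constraint handling, and the GA settings are illustrative assumptions, not the study's actual field model or implementation.

```python
# Toy GA: allocate lift gas to wells, each modelled by oil = a*q - b*q**2.
import random

WELLS = [(0.9, 0.004), (0.7, 0.002), (1.1, 0.006)]  # hypothetical (a, b) per well
TOTAL_GAS = 300.0                                    # available lift gas (scaled units)

def oil_rate(alloc):
    if sum(alloc) > TOTAL_GAS:                       # penalize infeasible allocations
        return -1e9
    return sum(a * q - b * q * q for (a, b), q in zip(WELLS, alloc))

def random_individual():
    return [random.uniform(0, TOTAL_GAS / len(WELLS)) for _ in WELLS]

def evolve(pop_size=40, generations=200, mut_rate=0.2):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=oil_rate, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = random.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(p1, p2)]           # arithmetic crossover
            if random.random() < mut_rate:
                i = random.randrange(len(child))
                child[i] = max(0.0, child[i] + random.gauss(0, 5))  # Gaussian mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=oil_rate)

best = evolve()
print("best allocation:", [round(q, 1) for q in best], "oil:", round(oil_rate(best), 1))
```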
Incremental sheet metal forming is a modern sheet-forming technique in which a uniform sheet is locally deformed by the progressive action of a forming tool whose movement is governed by a CNC milling machine; in this way the tool locally deforms the sheet by pure stretching. In the single-point incremental forming (SPIF) process, research has concentrated on developing predictive models for estimating product quality. In this work, surface quality in SPIF has been modelled using a simulated annealing algorithm (SAA). In the development of this predictive model, spindle speed, feed rate, and step depth have been considered as model parameters, while maximum peak height (Rz) and arithmetic mean surface roughness (Ra) are used as response parameters to assess th
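A minimal simulated-annealing sketch of how such a predictive roughness model could be fitted, assuming a power-law form Ra = c0 * speed**c1 * feed**c2 * step**c3. The model form, the synthetic data points, and the annealing schedule are illustrative assumptions; the paper's actual model and experimental data are not reproduced here.

```python
# Fit power-law roughness coefficients by simulated annealing on toy data.
import math, random

# hypothetical (spindle speed rpm, feed mm/min, step depth mm, measured Ra um)
DATA = [(600, 500, 0.25, 1.8), (900, 750, 0.50, 2.6), (1200, 1000, 0.75, 3.1)]

def sse(c):
    c0, c1, c2, c3 = c
    return sum((c0 * s**c1 * f**c2 * d**c3 - ra) ** 2 for s, f, d, ra in DATA)

def anneal(steps=20000, t0=1.0, cooling=0.9995):
    current = [1.0, 0.1, 0.1, 0.1]
    best, temp = list(current), t0
    for _ in range(steps):
        cand = [x + random.gauss(0, 0.05) for x in current]
        delta = sse(cand) - sse(current)
        if delta < 0 or random.random() < math.exp(-delta / temp):  # accept better or, sometimes, worse
            current = cand
            if sse(current) < sse(best):
                best = list(current)
        temp *= cooling
    return best

coeffs = anneal()
print("fitted coefficients:", [round(x, 3) for x in coeffs], "SSE:", round(sse(coeffs), 4))
```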
In this study, a genetic algorithm (GA) is used to detect damage in a curved beam model. The stiffness and mass matrices of the curved beam elements are formulated using Hamilton's principle, and each node of the curved beam element possesses seven degrees of freedom, including the warping degree of freedom. The curved beam element is derived based on Kang and Yoo's thin-walled curved beam theory. Damage identification is formulated as an optimization problem, and binary and continuous genetic algorithms (BGA, CGA) are used to detect and locate the damage using two objective functions (change in natural frequencies and the Modal Assurance Criterion, MAC). The results show that the objective function based on change in natural frequency i
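A minimal sketch of a frequency-change objective of the kind a GA could minimize for damage detection, assuming a finite-element routine (not shown) returns the curved beam's natural frequencies for a candidate damage vector. The error norm, the fe_frequencies interface, and the numbers are illustrative assumptions, not the study's formulation.

```python
# Objective: squared relative mismatch between predicted and measured frequencies.
import numpy as np

measured = np.array([12.4, 33.1, 64.8])          # hypothetical measured frequencies (Hz)

def fe_frequencies(damage: np.ndarray) -> np.ndarray:
    """Placeholder for the curved-beam FE model with element-wise stiffness reduction."""
    baseline = np.array([13.0, 34.0, 66.0])
    return baseline * (1.0 - 0.05 * damage.mean())  # crude stand-in, not the real model

def objective(damage: np.ndarray) -> float:
    """Sum of squared relative changes between predicted and measured frequencies."""
    predicted = fe_frequencies(damage)
    return float(np.sum(((predicted - measured) / measured) ** 2))

candidate = np.array([0.0, 0.3, 0.0, 0.0])       # damage severity per element (CGA gene)
print(objective(candidate))
```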
Multipoint forming is an engineering concept in which the working surfaces of the punch and die are produced as the hemispherical ends of individual active elements (called pins), and each pin can be displaced independently and vertically using a geometrically reconfigurable die. Several different products can be made without changing tools, which saves precious production time; the need to manufacture very expensive rigid dies is also reduced, saving considerable expense. The most important aspect of this type of equipment, however, is the flexibility of the tooling. This paper presents an experimental investigation of the effect of three main parameters, namely the blank holder, rubber thickness, and forming speed, th
The estimation of the reliability function depends on the accuracy of the data used to estimate the parameters of the probability distribution. Because some data suffer from skewness, estimating the parameters and calculating the reliability function in the presence of such skewness requires a distribution that is flexible enough to deal with that data. This is the case for the data of Diyala Company for Electrical Industries, where a positive skewness was observed in the data collected from the Power and Machinery Department, which required a distribution that handles such data and a search for methods that accommodate this problem and lead to accurate estimates of the reliability function,
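A minimal sketch of the general workflow implied above: fit a flexible right-skewed distribution to lifetime data and evaluate the reliability function R(t) = 1 - F(t). The Weibull choice, the synthetic data, and the use of maximum likelihood are illustrative assumptions, not the paper's specific distribution or estimation methods.

```python
# Fit a right-skewed lifetime distribution and compute R(t) = 1 - F(t).
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)
lifetimes = rng.weibull(1.5, size=200) * 1000.0      # hypothetical operating hours

# maximum-likelihood fit of a two-parameter Weibull (location fixed at 0)
shape, loc, scale = stats.weibull_min.fit(lifetimes, floc=0)

def reliability(t: float) -> float:
    """R(t) = 1 - F(t): probability the component survives beyond time t."""
    return float(stats.weibull_min.sf(t, shape, loc=loc, scale=scale))

for t in (250.0, 500.0, 1000.0):
    print(f"R({t:.0f} h) = {reliability(t):.3f}")
```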