In this study, a genetic algorithm (GA) is used to detect damage in a curved beam model. The stiffness and mass matrices of the curved beam elements are formulated using Hamilton's principle, and each node of the curved beam element possesses seven degrees of freedom, including the warping degree of freedom. The curved beam element is derived based on Kang and Yoo's thin-walled curved beam theory. Damage identification is formulated as an optimization problem, and binary and continuous genetic algorithms (BGA, CGA) are used to detect and locate damage using two objective functions: the change in natural frequencies and the Modal Assurance Criterion (MAC). The results show that the objective function based on the change in natural frequencies performs best: no error was recorded in predicting the damage location, and only a small error was recorded in estimating the damage severity. The results also show that the genetic algorithm is efficient in indicating and quantifying single and multiple damage with high precision, and that the prediction error for the CGA is smaller than the corresponding error for the BGA.
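As a rough illustration of the identification loop described above, here is a minimal continuous-GA sketch driven by the change-in-natural-frequency objective. It is only a sketch: a damaged spring-mass chain stands in for the paper's seven-DOF thin-walled curved beam element, and the population size, mutation rate, and damage pattern are illustrative assumptions.

```python
# A continuous GA minimizing the change-in-natural-frequency objective.
# ASSUMPTION: a fixed-free spring-mass chain replaces the paper's curved
# beam finite element model; element "damage" is a stiffness reduction.
import numpy as np

rng = np.random.default_rng(0)
N_ELEM, K0, M = 8, 1e4, 1.0

def frequencies(damage, n_modes=4):
    """First natural frequencies (Hz) of the chain whose element
    stiffnesses are reduced by the damage factors (0 = intact)."""
    k = K0 * (1.0 - damage)
    K = np.zeros((N_ELEM, N_ELEM))
    for i in range(N_ELEM):
        K[i, i] += k[i]
        if i + 1 < N_ELEM:
            K[i, i] += k[i + 1]
            K[i, i + 1] = K[i + 1, i] = -k[i + 1]
    return np.sqrt(np.linalg.eigvalsh(K)[:n_modes] / M) / (2 * np.pi)

true_damage = np.zeros(N_ELEM)
true_damage[2] = 0.30                       # 30% stiffness loss, element 3
f_measured = frequencies(true_damage)       # stands in for measured data

def cost(d):                                # frequency-change objective
    return np.sum((frequencies(d) - f_measured) ** 2)

pop = rng.uniform(0.0, 0.5, size=(60, N_ELEM))
for _ in range(200):
    c = np.array([cost(ind) for ind in pop])
    new = [pop[c.argmin()].copy()]          # elitism
    while len(new) < len(pop):
        parents = []
        for _ in range(2):                  # binary tournament selection
            i, j = rng.integers(len(pop), size=2)
            parents.append(pop[i] if c[i] < c[j] else pop[j])
        w = rng.random(N_ELEM)              # blend crossover
        child = w * parents[0] + (1 - w) * parents[1]
        child += rng.normal(0, 0.02, N_ELEM) * (rng.random(N_ELEM) < 0.1)
        new.append(np.clip(child, 0.0, 1.0))
    pop = np.array(new)

best = pop[np.argmin([cost(ind) for ind in pop])]
print(np.round(best, 2))                    # damage estimate per element
```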
The aim of this research is to compare traditional and modern methods for obtaining the optimal solution, using dynamic programming and intelligent algorithms to solve project management problems. It shows the possible ways in which these problems can be addressed, drawing on a schedule of interrelated and sequential activities. It clarifies the relationships between the activities in order to determine the beginning and end of each activity, determine the total project duration and cost, estimate the time consumed by each activity, and identify the objectives the project seeks to achieve through planning, implementation, and monitoring, so as to stay within the assessed budget.
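Since the abstract centers on scheduling interrelated activities and determining each activity's start, end, and the total project duration, a minimal sketch of the underlying critical-path computation may help. The activity names, durations, and precedences below are hypothetical, not taken from the study.

```python
# Forward/backward pass over an activity network (CPM-style).
# ASSUMPTION: activity names, durations, and precedences are hypothetical.
from collections import defaultdict

activities = {                  # name: (duration, predecessors),
    "A": (3, []),               # listed in topological order
    "B": (5, ["A"]),
    "C": (2, ["A"]),
    "D": (4, ["B", "C"]),
}

# Forward pass: earliest start (ES) and earliest finish (EF).
ES, EF = {}, {}
for name, (dur, preds) in activities.items():
    ES[name] = max((EF[p] for p in preds), default=0)
    EF[name] = ES[name] + dur
project_duration = max(EF.values())

# Backward pass: latest finish (LF) and latest start (LS).
succs = defaultdict(list)
for name, (_, preds) in activities.items():
    for p in preds:
        succs[p].append(name)
LS, LF = {}, {}
for name in reversed(list(activities)):
    dur, _ = activities[name]
    LF[name] = min((LS[s] for s in succs[name]), default=project_duration)
    LS[name] = LF[name] - dur

critical = [n for n in activities if ES[n] == LS[n]]   # zero-slack chain
print("total duration:", project_duration, "| critical path:", critical)
```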
In this study, a fast block matching search algorithm based on block descriptors and multilevel block filtering is introduced. The descriptors used are the mean and a set of centralized low-order moments. Hierarchical filtering and the MAE similarity measure are adopted to nominate the best similar blocks lying within the pool of neighboring blocks. As the next step after block nomination, the similarity of the mean and moments is used to classify the nominated blocks into one of three sub-pools, each representing a certain nomination priority level (i.e., most, less, and least). The main reason for introducing the nomination and classification steps is the significant reduction in the number of matching instances for the pixels belonging to the c…
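A minimal sketch of the descriptor-guided nomination described above, assuming the mean and the second central moment as the descriptors; the paper's full moment set and actual thresholds are not reproduced here, so t_mean and t_mom are illustrative.

```python
# Descriptor-guided block nomination with a final MAE match.
# ASSUMPTIONS: the mean and second central moment stand in for the
# paper's full moment set; thresholds t_mean/t_mom are illustrative.
import numpy as np

def descriptors(block):
    mu = block.mean()
    m2 = ((block - mu) ** 2).mean()         # 2nd centralized moment
    return mu, m2

def mae(a, b):
    return np.abs(a.astype(float) - b.astype(float)).mean()

def best_match(target, candidates, t_mean=8.0, t_mom=64.0):
    mu_t, m2_t = descriptors(target)
    pools = {"most": [], "less": [], "least": []}
    for cand in candidates:                 # multilevel filtering
        mu_c, m2_c = descriptors(cand)
        if abs(mu_c - mu_t) <= t_mean and abs(m2_c - m2_t) <= t_mom:
            pools["most"].append(cand)      # passes both descriptor tests
        elif abs(mu_c - mu_t) <= t_mean:
            pools["less"].append(cand)      # passes the mean test only
        else:
            pools["least"].append(cand)
    # Pixel-level MAE is computed only in the best non-empty pool, which
    # is where the reduction in matching instances comes from.
    for level in ("most", "less", "least"):
        if pools[level]:
            return min(pools[level], key=lambda c: mae(target, c))
```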
To maintain the security and integrity of data, with the growth of the Internet and the increasing prevalence of transmission channels, it is necessary to strengthen security and develop new algorithms. One classical substitution scheme is the Playfair cipher. The traditional Playfair scheme uses a small 5×5 matrix containing only uppercase letters, making it vulnerable to hackers and cryptanalysis. In this study, a new encryption and decryption approach is proposed to enhance the resistance of the Playfair cipher. For this purpose, a symmetric cryptosystem based on shared secrets is developed. The proposed Playfair method uses a 5×5 keyword matrix for English and a 6×6 keyword matrix for Arabic to encrypt the alphabets of…
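For reference, a minimal sketch of the classical 5×5 English Playfair encryption that the proposed method builds on; the paper's 6×6 Arabic matrix and shared-secret key handling would replace the matrix construction and are not reproduced here.

```python
# Classical 5x5 English Playfair encryption (I/J merged).
# The paper's 6x6 Arabic matrix and shared-secret key handling would
# replace build_matrix(); they are not reproduced here.
def build_matrix(key):
    key = "".join(dict.fromkeys(key.upper().replace("J", "I")))
    used = [c for c in key if c.isalpha()]
    rest = [c for c in "ABCDEFGHIKLMNOPQRSTUVWXYZ" if c not in used]
    flat = used + rest
    return [flat[i:i + 5] for i in range(0, 25, 5)]

def find(matrix, ch):
    for r, row in enumerate(matrix):
        if ch in row:
            return r, row.index(ch)

def encrypt(plain, key):
    m = build_matrix(key)
    text = [c for c in plain.upper().replace("J", "I") if c.isalpha()]
    pairs, i = [], 0
    while i < len(text):                    # split into digraphs,
        a = text[i]                         # padding doubles and odd
        if i + 1 < len(text) and text[i + 1] != a:     # length with X
            b, i = text[i + 1], i + 2
        else:
            b, i = "X", i + 1
        pairs.append((a, b))
    out = []
    for a, b in pairs:
        ra, ca = find(m, a)
        rb, cb = find(m, b)
        if ra == rb:                        # same row: take right neighbor
            out += [m[ra][(ca + 1) % 5], m[rb][(cb + 1) % 5]]
        elif ca == cb:                      # same column: take lower one
            out += [m[(ra + 1) % 5][ca], m[(rb + 1) % 5][cb]]
        else:                               # rectangle: swap columns
            out += [m[ra][cb], m[rb][ca]]
    return "".join(out)

print(encrypt("HIDE THE GOLD", "PLAYFAIR"))
```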
In fish, a complex set of mechanisms deals with environmental stresses, including hypoxia. To probe the hypothesis that hypoxia-induced stress is manifested through a variety of pathways, a model species, the mirror carp (Cyprinus carpio), was chronically exposed to hypoxic conditions (dissolved oxygen level: 1.80 ± 0.6 mg/l) for 21 days and subsequently allowed to recover under normoxic conditions (dissolved oxygen level: 8.2 ± 0.5 mg/l) for 7 days. At the end of these exposure periods, an integrated approach was applied to evaluate several endpoints at different levels of biological organisation. These included determination of (i) oxidative damage to DNA in erythrocytes (using the modified comet assay) and (ii) lipid peroxidation in liver sample…
This study aims to conduct an exhaustive comparison between the performance of human translators and artificial-intelligence-powered machine translation systems, specifically examining the top three systems: Spider-AI, Metacate, and DeepL. A variety of texts from distinct categories were evaluated to gain a deeper understanding of the qualitative differences, as well as the strengths and weaknesses, between human and machine translations. The results demonstrate that human translation significantly outperforms machine translation, with larger gaps in literary texts and in texts characterized by high linguistic complexity. However, the performance of machine translation systems, particularly DeepL, has improved, and in some contexts…
The hydroconversion of Iraqi light straight-run naphtha was studied over a zeolite catalyst. A 0.3 wt.% Pt/HMOR catalyst was prepared locally and used in the present work. The hydroconversion was performed in a continuous fixed-bed laboratory reaction unit. Experiments were carried out over a temperature range of 200 to 350°C, a pressure range of 3 to 15 bar, an LHSV range of 0.5 to 2.5 h⁻¹, and a hydrogen-to-naphtha ratio of 300.
The results show that the hydroconversion of Iraqi light straight-run naphtha increases with increasing reaction temperature and decreases with increasing LHSV.
High-octane-number isomers were formed at a low temperature of 240°C. The selectivity of hydroisomerization was improved by increasing the reaction pressure…
Regression analysis is a cornerstone of statistics and relies mostly on the ordinary least squares (OLS) method. As is well known, however, this method requires several conditions to hold in order to operate accurately, and when they are violated the results can be unreliable; the absence of certain conditions can even make the analysis impossible to complete. Among those conditions is the absence of multicollinearity, and we detect that problem between the independent variables using the Farrar–Glauber test. In addition to the linearity requirement on the data, the failure of this last condition led us to resort to the…
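A minimal sketch of the Farrar–Glauber chi-square test for multicollinearity mentioned above, run on synthetic data for illustration; the regressor x2 is deliberately constructed to be nearly collinear with x1 so the test fires.

```python
# Farrar-Glauber chi-square test for multicollinearity among regressors.
# ASSUMPTION: the data below are synthetic, built so that the test fires.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, p = 100, 3
x1 = rng.normal(size=n)
X = np.column_stack([
    x1,
    x1 + 0.05 * rng.normal(size=n),        # nearly collinear with x1
    rng.normal(size=n),
])

R = np.corrcoef(X, rowvar=False)           # correlation matrix of the X's
chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
df = p * (p - 1) / 2                       # p(p-1)/2 degrees of freedom
p_value = stats.chi2.sf(chi2, df)
print(f"chi2 = {chi2:.1f}, df = {df:.0f}, p = {p_value:.3g}")
# A small p-value rejects orthogonality of the regressors, i.e. it
# signals multicollinearity.
```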
Digital change detection is the process of determining the changes in land use and land cover properties with reference to geo-registered multi-temporal remote sensing data. In this research, change detection techniques were employed to detect changes in the marshes in the south of Iraq over two periods: the first from 1973 to 1984 and the second from 1973 to 2014. Three satellite images captured by Landsat in different periods were used. Preprocessing steps such as geo-registration, rectification, and mosaicking were carried out to prepare the satellite images for the monitoring process. Supervised classification techniques such as maximum likelihood classification were used to classify the studied area, and change detection aft…
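A minimal sketch of the per-pixel maximum likelihood classification named above, with synthetic two-band training samples and two class labels in place of the study's Landsat bands and land-cover classes.

```python
# Per-pixel Gaussian maximum likelihood classification with equal priors.
# ASSUMPTION: two synthetic bands and two classes replace the study's
# Landsat bands and land-cover classes.
import numpy as np

def train(samples):
    """samples: {class_name: (n_pixels, n_bands) array} -> per-class
    mean vector and covariance matrix."""
    return {c: (s.mean(axis=0), np.cov(s, rowvar=False))
            for c, s in samples.items()}

def classify(pixels, model):
    """Assign each pixel (row of 'pixels') to the class with the
    highest Gaussian log-likelihood."""
    names, scores = list(model), []
    for c in names:
        mu, cov = model[c]
        inv = np.linalg.inv(cov)
        _, logdet = np.linalg.slogdet(cov)
        d = pixels - mu
        maha = np.einsum("ij,jk,ik->i", d, inv, d)   # Mahalanobis term
        scores.append(-0.5 * (logdet + maha))
    return np.array(names)[np.argmax(scores, axis=0)]

rng = np.random.default_rng(1)
water = rng.normal([30.0, 20.0], 3.0, size=(50, 2))
marsh = rng.normal([60.0, 80.0], 5.0, size=(50, 2))
model = train({"water": water, "marsh": marsh})
print(classify(np.array([[32.0, 22.0], [58.0, 77.0]]), model))
```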