In this study, a genetic algorithm (GA) is used to detect damage in a curved beam model. The stiffness and mass matrices of the curved beam elements are formulated using Hamilton's principle. Each node of the curved beam element possesses seven degrees of freedom, including the warping degree of freedom. The curved beam element is derived based on Kang and Yoo's thin-walled curved beam theory. Damage identification is formulated as an optimization problem, and binary and continuous genetic algorithms (BGA, CGA) are used to detect and locate damage using two objective functions: the change in natural frequencies and the Modal Assurance Criterion (MAC). The results show that the objective function based on the change in natural frequencies is the better of the two: no error was recorded in predicting the damage location, and only a small error in estimating the damage magnitude. The results also show that the genetic algorithm method is efficient at indicating and quantifying single and multiple damage with high precision, and that the prediction errors for the CGA are smaller than the corresponding values for the BGA.
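The damage-identification loop described above can be sketched with a continuous GA. The snippet below is a minimal illustration, not the paper's model: it replaces the seven-DOF curved beam element with a two-DOF spring-mass chain (so the natural frequencies have a closed form) and treats damage as a per-element stiffness reduction factor. The population size, operators, stiffness values, and true damage pattern are all assumed for the sketch.

```python
import math, random

def frequencies(k1, k2, m=1.0):
    """Natural frequencies of a 2-DOF spring-mass chain (closed form)."""
    # Characteristic equation: m^2 w^4 - m (k1 + 2 k2) w^2 + k1 k2 = 0
    b = (k1 + 2.0 * k2) / m
    c = (k1 * k2) / m**2
    disc = math.sqrt(b * b - 4.0 * c)
    return math.sqrt((b - disc) / 2.0), math.sqrt((b + disc) / 2.0)

K0 = 1000.0                 # healthy element stiffness (assumed)
TRUE_DAMAGE = (0.30, 0.00)  # hypothetical 30 % stiffness loss in element 1
MEASURED = frequencies(K0 * (1 - TRUE_DAMAGE[0]), K0 * (1 - TRUE_DAMAGE[1]))

def objective(d):
    """Sum of squared differences between measured and predicted frequencies."""
    f = frequencies(K0 * (1 - d[0]), K0 * (1 - d[1]))
    return sum((fm - fp) ** 2 for fm, fp in zip(MEASURED, f))

def cga(pop_size=40, gens=120, seed=1):
    """Continuous GA: elitism, blend crossover, Gaussian mutation."""
    rng = random.Random(seed)
    pop = [[rng.random(), rng.random()] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=objective)
        elite = pop[: pop_size // 2]          # keep the better half
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            w = rng.random()                  # blend crossover
            child = [w * x + (1 - w) * y for x, y in zip(a, b)]
            i = rng.randrange(2)              # mutate one damage factor
            child[i] = min(0.99, max(0.0, child[i] + rng.gauss(0, 0.05)))
            children.append(child)
        pop = elite + children
    return min(pop, key=objective)

best = cga()
```

In this toy setting the two measured frequencies determine the two damage factors uniquely within [0, 1], which is why the frequency-change objective localizes damage so well in the study.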
Based on an analysis of the properties of Bernstein polynomials, the extended orthonormal Bernstein polynomials, defined on the interval [0, 1] for n = 7, are obtained. A method for computing the operational matrices of derivative and integration, D_b and R_(n+1)^B respectively, is presented. The results of the proposed method are compared with exact solutions to demonstrate the convergence and advantages of the new method.
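For the standard (non-orthonormal) Bernstein basis, the derivative operational matrix has a closed form, which gives a feel for how a matrix like D_b is built. The sketch below uses the classical basis B_{i,n}(x) = C(n,i) x^i (1-x)^(n-i), not the extended orthonormal variant of the abstract, and verifies the matrix against a finite-difference derivative at a sample point.

```python
from math import comb

N = 7  # same degree as in the abstract

def bernstein(i, n, x):
    """Classical Bernstein basis polynomial B_{i,n}(x) on [0, 1]."""
    return comb(n, i) * x**i * (1 - x)**(n - i)

def derivative_matrix(n):
    """Matrix D with d/dx [B_{0,n} ... B_{n,n}]^T = D [B_{0,n} ... B_{n,n}]^T.

    Derived from B'_{i,n} = n (B_{i-1,n-1} - B_{i,n-1}) followed by degree
    elevation of the degree-(n-1) basis back into degree n; the result is
    tridiagonal: D[i][i-1] = n-i+1, D[i][i] = 2i-n, D[i][i+1] = -(i+1)."""
    D = [[0.0] * (n + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        if i >= 1:
            D[i][i - 1] = float(n - i + 1)
        D[i][i] = float(2 * i - n)
        if i <= n - 1:
            D[i][i + 1] = -float(i + 1)
    return D

D = derivative_matrix(N)
x, h = 0.3, 1e-6
# Derivative via the operational matrix vs. a central finite difference
approx = [sum(D[i][k] * bernstein(k, N, x) for k in range(N + 1))
          for i in range(N + 1)]
exact = [(bernstein(i, N, x + h) - bernstein(i, N, x - h)) / (2 * h)
         for i in range(N + 1)]
```

The orthonormal case replaces this tridiagonal structure with a denser matrix obtained through the Gram-Schmidt change of basis, but the construction principle is the same.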
This study aims to conduct an exhaustive comparison between the performance of human translators and artificial intelligence-powered machine translation systems, specifically examining the top three systems: Spider-AI, Metacate, and DeepL. A variety of texts from distinct categories were evaluated to gain a profound understanding of the qualitative differences, as well as the strengths and weaknesses, between human and machine translations. The results demonstrated that human translation significantly outperforms machine translation, with larger gaps in literary texts and texts characterized by high linguistic complexity. However, the performance of machine translation systems, particularly DeepL, has improved and in some contexts
Regression analysis is a cornerstone of statistics. It most often relies on the ordinary least squares (OLS) method, but, as is well known, that method requires several conditions to hold if it is to give accurate results; otherwise the estimates can be unreliable, and the failure of certain conditions makes it impossible to complete the analysis with this method. Among those conditions is the absence of multicollinearity, and here that problem is detected among the independent variables using the Farrar-Glauber test. In addition, because the linearity requirement on the data and the last condition were not met, it was necessary to resort to the
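The first stage of the Farrar-Glauber procedure tests for overall multicollinearity with a chi-square statistic built from the determinant of the correlation matrix of the regressors, chi^2 = -[n - 1 - (2p + 5)/6] ln|R|, compared against a chi-square distribution with p(p-1)/2 degrees of freedom. A minimal self-contained sketch on synthetic data (sample size, noise level, and variable names are illustrative, not from the study):

```python
import math, random

def correlation_matrix(X):
    """Pearson correlation matrix of the columns of X (list of rows)."""
    n, p = len(X), len(X[0])
    means = [sum(row[j] for row in X) / n for j in range(p)]
    sds = [math.sqrt(sum((row[j] - means[j]) ** 2 for row in X))
           for j in range(p)]
    return [[sum((row[i] - means[i]) * (row[j] - means[j]) for row in X)
             / (sds[i] * sds[j]) for j in range(p)] for i in range(p)]

def det(M):
    """Determinant via Gaussian elimination with partial pivoting."""
    A = [row[:] for row in M]
    n, d = len(A), 1.0
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(A[r][c]))
        if abs(A[piv][c]) < 1e-12:
            return 0.0
        if piv != c:
            A[c], A[piv] = A[piv], A[c]
            d = -d
        d *= A[c][c]
        for r in range(c + 1, n):
            f = A[r][c] / A[c][c]
            for k in range(c, n):
                A[r][k] -= f * A[c][k]
    return d

def farrar_glauber_chi2(X):
    """Farrar-Glauber overall test; large values signal multicollinearity."""
    n, p = len(X), len(X[0])
    return -(n - 1 - (2 * p + 5) / 6.0) * math.log(det(correlation_matrix(X)))

rng = random.Random(0)
x1 = [rng.gauss(0, 1) for _ in range(100)]
x2 = [v + rng.gauss(0, 0.05) for v in x1]   # nearly collinear with x1
x3 = [rng.gauss(0, 1) for _ in range(100)]
stat = farrar_glauber_chi2(list(map(list, zip(x1, x2, x3))))
```

Because x2 tracks x1 almost exactly, |R| is close to zero and the statistic is far above any reasonable chi-square critical value, flagging severe multicollinearity.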
Digital change detection is the process of determining changes in land use and land cover properties from geo-registered, multi-temporal remote sensing data. In this research, change detection techniques were employed to detect changes in the marshes of southern Iraq over two periods, the first from 1973 to 1984 and the second from 1973 to 2014; three satellite images captured by Landsat at different dates were used. Preprocessing steps such as geo-registration, rectification, and mosaicking were performed to prepare the satellite images for monitoring. A supervised classification technique, maximum likelihood classification, was used to classify the study area, and change detection aft
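Maximum likelihood classification assigns each pixel to the class whose Gaussian likelihood, estimated from training data, is highest; post-classification comparison then flags pixels whose label differs between dates. The sketch below is illustrative only: the two-band class statistics are hypothetical, and a diagonal covariance (band independence) is assumed for brevity, whereas the full method uses covariance matrices estimated from training polygons.

```python
import math

# Hypothetical per-band (mean, std) training statistics for three classes;
# real values would be estimated from training polygons on the Landsat scene.
CLASSES = {
    "water":     ([0.10, 0.05], [0.03, 0.02]),
    "marsh":     ([0.25, 0.40], [0.05, 0.06]),
    "bare_soil": ([0.45, 0.30], [0.06, 0.05]),
}

def log_likelihood(pixel, mean, std):
    """Gaussian log-likelihood with diagonal covariance (simplification)."""
    return sum(-0.5 * math.log(2 * math.pi * s * s)
               - (x - m) ** 2 / (2 * s * s)
               for x, m, s in zip(pixel, mean, std))

def classify(pixel):
    """Maximum likelihood decision rule: pick the most likely class."""
    return max(CLASSES, key=lambda c: log_likelihood(pixel, *CLASSES[c]))

def change_map(image_t1, image_t2):
    """Post-classification comparison: flag pixels whose class changed."""
    return [(classify(p1), classify(p2), classify(p1) != classify(p2))
            for p1, p2 in zip(image_t1, image_t2)]
```

A pixel classified as water in 1973 and as bare soil in 2014 would be flagged as changed, which is exactly the marsh-desiccation signal the study monitors.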
Abstract Objective: The underlying molecular basis of ischemic heart disease (IHD) has not yet been studied among Iraqi people. This study determined the frequency and types of some cardiovascular genetic risk factors among Iraqi patients with IHD. Methods: This cross-sectional study recruited 56 patients with acute IHD during a 2-month period, excluding patients older than 50 years and patients with documented hyperlipidemia. Their ages ranged between 18 and 50 years; 54 were males and only 2 were females. Peripheral blood samples were taken from all patients for troponin I and DNA testing. Molecular analysis to detect 12 common cardiovascular genetic risk factors using the CVD StripAssay® (ViennaLab Diagnostics GmbH, Austria) was performed
With the development of communication technologies for mobile devices and of electronic communications, the world has moved toward e-government, e-commerce, and e-banking. It has become necessary to protect these activities from intrusion or misuse, so it is important to design powerful and efficient systems for this purpose. In this paper, several variants of the negative selection algorithm have been used: negative selection with real values, negative selection with fixed-radius detectors, and negative selection with variable-sized detectors, applied to misuse-type network intrusion detection, where the algorithm generates a set of detectors to distinguish the self samples. Practica
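Real-valued negative selection with fixed-radius detectors, one of the variants mentioned, can be sketched as follows. The feature space, self profile, detector count, and radius below are illustrative assumptions, not the paper's configuration: candidate detectors are drawn at random and kept only if they do not match any self sample; a new sample that activates a detector is classified as non-self (a potential intrusion).

```python
import math, random

def generate_detectors(self_samples, n_detectors, r, seed=0):
    """Fixed-radius negative selection: keep random candidate detectors
    that lie farther than r from every self sample."""
    rng = random.Random(seed)
    detectors = []
    while len(detectors) < n_detectors:
        cand = (rng.random(), rng.random())
        if all(math.dist(cand, s) > r for s in self_samples):
            detectors.append(cand)
    return detectors

def is_anomalous(x, detectors, r):
    """A sample within radius r of any detector is classified as non-self."""
    return any(math.dist(x, d) <= r for d in detectors)

# Hypothetical normalized 2-feature traffic profile: "self" is a small cluster
self_samples = [(0.1 + 0.02 * i, 0.1 + 0.015 * i) for i in range(10)]
r = 0.12
detectors = generate_detectors(self_samples, 300, r)
```

By construction no detector overlaps the self region, so normal traffic is never flagged, while points far from the self cluster fall inside some detector's radius with high probability.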
The current study was designed to investigate the presence of aflatoxin M1 in 25 samples of pasteurized canned milk collected randomly from Iraqi local markets, using an ELISA technique. Aflatoxin M1 was present in 21 samples, at concentrations ranging from 0.25 to 50 ppb. UV radiation (365 nm wavelength) was used to detoxify aflatoxin M1: the sample with the highest concentration (50 ppb) was treated in two different volumes (25 and 50 ml), for two exposure times (15 and 30 min), with distances of 30, 60, and 90 cm between the lamp and the milk layer. Results showed that the distance between the lamp and the milk layer was the most effective parameter in reducing aflatoxin M1, and that the greater the distance, the
Data-Driven Requirements Engineering (DDRE) represents a vision for a shift from static, traditional requirements engineering methods to dynamic, data-driven, user-centered methods. Given the data now available, and the increasingly complex requirements of software whose functions must adapt to changing needs in order to gain users' trust, an approach embedded in a continuous software engineering process is needed. This need drives the emergence of new challenges in the requirements engineering discipline. The problem addressed in this study was a method affected by data discrepancies, which hampered the elicitation process, so that in the end the developed software contained discrepancies and could not meet the need