This study focuses on improving the safety of embankment dams by considering the effects of vibration due to powerhouse operation on the dam body. The study contains two main parts. In the first part, ANSYS-CFX is used to create a three-dimensional (3D) finite volume (FV) model of one vertical Francis turbine unit. The 3D model is run under various reservoir conditions and unit dimensions. The Re-Normalization Group (RNG) k-ε turbulence model is employed, and the physical properties of water and the flow characteristics are defined in the turbine model. In the second part, a 3D finite element (FE) numerical model of a rock-fill dam is created using ANSYS®, considering the dam's connection with its powerhouse (represented by four vertical Francis turbines), its foundation, and the upstream reservoir. Changes in the upstream water table between minimum and maximum water levels, standard earth gravity, the fluid-solid interface, hydrostatic pressure, and the soil properties are all considered. The dam model is run to cover all possibilities for turbine operation in accordance with the reservoir discharge ranges. To minimize stresses in the dam body and increase dam safety, this study optimizes the turbine operating system by integrating the turbine and dam models.
A microbial study was conducted on 30 flour samples used in bakery ovens in various areas of the city of Baghdad. The conventional laboratory methods for microbial testing were compared with a modern technique using the BacTrac 3400 device, an impedance analyser supplied by the Austrian company SY-LAB. The results of the two approaches (the conventional method and the BacTrac device test) showed that the total counts of aerobic bacteria, coliform bacteria, Staphylococcus spp., Bacillus cereus, and yeasts and molds were mostly within the limits permitted by the Iraqi standard for grain and its products, and the samples were free of Salmonella spp. Screening by the BacTrac device also shortened the …
Researchers have shown increased interest in recent years in determining the optimum sample size needed to obtain sufficient accuracy in estimation and to obtain high-precision parameters when evaluating a large number of tests in the field of diagnosis at the same time. In this research, two methods were used to determine the optimum sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model is then estimated, at the sample size given by each method, on high-dimensional data using artificial intelligence, namely an artificial neural network (ANN), as it gives a high-precision estimate commensurate with the data.
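The Bennett-inequality route to a sample size can be sketched as follows. This is only an illustration of the general technique, not the paper's procedure: for observations bounded by b with variance σ², Bennett's inequality bounds P(|X̄ − μ| ≥ ε) by 2·exp(−(nσ²/b²)·h(bε/σ²)) with h(u) = (1+u)ln(1+u) − u, and can be inverted for the smallest n meeting a target failure probability δ (all parameter names here are my own choices):

```python
import math

def bennett_h(u):
    # h(u) = (1 + u) * ln(1 + u) - u, the function appearing in Bennett's exponent
    return (1 + u) * math.log(1 + u) - u

def bennett_sample_size(eps, delta, sigma2, b):
    """Smallest n with 2 * exp(-(n * sigma2 / b**2) * h(b * eps / sigma2)) <= delta.

    eps    : tolerated estimation error |sample mean - true mean|
    delta  : tolerated failure probability
    sigma2 : variance bound for each observation
    b      : absolute bound on each (centred) observation
    """
    n = (b ** 2) * math.log(2 / delta) / (sigma2 * bennett_h(b * eps / sigma2))
    return math.ceil(n)

# Example: observations bounded by 1, variance 0.25, 5% error, 95% confidence
n = bennett_sample_size(eps=0.05, delta=0.05, sigma2=0.25, b=1.0)
```

Tightening the error tolerance eps increases the required n, as expected from the bound.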
The study aims to discuss the relation between imported inflation and the international trade of the Iraqi economy for the period 1990-2015, using annual data. To achieve this aim, statistical and econometric methods are used, in particular the NARDL model, because it is designed to measure non-linear relations (and most economic relations are non-linear) and to separate the positive and negative effects of imported inflation. A deductive approach was adopted, using the descriptive method to describe and delimit the phenomenon, alongside an inductive approach using statistical and standard tools to obtain the standard model that explains the …
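The mechanism that lets a NARDL model capture asymmetric effects is the decomposition of a regressor into positive and negative partial-sum series. A minimal sketch of that decomposition (the series and names are illustrative stand-ins, not the study's data):

```python
import numpy as np

def partial_sums(x):
    """Decompose a series into cumulative positive and negative changes.

    Returns (x_pos, x_neg) with x_pos[t] = sum of max(dx, 0) up to t and
    x_neg[t] = sum of min(dx, 0) up to t, the partial sums used in NARDL
    to carry asymmetric (non-linear) effects of a regressor.
    """
    dx = np.diff(x, prepend=x[0])            # first differences (first change = 0)
    x_pos = np.cumsum(np.maximum(dx, 0.0))   # cumulative increases only
    x_neg = np.cumsum(np.minimum(dx, 0.0))   # cumulative decreases only
    return x_pos, x_neg

# Illustrative imported-inflation series (made-up numbers)
infl = np.array([2.0, 3.5, 3.0, 4.2, 4.0])
pos, neg = partial_sums(infl)
```

By construction the two partial sums add back to the total cumulative change, pos + neg = x − x[0], and each then enters the regression with its own coefficient.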
ABSTRACT
This study was conducted to determine the effect of various levels of hump fat (HF) used in the manufacture of camel, beef, and chicken sausage, and to understand the effect of HF on the physicochemical composition of the sausage. Different levels of hump fat (5, 7, and 10%) were used, and physicochemical properties (moisture, protein, fat, ash, water-holding capacity, shrinkage, cooking loss, and pH) were determined. The results revealed that moisture content showed highly significant differences (P≤0.01) among the treatment groups: camel sausage and beef sausage tended to have the highest values, while chicken sausage had the lowest. The study showed no significant difference (P≤0.05) among the …
Background: Many types of instruments and techniques are used in the instrumentation of the root canal system. These instruments and techniques may extrude debris beyond the apical foramen and may cause post-instrumentation complications. The aim of this study was to evaluate the amount of apically extruded debris produced by four types of nickel-titanium instruments (WaveOne, TRUShape 3D Conforming Files, HyFlex CM, and One Shape files) during endodontic instrumentation. Materials and methods: Forty freshly extracted human mandibular second premolars with straight canals and a single apex were collected for this study. All teeth were cut to similar lengths. Pre-weighed glass vials were used as collecting containers. Samples were randomly …
As we live in the era of the fourth technological revolution, it has become necessary to use artificial intelligence to generate electric power through sustainable solar energy, especially in Iraq, which suffers from a severe shortage of electric power because of the crises and wars it has gone through. The impact of that period is still evident in all aspects of the daily life of Iraqis because of the remnants of wars, siege, terrorism, the wrong policies of earlier and later ruling governments, and regional interventions and their consequences, such as the destruction of electric power stations, together with population growth, which must be matched by an increase in electric power stations …
... Show MoreNeural cryptography deals with the problem of “key exchange” between two neural networks by using the mutual learning concept. The two networks exchange their outputs (in bits) and the key between two communicating parties ar eventually represented in the final learned weights, when the two networks are said to be synchronized. Security of neural synchronization is put at risk if an attacker is capable of synchronizing with any of the two parties during the training process.
Diabetes is an increasingly common chronic disease, affecting millions of people around the world. Diabetes diagnosis, prediction, proper treatment, and management are essential. Machine-learning-based prediction techniques for diabetes data analysis can help in the early detection and prediction of the disease and its consequences, such as hypo-/hyperglycemia. In this paper, we explored a diabetes dataset collected from the medical records of one thousand Iraqi patients. We applied three classifiers: the multilayer perceptron, k-nearest neighbours (KNN), and the random forest. We conducted two experiments: the first used all 12 features of the dataset, where the random forest outperformed the others with 98.8% accuracy; the second used only five attributes …
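As an illustration of one of the classifiers named above, here is a minimal k-nearest-neighbours classifier in plain NumPy. The data are synthetic stand-ins, not the Iraqi patient records, and the feature interpretation in the comments is invented for the example:

```python
import numpy as np

def knn_predict(X_train, y_train, X_new, k=3):
    """Majority-vote KNN: label each new point by its k nearest training points."""
    preds = []
    for x in X_new:
        dists = np.linalg.norm(X_train - x, axis=1)   # Euclidean distances
        nearest = y_train[np.argsort(dists)[:k]]      # labels of the k closest
        preds.append(np.bincount(nearest).argmax())   # majority vote
    return np.array(preds)

# Toy data: two clusters standing in for (e.g.) two normalized clinical features
X_train = np.array([[0.10, 0.20], [0.20, 0.10], [0.15, 0.15],
                    [0.90, 0.80], [0.80, 0.90], [0.85, 0.85]])
y_train = np.array([0, 0, 0, 1, 1, 1])   # 0 = non-diabetic, 1 = diabetic

y_pred = knn_predict(X_train, y_train, np.array([[0.12, 0.18], [0.88, 0.82]]))
# → array([0, 1])
```

Feature scaling matters for KNN, since the Euclidean distance weights all features equally; in practice each attribute would be normalized first.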
The penalized least squares method is a popular method for dealing with high-dimensional data, where the number of explanatory variables is larger than the sample size. Its attractive properties are high prediction accuracy and the ability to perform estimation and variable selection at once. The penalized least squares method gives a sparse model, that is, a model with few variables, which can be interpreted easily. Penalized least squares is not robust, meaning it is very sensitive to the presence of outlying observations; to deal with this problem, a robust loss function can be used to obtain a robust penalized least squares method and a robust penalized estimator …
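A minimal sketch of the penalized least squares idea: the lasso (L1-penalized least squares) fitted by cyclic coordinate descent with soft-thresholding, which drives some coefficients exactly to zero and so performs variable selection. The toy data and all names are illustrative; the robust variant discussed above would replace the squared loss with a robust loss such as Huber's, which this sketch does not do:

```python
import numpy as np

def soft_threshold(z, lam):
    # S(z, lam) = sign(z) * max(|z| - lam, 0), the lasso shrinkage operator
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Lasso via cyclic coordinate descent.

    Minimizes (1/2n) * ||y - X b||^2 + lam * ||b||_1,
    assuming the columns of X (and y) are roughly centred.
    """
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]      # partial residual excluding j
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
    return b

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.standard_normal(100)  # 2 true signals
b = lasso_cd(X, y, lam=0.5)                     # noise coefficients shrink to 0
```

The three noise coefficients come out exactly zero, illustrating the sparsity (variable selection) property, while the two true signals survive, shrunk toward zero by roughly the penalty level.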