Forest fires increase during the dry season and are difficult to stop. High dry-season temperatures raise the drought index, so the forest is at risk of burning at any time, and the government must maintain surveillance throughout the season. Continuous surveillance with no focus on particular times is ineffective and inefficient, because preventive measures are carried out without knowledge of the actual fire risk. In the Keetch-Byram Drought Index (KBDI), the drought factor formulation only computes today's drought from current weather conditions and yesterday's drought index; predicting the drought factor for the following day requires weather data that are not yet available. Therefore, an algorithm is needed that can predict the drought factor, so that the periods of greatest fire potential during the dry season can be anticipated and daily predictions can guide well-targeted preventive measures. The method used in this study is the backpropagation algorithm, which is used to calculate, train on, and test the drought factors. Using empirical data, part of the data was used for training and then tested until 100% of the training data was recognized correctly; a further set of data tested without training matched in 60% of cases. Overall, the algorithm shows promising results and can be extended with additional supporting variables.
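The daily KBDI update underlying the drought factor can be sketched in code. The sketch below follows the commonly published Keetch-Byram formulation (maximum temperature in °F, rainfall in inches, index in the range 0-800); the function name and the 0.20-inch runoff-threshold default are illustrative assumptions, not taken from the paper:

```python
import math

def kbdi_update(q_prev, max_temp_f, rain_in, annual_rain_in,
                runoff_threshold=0.20):
    """One daily step of the Keetch-Byram Drought Index.

    q_prev         : yesterday's index (0 = saturated soil, 800 = extreme drought)
    max_temp_f     : today's maximum temperature (Fahrenheit)
    rain_in        : today's rainfall (inches)
    annual_rain_in : mean annual rainfall at the site (inches)
    """
    # Net rainfall above the runoff threshold reduces the moisture deficit,
    # 100 index points per inch of effective rain.
    net_rain = max(0.0, rain_in - runoff_threshold)
    q = max(0.0, q_prev - 100.0 * net_rain)

    # Daily drought factor (the evapotranspiration term of the KBDI).
    drought_factor = ((800.0 - q)
                      * (0.968 * math.exp(0.0486 * max_temp_f) - 8.30)
                      * 1e-3
                      / (1.0 + 10.88 * math.exp(-0.0441 * annual_rain_in)))
    return min(800.0, q + max(0.0, drought_factor))
```

A network trained by backpropagation, as in the study, would then learn to predict this drought factor from past weather instead of computing it from not-yet-available observations.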
Abstract---The aim of the current research is to identify the level of logical reasoning skills among chemistry students at the Faculty of Education for Pure Sciences / Ibn Al-Haytham for the academic year (2021-2022), and the differences in their skill level according to the gender variable (males and females) and the academic stage (first - second - third - fourth). The descriptive approach was adopted because it corresponds to the nature of the research objectives. The research sample consisted of (400) students selected in a relatively random stratified way. The researcher constructed a logical reasoning test comprising (6) sub-skills (proportional - probabilistic - synthetic - deductive - logical - variable adjustment). The psych
Background: Irrigation of the canal system permits removal of residual tissue in canal anatomy that cannot be reached by instrumentation of the main canals. The aim of this study was to evaluate and compare the efficiency of a conventional irrigation system, the EndoActivator sonic irrigation system, P5 Newtron Satelec passive ultrasonic irrigation, and the EndoVac irrigation system in removing dentin debris at three levels of the root canal, and to compare the percentage of dentin debris among the three levels for each irrigation system. Materials and methods: Forty extracted premolars with approximately straight single root canals were randomly distributed into 4 test groups of 10 teeth each. All canals were prepared with ProTaper Universal ha
Objectives: To study the prevalence of the rs1799964 (-1031 T/C) and rs361525 (-238 G/A) SNPs and their effect on disease activity, severity, and cytokine production in newly diagnosed Iraqi rheumatoid arthritis patients. Patients and Methods: Sixty-three patients were diagnosed by a specialist physician while attending the rheumatology unit, and twenty controls participated. Inflammatory markers were measured, and PCR amplification and sequencing were performed to identify the TNF-α SNPs. Results: For the (-1031 C/T) SNP, the TT genotype and the C allele were significantly more frequent in the controls, while the CT genotype was distributed significantly in the patients. The TT genotype was mostly distributed in the mild-moder
Incremental forming is a flexible sheet metal forming process performed by using simple tools to locally deform a sheet of metal along a predefined tool path without the use of dies. This work presents the single-point incremental forming process for producing a pyramid geometry and studies the effect of tool geometry, tool diameter, and spindle speed on the residual stresses. The residual stresses were measured with an ORIONRKS 6000 measuring instrument, used at four angles (0º, 15º, 30º, and 45º), and the average residual stress was determined; the residual stress in the original blanks was (10.626 MPa). X-ray diffraction technology was used to measure the residual stresses
Diabetes is an increasingly common chronic disease, affecting millions of people around the world. Diagnosis, prediction, proper treatment, and management of diabetes are essential. Machine learning-based prediction techniques for diabetes data analysis can help in early detection of the disease and of its consequences, such as hypo-/hyperglycemia. In this paper, we explored a diabetes dataset collected from the medical records of one thousand Iraqi patients. We applied three classifiers: the multilayer perceptron, KNN, and Random Forest. We conducted two experiments: the first used all 12 features of the dataset, where Random Forest outperformed the others with 98.8% accuracy. The second experiment used only five att
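As a toy illustration of one of the classifiers compared above, here is a minimal k-nearest-neighbours sketch in plain Python. The study itself almost certainly used a library implementation on the real 12-feature records; the two-feature points below are invented for illustration:

```python
import math
from collections import Counter

def knn_predict(train_x, train_y, query, k=3):
    """Classify `query` by majority vote among its k nearest
    training points, using Euclidean distance."""
    # Sort all training points by distance to the query.
    dists = sorted((math.dist(x, query), y)
                   for x, y in zip(train_x, train_y))
    # Majority vote over the k closest labels.
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]
```

For example, with one cluster of points labelled 0 near the origin and another labelled 1 near (5, 5), a query close to a cluster is assigned that cluster's label.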
Shadow detection and removal is an important task when dealing with color outdoor images. Shadows are generated by a local, relative absence of light: first, a local decrease in the amount of light that reaches a surface; second, a local change in the amount of light reflected by a surface toward the observer. Most shadow detection and segmentation methods are based on image analysis, but some factors can affect the detection result due to the complexity of the circumstances. In this paper, a segmentation-based method is presented to detect shadows in an image, and a function concept is used to remove the shadow from the image.
The penalized least squares method is a popular way to deal with high-dimensional data, where the number of explanatory variables is larger than the sample size. Penalized least squares offers high prediction accuracy and performs estimation and variable selection at once. It also gives a sparse model, that is, a model with few variables, which can be interpreted easily. However, penalized least squares is not robust: it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain the robust penalized least squares method and a robust penalized estimator and
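The sparsity described above comes from the L1 penalty. A minimal coordinate-descent lasso sketch (assuming the columns of X are normalized to unit norm; function names are illustrative) shows how soft-thresholding sets small coefficients exactly to zero. A robust variant, as the abstract suggests, would replace the squared loss with, e.g., a Huber loss; that is not shown here:

```python
import numpy as np

def soft_threshold(z, lam):
    """Proximal operator of the L1 penalty: shrink z toward 0 by lam."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent lasso: minimize 0.5*||y - Xb||^2 + lam*||b||_1.
    Assumes each column of X has unit squared norm, so the per-coordinate
    update is a single soft-threshold of the partial correlation."""
    n, p = X.shape
    b = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual with coordinate j removed.
            r = y - X @ b + X[:, j] * b[j]
            b[j] = soft_threshold(X[:, j] @ r, lam)
    return b
```

On an orthonormal design the lasso solution is exactly the soft-thresholded least squares solution, so coefficients smaller than the penalty level are driven to exactly zero, yielding the sparse, interpretable model the abstract describes.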
Compressing speech reduces data storage requirements, which in turn reduces the time needed to transmit digitized speech over long-haul links such as the internet. To obtain the best performance in speech compression, wavelet transforms require filters that combine a number of desirable properties, such as orthogonality and symmetry. The MCT basis functions are derived from the GHM basis functions using 2D linear convolution. The fast computation methods introduced here add desirable features to the transform. We further assess the performance of the MCT in a speech compression application. This paper discusses the effect of using the DWT and the MCT (one- and two-dimensional) on speech compression. DWT and MCT performances in terms of comp
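As a simplified illustration of wavelet-based compression, the sketch below uses a one-level orthonormal Haar transform (not the MCT/GHM construction from the paper) and zeroes out small detail coefficients; the fraction of nonzero coefficients kept serves as a crude compression ratio:

```python
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal Haar wavelet transform."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)  # low-pass (pair averages)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)  # high-pass (pair differences)
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of one Haar level (perfect reconstruction)."""
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2)
    x[1::2] = (approx - detail) / np.sqrt(2)
    return x

def compress(x, threshold):
    """Zero out small detail coefficients; return the reconstruction
    and the fraction of coefficients kept."""
    a, d = haar_dwt(x)
    d_kept = np.where(np.abs(d) >= threshold, d, 0.0)
    kept = (np.count_nonzero(a) + np.count_nonzero(d_kept)) / len(x)
    return haar_idwt(a, d_kept), kept
```

For a smooth signal the detail coefficients are small, so half the coefficients can be discarded with little reconstruction error; speech codecs exploit the same principle with better-suited filter banks.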
As we live in the era of the fourth technological revolution, it has become necessary to use artificial intelligence to generate electric power from sustainable solar energy, especially in Iraq, which suffers from a severe shortage of electric power because of the crises and wars it has gone through. Their impact is still evident in all aspects of the daily life of Iraqis: the remnants of wars, siege, and terrorism, the wrong policies of past and later governments, and regional interventions and their consequences, such as the destruction of electric power stations, together with population growth, which must be met by an increase in electric power stations,
Neural cryptography deals with the problem of key exchange between two neural networks using the mutual learning concept. The two networks exchange their outputs (in bits), and the key shared between the two communicating parties is eventually represented in the final learned weights, once the two networks are said to be synchronized. The security of neural synchronization is put at risk if an attacker is capable of synchronizing with either of the two parties during the training process.
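Neural key exchange is usually described with tree parity machines. The sketch below, with hypothetical class and parameter names, shows the structure: each party computes tau as the product of its hidden-unit signs and applies a Hebbian update only when the two outputs agree; once synchronized, the machines remain identical, and their common weights serve as the key:

```python
import numpy as np

class TreeParityMachine:
    """K hidden units, N inputs each, integer weights in [-L, L]."""
    def __init__(self, k=3, n=4, l=3, rng=None):
        self.k, self.n, self.l = k, n, l
        rng = rng or np.random.default_rng()
        self.w = rng.integers(-l, l + 1, size=(k, n))

    def output(self, x):
        """x has shape (k, n) with +-1 entries; returns (tau, sigma)."""
        sigma = np.sign(np.sum(self.w * x, axis=1))
        sigma[sigma == 0] = -1                 # break ties deterministically
        return int(np.prod(sigma)), sigma

    def hebbian_update(self, x, tau_self, tau_other, sigma):
        """Learn only when the parties' outputs agree, and only in the
        hidden units that agree with the common output."""
        if tau_self != tau_other:
            return
        for i in range(self.k):
            if sigma[i] == tau_self:
                self.w[i] = np.clip(self.w[i] + sigma[i] * x[i],
                                    -self.l, self.l)
```

Because the update depends only on the public input, the public outputs, and each party's own state, two already-synchronized machines make identical moves forever; an attacker tries to reach that same state by eavesdropping, which is the risk noted above.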