The effect of initial pressure on the laminar flame speed of methane-air mixtures has been investigated experimentally over a wide range of equivalence ratios. In this work, a measurement system was designed to measure the laminar flame speed using the constant-volume method with a thermocouple technique. The laminar burning velocity was obtained using the density ratio method. Comparison of the present results with previous ones shows good agreement, indicating that the measurements and calculations employed in this work are reliable and precise.
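The density ratio method mentioned above relates the observed flame propagation speed to the burning velocity through mass conservation across the flame front. The sketch below illustrates that relation with assumed, illustrative numbers, not measured values from the study:

```python
# Hypothetical sketch of the density ratio method: the laminar burning
# velocity S_u follows from the observed flame speed S_b and the
# burned/unburned gas density ratio. All numbers here are illustrative.

def burning_velocity(s_b, rho_b, rho_u):
    """S_u = S_b * (rho_b / rho_u): mass conservation across the flame front."""
    return s_b * rho_b / rho_u

# At constant pressure the ideal-gas law gives rho_b/rho_u ~ T_u/T_b, so the
# density ratio can be estimated from the unburned and burned temperatures.
T_u = 298.0    # unburned gas temperature, K (assumed)
T_b = 2220.0   # burned gas temperature, K (assumed, near-adiabatic)
s_b = 2.5      # observed flame speed, m/s (assumed)

s_u = burning_velocity(s_b, rho_b=T_u, rho_u=T_b)  # ratio reduces to T_u/T_b
print(f"estimated laminar burning velocity: {s_u:.3f} m/s")
```

The temperatures and flame speed are placeholders; in the constant-volume method the density ratio would come from the measured pressure and temperature records.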
This study examines the geographical distribution of electric power plants in Iraq, excluding the governorates of the Kurdistan Region (Dohuk, Erbil, Sulaymaniyah) due to a lack of data.
To achieve this goal, the research relied on mathematical equations and statistical methods to determine how these plants (gas, hydroelectric, steam, diesel) are distributed geographically within the provinces and how concentrated they are, as well as to classify the power plants in Iraq so that their distribution can be understood in a scientific and objective manner.
The most important result of the research is that a number of factors have led to the irregular distribution of these plants.
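The abstract does not name the specific statistical measures used; one common way to quantify how concentrated facilities are across administrative units is the Gini concentration index. The sketch below is a hypothetical illustration with invented province counts, not data from the study:

```python
# Hypothetical sketch: Gini concentration index for power plants per
# province. The counts are invented for illustration; the study's actual
# measures and data are not reproduced here.

def gini(counts):
    """Gini index of non-negative counts (0 = even spread, toward 1 = concentrated)."""
    xs = sorted(counts)
    n = len(xs)
    total = sum(xs)
    # Standard formula based on the rank-weighted sum of the ordered values.
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return (2.0 * weighted) / (n * total) - (n + 1.0) / n

plants_per_province = [12, 3, 1, 8, 2, 5]  # assumed counts
print(f"Gini index: {gini(plants_per_province):.3f}")
```

An even distribution yields a Gini index of 0, so larger values indicate the kind of irregular concentration the abstract describes.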
The complexity and variety of language in policy and academic documents make the automatic classification of research papers against the United Nations Sustainable Development Goals (SDGs) difficult. Using both pre-trained and contextual word embeddings to increase semantic understanding, this study presents a complete deep learning pipeline combining Bidirectional Long Short-Term Memory (BiLSTM) and Convolutional Neural Network (CNN) architectures, aiming primarily to improve the comprehensibility and accuracy of SDG text classification and thereby enable more effective policy monitoring and research evaluation. Document representation via Global Vectors (GloVe) and Bidirectional Encoder Representations from Transformers (BERT) …
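To make the BiLSTM + CNN combination concrete, here is a toy forward pass in plain NumPy: random placeholder weights, an assumed sequence length and embedding size, and 17 output classes (one per SDG). It is a shape-level sketch of the architecture, not the study's trained model:

```python
import numpy as np

# Hypothetical toy forward pass of a BiLSTM followed by a 1-D CNN over word
# embeddings. Dimensions, weights, and inputs are random placeholders.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_pass(xs, d):
    """Run a single-direction LSTM over a sequence of embedding vectors."""
    e = xs.shape[1]
    # One stacked weight matrix for the input, forget, output, and cell gates.
    W = rng.normal(scale=0.1, size=(4 * d, e + d))
    b = np.zeros(4 * d)
    h, c, out = np.zeros(d), np.zeros(d), []
    for x in xs:
        z = W @ np.concatenate([x, h]) + b
        i, f, o, g = (sigmoid(z[:d]), sigmoid(z[d:2*d]),
                      sigmoid(z[2*d:3*d]), np.tanh(z[3*d:]))
        c = f * c + i * g
        h = o * np.tanh(c)
        out.append(h)
    return np.stack(out)                                   # (seq_len, d)

seq_len, emb_dim, hidden, n_classes = 12, 50, 16, 17       # 17 SDGs
embeddings = rng.normal(size=(seq_len, emb_dim))           # e.g. GloVe vectors

# BiLSTM: forward pass plus backward pass, concatenated per time step.
fwd = lstm_pass(embeddings, hidden)
bwd = lstm_pass(embeddings[::-1], hidden)[::-1]
states = np.concatenate([fwd, bwd], axis=1)                # (seq_len, 2*hidden)

# 1-D convolution (kernel width 3) over the BiLSTM states, then max pooling.
kernel = rng.normal(scale=0.1, size=(3, 2 * hidden, 32))
conv = np.stack([np.einsum('kc,kcf->f', states[t:t+3], kernel)
                 for t in range(seq_len - 2)])             # (seq_len-2, 32)
pooled = np.maximum(conv, 0).max(axis=0)                   # ReLU + global max pool

# Softmax classifier over the SDG classes.
logits = rng.normal(scale=0.1, size=(n_classes, 32)) @ pooled
probs = np.exp(logits - logits.max())
probs /= probs.sum()
print(probs.shape)
```

In a real pipeline the embeddings would come from GloVe or BERT and the weights would be learned; the sketch only shows how the two architectures compose.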
Radiotherapy is the medical use of ionizing radiation, commonly applied to cancerous tumors because of its ability to control cell growth. The amount of radiation used in photon radiation therapy is called the dose (measured in gray), and it depends on the type and stage of the cancer being treated. In this work, we studied the dose distribution delivered to the tumor at different depths (0-20 cm) and with different field sizes (4×4 to 23×23 cm). The results show that deeper treated areas receive a lower dose rate at the same beam quality and quantity. It was also noted that increasing the field size increases the depth dose at the same depth, even when the radiation energy is constant. This increase in radiation dose is attributed to the scattered …
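A much-simplified way to see why dose falls with depth is exponential attenuation of the primary photon beam. The sketch below uses an assumed effective attenuation coefficient and ignores buildup, inverse-square falloff, and scatter (the field-size effect the abstract describes), so it is an illustration rather than the study's dosimetry:

```python
import math

# Hypothetical illustration: primary-beam exponential attenuation with depth.
# mu is an assumed effective linear attenuation coefficient; real
# percentage-depth-dose curves also include buildup, inverse-square falloff,
# and scatter (which grows with field size).

mu = 0.05          # 1/cm, assumed effective coefficient (illustrative)
d0 = 100.0         # surface dose, arbitrary units

def primary_dose(depth_cm):
    return d0 * math.exp(-mu * depth_cm)

for depth in (0, 5, 10, 20):
    print(f"depth {depth:2d} cm -> relative dose {primary_dose(depth):6.1f}")
```

Even this crude model reproduces the qualitative trend in the results: at fixed beam quality, the dose decreases monotonically with depth.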
In the present work, a kinetic study was performed on the extraction of phosphate from Iraqi Akashat phosphate ore using an organic acid. Leaching with lactic acid was studied for the separation of calcareous materials (mainly calcite). The reaction conditions were 2 wt% acid concentration and a 5 ml/g ratio of acid volume to ore weight. The reaction time was varied from 2 to 30 minutes (in 2-minute steps) to determine the reaction rate constant k from the change in calcite concentration. To determine the activation energy, a further investigation was carried out with the reaction temperature varied from 25 to 65 °C. From the kinetic data, it was found that selective leaching was controlled by surface chemical reaction …
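In a study like this, the activation energy is typically extracted from an Arrhenius plot of ln k against 1/T over the tested temperature range. The sketch below uses synthetic rate constants generated from an assumed activation energy, not the paper's data, to show the fitting step:

```python
import math

# Hypothetical Arrhenius fit: ln k = ln A - Ea / (R * T). The rate constants
# below are synthetic values generated for illustration, not measured data
# from the leaching experiments.

R = 8.314            # J/(mol*K)
temps_c = [25, 35, 45, 55, 65]
Ea_true = 40_000.0   # J/mol, assumed for the synthetic data
A = 1.0e4            # assumed pre-exponential factor

ks = [A * math.exp(-Ea_true / (R * (t + 273.15))) for t in temps_c]

# Least-squares slope of ln k versus 1/T recovers -Ea/R.
xs = [1.0 / (t + 273.15) for t in temps_c]
ys = [math.log(k) for k in ks]
n = len(xs)
x_mean, y_mean = sum(xs) / n, sum(ys) / n
slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) \
        / sum((x - x_mean) ** 2 for x in xs)
Ea_fit = -slope * R
print(f"fitted activation energy: {Ea_fit / 1000:.1f} kJ/mol")
```

Because the synthetic data are exactly Arrhenius, the fit returns the assumed 40 kJ/mol; with real measurements the scatter of the points around the line indicates the quality of the kinetic model.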
Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data pose many challenges for data analysis. One possible solution is to summarize the data and provide a manageable structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms, such as …
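Entropy discretization of a numeric feature usually means choosing the cut point that minimizes the class-weighted entropy of the two resulting intervals. A minimal sketch of that single step, on invented data rather than the paper's multi-resolution summarization structure, is:

```python
import math
from collections import Counter

# Hypothetical sketch of one step of entropy-based discretization: pick the
# boundary on a numeric feature that minimizes the weighted class entropy of
# the two resulting intervals. The data below is invented for illustration.

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_cut(values, labels):
    pairs = sorted(zip(values, labels))
    n = len(pairs)
    best = (float('inf'), None)
    # Candidate cuts are midpoints between consecutive distinct values.
    for i in range(1, n):
        if pairs[i - 1][0] == pairs[i][0]:
            continue
        left = [l for _, l in pairs[:i]]
        right = [l for _, l in pairs[i:]]
        score = (len(left) * entropy(left) + len(right) * entropy(right)) / n
        cut = (pairs[i - 1][0] + pairs[i][0]) / 2.0
        best = min(best, (score, cut))
    return best  # (weighted entropy, cut point)

values = [1.0, 1.5, 2.0, 7.0, 8.0, 9.0]
labels = ['a', 'a', 'a', 'b', 'b', 'b']
print(best_cut(values, labels))  # perfectly separable classes -> entropy 0.0
```

In the summarization setting described above, the same criterion would be evaluated against class counts held in the summary structure instead of raw records, which is what makes the discretization scalable.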