For the most reliable and reproducible results when calibrating or testing mixtures of two immiscible liquids, such as water in engine oil, good emulsification is vital. This study explores the impact of emulsion quality on Fourier transform infrared (FT-IR) spectroscopy calibration standards for measuring water contamination in used or in-service engine oil, with the aim of strengthening the sample-preparation guidelines of the relevant ASTM International standards. Using different emulsification techniques and readily available laboratory equipment, this work attempts to establish the ideal sample preparation technique for reliable, repeatable, and reproducible FT-IR analysis while still considering …
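Although this excerpt focuses on emulsification, the calibration step it supports is typically a linear fit of FT-IR absorbance against known water concentrations. A minimal sketch, with illustrative (not measured) values and an assumed absorbance band, might look like:

```python
# Hedged sketch of an FT-IR water-in-oil calibration curve: a linear fit of measured
# water-peak absorbance against the known concentrations of prepared emulsion standards.
# The concentrations, absorbances, and band location are illustrative, not the study's data.
import numpy as np

water_ppm = np.array([0, 500, 1000, 2000, 4000])       # prepared emulsion standards
peak_abs = np.array([0.02, 0.11, 0.21, 0.40, 0.81])    # absorbance near the O-H band (assumed)
slope, intercept = np.polyfit(water_ppm, peak_abs, 1)  # linear calibration line

def water_content(absorbance):
    """Invert the calibration line to estimate ppm water in an unknown sample."""
    return (absorbance - intercept) / slope

print(round(water_content(0.30)))  # estimated ppm for a measured absorbance of 0.30
```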
Image compression is a serious issue in computer storage and transmission. It makes efficient use of the redundancy embedded within an image itself and may also exploit the limitations of human vision or perception to discard imperceptible information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy embedded within the image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage technique is proposed: the first stage utilizes a lossy predictor model along with a multiresolution base and thresholding techniques, and the second stage incorporates near-lossless compression …
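The truncation hides the exact two-stage scheme, but the core modelling idea (a low-order polynomial approximation per block plus a thresholded residual) can be sketched as follows; the first-order model, 8x8 block size, and threshold value are illustrative assumptions, not the paper's parameters.

```python
# Minimal sketch of the polynomial-model-plus-residual idea: each image block is
# approximated by a low-order 2D polynomial surface, and only the model coefficients
# plus a thresholded residual are kept (the thresholding is the lossy stage).
import numpy as np

def encode_block(block, threshold=4):
    h, w = block.shape
    y, x = np.mgrid[0:h, 0:w]
    # Design matrix for an assumed first-order model: a0 + a1*x + a2*y
    A = np.column_stack([np.ones(h * w), x.ravel(), y.ravel()])
    coeffs, *_ = np.linalg.lstsq(A, block.ravel().astype(float), rcond=None)
    residual = block.astype(float) - (A @ coeffs).reshape(h, w)
    residual[np.abs(residual) < threshold] = 0   # lossy thresholding of small residuals
    return coeffs, residual.astype(np.int16)

def decode_block(coeffs, residual):
    h, w = residual.shape
    y, x = np.mgrid[0:h, 0:w]
    A = np.column_stack([np.ones(h * w), x.ravel(), y.ravel()])
    model = (A @ coeffs).reshape(h, w)
    return np.clip(np.rint(model + residual), 0, 255).astype(np.uint8)

block = np.random.randint(0, 256, (8, 8), dtype=np.uint8)
coeffs, residual = encode_block(block)
reconstructed = decode_block(coeffs, residual)
```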
In regression testing, test case prioritization (TCP) is a technique for ordering all the available test cases. TCP techniques can improve fault detection performance, which is measured by the average percentage of faults detected (APFD). History-based TCP is one of the TCP techniques that consider the history of past executions to prioritize test cases. The allocation of equal priority to test cases is a common problem for most TCP techniques; however, this problem has not been explored in history-based TCP techniques. To work around it in regression testing, most researchers resort to random sorting of test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement …
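For reference, the APFD metric and a simple history-based ordering (most past failures first) can be sketched as below; the fault matrix and execution history are illustrative, and the paper's own tie-breaking strategy is not visible in this excerpt.

```python
# Hedged sketch: APFD and a history-based prioritization that ranks tests by past failures.
def apfd(ordering, fault_matrix):
    """ordering: list of test ids; fault_matrix: {test_id: set of faults it detects}."""
    faults = set().union(*fault_matrix.values())
    n, m = len(ordering), len(faults)
    first_positions = []
    for fault in faults:
        # 1-based position of the first test in the ordering that reveals this fault
        pos = next(i + 1 for i, t in enumerate(ordering) if fault in fault_matrix[t])
        first_positions.append(pos)
    return 1 - sum(first_positions) / (n * m) + 1 / (2 * n)

def history_based_order(history):
    """history: {test_id: list of past pass/fail flags (1 = failed)}; ties keep dict order."""
    return sorted(history, key=lambda t: sum(history[t]), reverse=True)

history = {"t1": [0, 1, 1], "t2": [1, 1, 1], "t3": [0, 0, 1]}
faults = {"t1": {"f2"}, "t2": {"f1"}, "t3": {"f1", "f3"}}
order = history_based_order(history)
print(order, apfd(order, faults))   # ['t2', 't1', 't3'] 0.5
```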
Early detection of brain tumors is critical for enhancing treatment options and extending patient survival. Magnetic resonance imaging (MRI) provides more detailed information, such as greater contrast and clarity, than any other scanning method. Manually segmenting brain tumors from the many MRI images collected in clinical practice for cancer diagnosis is a difficult and time-consuming task. Brain tumors can be detected in MRI scans using algorithms and machine learning technologies, which makes the process easier for doctors, because an MRI image can appear healthy even when the patient has a tumor or malignancy. Recently, deep learning techniques based on deep convolutional neural networks have been used to analyze medical …
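A minimal sketch of a convolutional classifier of the kind referred to here is shown below; the architecture, the 128x128 grayscale input, and the two-class output are assumptions, not the paper's actual network.

```python
# Hedged sketch of a small CNN for tumor / no-tumor classification of MRI slices.
import torch
import torch.nn as nn

class TumorCNN(nn.Module):
    def __init__(self, num_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(), nn.Linear(32 * 32 * 32, 64), nn.ReLU(), nn.Linear(64, num_classes),
        )

    def forward(self, x):              # x: (batch, 1, 128, 128) MRI slices
        return self.classifier(self.features(x))

model = TumorCNN()
dummy = torch.randn(4, 1, 128, 128)    # placeholder batch, not real patient data
print(model(dummy).shape)              # torch.Size([4, 2]) class logits
```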
Psychological research centers provide indirect contact with professionals from the fields of human life, the job environment, family life, and psychological infrastructure for psychiatric patients. This research aims to detect job apathy patterns from the behavior of employee groups at the University of Baghdad and the Iraqi Ministry of Higher Education and Scientific Research. The investigation presents an approach that uses data mining techniques to acquire new knowledge, and it differs from statistical studies in its support of the researchers' evolving needs. These techniques handle redundant or irrelevant attributes in order to discover interesting patterns. The principal issue is identifying several important and affective questions taken from …
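As one illustrative data mining technique for this kind of questionnaire data, a small decision-tree sketch is shown below; the attribute names and the job_apathy label are hypothetical and are not the study's actual questions or data.

```python
# Hedged sketch: a decision tree over questionnaire attributes as a simple pattern-discovery
# technique. All column names and values are invented placeholders.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

data = pd.DataFrame({
    "workload_score":    [4, 2, 5, 1, 3, 5, 2, 4],
    "recognition_score": [1, 4, 2, 5, 3, 1, 4, 2],
    "years_of_service":  [10, 3, 15, 2, 7, 20, 4, 12],
    "job_apathy":        [1, 0, 1, 0, 0, 1, 0, 1],   # hypothetical label
})
X, y = data.drop(columns="job_apathy"), data["job_apathy"]
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=list(X.columns)))   # readable if-then pattern
```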
This study explores the challenges facing Artificial Intelligence (AI) systems in generating image captions, a task that requires effective integration of computer vision and natural language processing techniques. A comparative analysis is made between traditional approaches (such as retrieval-based methods and linguistic templates) and modern deep-learning approaches (such as encoder-decoder models, attention mechanisms, and transformers). Theoretical results show that modern models perform better in terms of accuracy and the ability to generate more complex descriptions, while traditional methods excel in speed and simplicity. The paper proposes a hybrid framework that combines the advantages of both approaches, in which conventional methods produce …
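A minimal sketch of the encoder-decoder idea mentioned above is given below; the vocabulary size, feature dimension, and the plain LSTM decoder (no attention) are placeholder choices rather than the paper's configuration.

```python
# Hedged sketch of encoder-decoder captioning: an image feature vector initializes an
# LSTM that emits caption tokens under teacher forcing. Dimensions are illustrative.
import torch
import torch.nn as nn

class CaptionDecoder(nn.Module):
    def __init__(self, vocab_size=1000, feat_dim=512, embed_dim=256, hidden_dim=256):
        super().__init__()
        self.init_h = nn.Linear(feat_dim, hidden_dim)   # image features -> initial state
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, image_feats, captions):
        h0 = torch.tanh(self.init_h(image_feats)).unsqueeze(0)   # (1, batch, hidden)
        c0 = torch.zeros_like(h0)
        hidden_states, _ = self.lstm(self.embed(captions), (h0, c0))
        return self.out(hidden_states)                           # token logits per step

decoder = CaptionDecoder()
feats = torch.randn(2, 512)                # stand-in for CNN-encoder output for 2 images
tokens = torch.randint(0, 1000, (2, 12))   # teacher-forced caption tokens
print(decoder(feats, tokens).shape)        # torch.Size([2, 12, 1000])
```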
Rutting is a crucial concern for the stability and long-term performance of asphalt concrete pavements, negatively affecting vehicle drivers' comfort and safety. This research aims to evaluate the permanent deformation of pavement under different traffic and environmental conditions using an Artificial Neural Network (ANN) prediction model. The model was built on the outcomes of an experimental uniaxial repeated loading test of 306 cylindrical specimens. Twelve independent variables representing the materials' properties, mix design parameters, loading settings, and environmental conditions were implemented in the model, resulting in a total of 3214 data points. The network achieved high prediction accuracy, with an R …
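A hedged sketch of an ANN regression model with twelve inputs is shown below; the synthetic data, layer sizes, and train/test split are illustrative, since the paper's actual variables and architecture are not reproduced in this excerpt.

```python
# Hedged sketch: a small feed-forward ANN regressor over 12 input variables.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((3214, 12))                                   # placeholder inputs
y = X @ rng.random(12) + 0.1 * rng.standard_normal(3214)     # synthetic rut-depth target

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0))
model.fit(X_train, y_train)
print("R^2 on held-out data:", round(model.score(X_test, y_test), 3))
```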
Radio drama is considered one of the arts that emerged long after the theater. Initially, it followed the broad framework of theatrical work, when radio broadcast performances from large theaters. This beginning encouraged many radio specialists to link plays with radio and create a new and distinctive type of art. Thus, radio drama took its first steps, encompassing plays, short and long drama series, and other types of radio art. For these reasons, the researcher was motivated to study the directing techniques used to treat the radio drama script (the play Khata'a as a sample).
The first chapter deals with the …
Machine learning offers significant advantages for many problems in the oil and gas industry, especially for resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. This study presents a clarified workflow methodology alongside comprehensive models. Its purpose is to provide a more robust technique for predicting permeability; previous studies on the Bazirgan field have attempted this, but their estimates have been vague, and the methods they employ are obsolete and do not adequately account for the real, rigid formation conditions involved in the permeability computation. To …
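A hedged sketch of a log-to-permeability regression workflow of this general kind is shown below; the log curves (GR, NPHI, RHOB, RT), the Random Forest model, and the synthetic data are assumptions, not the study's reported method for the Bazirgan field.

```python
# Hedged sketch: predicting log10(permeability) from a few common well-log curves.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 500
logs = np.column_stack([
    rng.uniform(20, 150, n),     # GR   (gamma ray, API)
    rng.uniform(0.05, 0.35, n),  # NPHI (neutron porosity, v/v)
    rng.uniform(2.2, 2.8, n),    # RHOB (bulk density, g/cc)
    rng.uniform(1, 200, n),      # RT   (deep resistivity, ohm.m)
])
log_perm = 30 * logs[:, 1] - 0.01 * logs[:, 0] + rng.normal(0, 0.3, n)  # synthetic target

X_train, X_test, y_train, y_test = train_test_split(logs, log_perm, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print("R^2:", round(model.score(X_test, y_test), 3))   # model predicts log10(permeability)
```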