Permeability estimation is a vital step in reservoir engineering because of its effect on reservoir characterization, perforation planning, and the economic efficiency of reservoirs. Core data and well-log data are the main sources for measuring and calculating permeability, respectively. Multiple methods exist for predicting permeability, including classic, empirical, and geostatistical methods. In this research, two statistical approaches were applied and compared for permeability prediction, multiple linear regression and random forest, using the (M) reservoir interval in the (BH) Oil Field in the northern part of Iraq. The dataset was split into training and testing subsets in order to validate the accuracy and performance of the algorithms. The random forest algorithm was the more accurate method, yielding a lower Root Mean Square Prediction Error (RMSPE) and a higher adjusted R-squared than multiple linear regression on both the training and testing subsets. The random forest algorithm is therefore the more reliable choice for predicting permeability in non-cored intervals and for distributing it in the geological model.
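As a minimal sketch of this comparison, assuming scikit-learn and synthetic stand-in data (the study's actual well-log inputs are not listed here), the snippet below fits both models and reports RMSPE and adjusted R-squared on a held-out subset:

```python
# Minimal sketch: multiple linear regression vs. random forest for
# permeability prediction. Features and data are illustrative stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, p = 300, 4  # samples, predictors (e.g. porosity, gamma ray, density, neutron)
X = rng.normal(size=(n, p))
y = 2.0 * X[:, 0] - X[:, 1] + 0.5 * X[:, 2] ** 2 + rng.normal(scale=0.3, size=n)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

def adjusted_r2(y_true, y_pred, n_obs, n_pred):
    r2 = r2_score(y_true, y_pred)
    return 1 - (1 - r2) * (n_obs - 1) / (n_obs - n_pred - 1)

for name, model in [("MLR", LinearRegression()),
                    ("Random forest", RandomForestRegressor(random_state=0))]:
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    rmspe = float(np.sqrt(mean_squared_error(y_te, pred)))
    print(f"{name}: RMSPE={rmspe:.3f}, adj. R2={adjusted_r2(y_te, pred, len(y_te), p):.3f}")
```

The nonlinear term in the synthetic target is what a random forest can capture and a linear model cannot, mirroring the ranking reported above.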
Lost circulation, the loss of drilling fluid into the formation, is one of the most serious problems in the oil and gas industry and has existed since the industry began; it causes many problems during drilling and may force the well to be shut in and the drilling process to stop. Drilling muds are relatively expensive, especially oil-based muds and muds containing special additives, so wasting and losing them is not economically acceptable. Treating drilling fluid losses is also costly, both in the rig time it wastes and in the price of the materials used in the treatment, such as heavy materials, cement, and others. The best way to deal with drilling fluid losses …
Enhanced oil recovery is applied in many mature oil reservoirs to increase the oil recovery factor, and surfactant flooding has recently regained interest. Surfactant flooding is the injection of surfactants (and co-surfactants) into the reservoir to create microemulsions at the interface between crude oil and water, achieving very low interfacial tension and thereby helping to mobilize the trapped oil.
In this study, a high-pressure flooding system was manufactured and described. The flooding processes involved oil, water, and surfactants. Fifteen core holders were prepared in the first stage of the experiment and filled with washed sand grains of 80–500 µm, which were then packed to obtain sand packs …
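The low-interfacial-tension mechanism mentioned above is conventionally quantified by the capillary number; the standard form below is added for context and is not quoted from the abstract itself:

$$N_c = \frac{\mu\, v}{\sigma}$$

where $\mu$ is the displacing-fluid viscosity, $v$ the Darcy velocity, and $\sigma$ the oil–water interfacial tension. Reducing $\sigma$ by several orders of magnitude raises $N_c$ correspondingly, which is what mobilizes residual oil held in place by capillary forces.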
Every project management system aims to complete the project within its defined objectives: budget, time, and quality. Achieving the project within the defined deadline requires careful scheduling prepared early. Given the nature of repetitive construction projects, time contingency and allowance for project uncertainty are necessary for accurate scheduling, which should be integrated and flexible enough to accommodate changes without adversely affecting the project's total completion time. Repetitive planning and scheduling methods are more effective and essential; however, they need continuous development because of the evolution of execution methods …
Background: Toxoplasmosis is a very common infection caused by the obligate intracellular protozoan parasite Toxoplasma gondii, which is widely distributed around the world. Toxoplasma gondii can be transmitted vertically to the fetus during pregnancy and may cause a wide range of clinical manifestations in the offspring.
Objective: To determine the seroprevalence of immunoglobulin G (IgG) and immunoglobulin M (IgM) antibodies to Toxoplasma gondii among pregnant women and to identify the associated risk factors.
Type of the study: A cross-sectional study.
Methods: A total of 110 blood samples from pregnant women were collected from …
Deep Learning Techniques for Skull Stripping of Brain MR Images
General Background: Deep image matting is a fundamental task in computer vision, enabling precise foreground extraction from complex backgrounds, with applications in augmented reality, computer graphics, and video processing. Specific Background: Despite advancements in deep learning-based methods, preserving fine details such as hair and transparency remains a challenge. Knowledge Gap: Existing approaches struggle with accuracy and efficiency, necessitating novel techniques to enhance matting precision. Aims: This study integrates deep learning with fusion techniques to improve alpha matte estimation, proposing a lightweight U-Net model incorporating color-space fusion and preprocessing. Results: Experiments using the Adobe Composition-1k …
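As an illustration of a lightweight U-Net with color-space fusion of the kind described above, assuming PyTorch and HSV as the second color space (the paper's actual channel counts, depth, and fused spaces are not given here):

```python
# Sketch of a small U-Net that takes RGB fused with a second color space
# and predicts a single-channel alpha matte. Architecture details are
# illustrative assumptions, not the paper's published configuration.
import torch
import torch.nn as nn

def block(cin, cout):
    return nn.Sequential(
        nn.Conv2d(cin, cout, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(cout, cout, 3, padding=1), nn.ReLU(inplace=True))

class LightMattingUNet(nn.Module):
    def __init__(self, in_ch=6):  # 3 RGB + 3 fused color-space channels
        super().__init__()
        self.enc1, self.enc2 = block(in_ch, 32), block(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.bott = block(64, 128)
        self.up2, self.dec2 = nn.ConvTranspose2d(128, 64, 2, stride=2), block(128, 64)
        self.up1, self.dec1 = nn.ConvTranspose2d(64, 32, 2, stride=2), block(64, 32)
        self.head = nn.Conv2d(32, 1, 1)  # single-channel alpha matte

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bott(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return torch.sigmoid(self.head(d1))  # alpha in [0, 1]

# Color-space fusion: concatenate the two representations channel-wise.
rgb = torch.rand(1, 3, 256, 256)
hsv = torch.rand(1, 3, 256, 256)  # stand-in for a converted color space
alpha = LightMattingUNet()(torch.cat([rgb, hsv], dim=1))
print(alpha.shape)  # torch.Size([1, 1, 256, 256])
```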
One of the leading causes of death worldwide is lung cancer, considered among the most lethal diseases. Early detection and diagnosis are essential for lung cancer, enabling effective therapy and better outcomes for patients. In recent years, deep learning algorithms have shown crucial promise for medical imaging analysis, especially lung cancer identification. This paper compares a number of different deep learning models using computed tomography image datasets against traditional convolutional neural networks and SqueezeNet models using X-ray data for the automated diagnosis of lung cancer. Although the simple details …
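As a hedged sketch of a SqueezeNet baseline like the one named above, assuming torchvision and a two-class (cancer/normal) X-ray task; the paper's exact setup is not specified here:

```python
# Adapt torchvision's SqueezeNet to a two-class output by replacing its
# final 1x1 classification convolution. Class count and input wiring are
# assumptions for illustration.
import torch
import torch.nn as nn
from torchvision import models

model = models.squeezenet1_1(weights=None)  # or load pretrained ImageNet weights
model.classifier[1] = nn.Conv2d(512, 2, kernel_size=1)  # 2 classes
model.num_classes = 2

x = torch.rand(1, 3, 224, 224)  # grayscale X-ray replicated to 3 channels
logits = model(x)
print(logits.shape)  # torch.Size([1, 2])
```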
Optical burst switching (OBS) networks are a new generation of optical communication technology. In an OBS network, an edge node first sends a control packet, called a burst header packet (BHP), which reserves the necessary resources for the upcoming data burst (DB). Once the reservation is complete, the DB travels to its destination through the reserved path. A notable attack on OBS networks is the BHP flooding attack, in which an edge node sends BHPs to reserve resources but never sends the associated DBs. As a result, the reserved resources are wasted, and when this happens on a sufficiently large scale, a denial of service (DoS) may take place. In this study, we propose a semi-supervised machine learning approach using the k-means algorithm …
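A minimal sketch of such a semi-supervised k-means step, assuming scikit-learn and two illustrative traffic features (the study's actual feature set is not listed here): cluster the nodes, then let a few labeled samples assign a label to each whole cluster.

```python
# Semi-supervised BHP-flooding detection sketch: k-means clusters edge-node
# traffic features; a handful of labeled nodes label entire clusters.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Toy features per edge node: [BHP send rate, fraction of BHPs with no DB]
benign = rng.normal([10.0, 0.05], 0.5, size=(100, 2))
attack = rng.normal([50.0, 0.90], 0.5, size=(20, 2))
X = np.vstack([benign, attack])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

labeled_idx = np.array([0, 1, 100, 101])  # nodes with known behavior
labeled_y = np.array([0, 0, 1, 1])        # 0 = behaving, 1 = flooding
cluster_label = {}
for c in range(2):
    mask = km.labels_[labeled_idx] == c
    cluster_label[c] = int(round(labeled_y[mask].mean())) if mask.any() else 0

pred = np.array([cluster_label[c] for c in km.labels_])
print("nodes flagged as flooding:", int(pred.sum()))
```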
Abstract:
The nursery is an educational institution involved in developing social concepts: it provides the knowledge and experiences that help the child adjust to the environment through arranged words, movements, and concrete objects, which are conveyed to the children so that they grasp these concepts.
Social concepts are a set of words and statements of a social nature that children learn through the family or the nursery, and they shape the children's style of independence and of helping others.
The researcher adopted this subject because of the scarcity of studies in the field of social concepts in the nursery.
The aims of the study are as follows:
1- Building tools for …
Total quality management is considered a modern management concept that has achieved success in all fields of the industrial and service sectors in advanced countries, one of which is insurance. This concept aims at continually improving and developing the performance of the insurance service. It is the gateway to radical change in the organizational culture inside a company, transforming it from the traditional management style to a modern style that achieves a high quality standard of insurance service. As a result, many insurance companies have moved toward applying the principles of total quality management. This study aims at raising the standard of the performance of the Iraqi Insurance Company …