Permeability data are of major importance in all reservoir simulation studies. Their importance increases in mature oil and gas fields, where certain improved-recovery methods are particularly sensitive to permeability. However, the industry holds a huge store of air permeability measurements against only a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation to convert the air permeability data conventionally measured during laboratory core analysis into liquid permeability. The correlation provides a feasible estimate in cases of data loss, in poorly consolidated formations, or when old cores are no longer available for liquid permeability measurements. Moreover, the conversion formula allows better use of the large amount of legacy air permeability data obtained through routine core analysis in further reservoir and geological modeling studies.
The comparative analysis shows that the suggested conversion formula gives highly accurate and more consistent results over a wide range of permeability values.
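The abstract does not reproduce the correlation itself; as a hedged illustration of how such an air-to-liquid conversion is typically applied, the sketch below uses a Klinkenberg-style gas-slippage correction with Jones-Owens-type placeholder constants. The constants `c` and `n` and the fixed-point iteration are assumptions for illustration, not the paper's formula.

```python
def air_to_liquid_permeability(k_air, p_mean, c=6.9, n=-0.36, tol=1e-8):
    """Estimate liquid (slippage-corrected) permeability from air permeability.

    k_air  : measured air permeability, mD
    p_mean : mean flowing pressure, psi
    c, n   : slippage-factor correlation constants (b = c * k_liq**n);
             Jones-Owens-type placeholder values, not the paper's correlation.
    """
    k_liq = k_air  # initial guess: no slippage correction
    for _ in range(100):
        b = c * k_liq ** n                   # gas-slippage factor, psi
        k_new = k_air / (1.0 + b / p_mean)   # Klinkenberg relation
        if abs(k_new - k_liq) < tol:
            break
        k_liq = k_new
    return k_liq

# Example: 150 mD air permeability measured at a 45 psi mean pressure.
print(air_to_liquid_permeability(150.0, 45.0))
```

Because the slippage factor itself depends on the liquid permeability, the correction is solved iteratively rather than in one step.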
Diabetes mellitus type 2 (T2DM) is a chronic and progressive condition that affects people all around the world. The risk of complications increases with age if the disease is not managed properly. Diabetic neuropathy is caused by excessive blood glucose and lipid levels, which result in nerve damage. Apelin is a peptide hormone found in several human organs, including the central nervous system and adipose tissue. The aim of this study is to estimate apelin levels in Iraqi patients with type 2 diabetes and diabetic peripheral neuropathy (DPN) and to show the extent of peripheral nerve damage. The current study included 120 participants: 40 patients with diabetes mellitus, 40 patients with diabetic peripheral neuropathy, and 40 healthy controls.
Fatty Acid Methyl Ester (FAME) produced from biomass offers several advantages, such as renewability and sustainability. Typical FAME production leaves various impurities, including alcohol, soap, glycerol, and spent catalyst; the most challenging part of FAME production is therefore the purification step. In this work, a novel application of a bulk liquid membrane (BLM), developed from conventional solvent extraction methods, was investigated for the removal of glycerol from FAME. The extraction and stripping processes are combined into a single system, allowing simultaneous solvent recovery, whereby a low-cost quaternary ammonium salt-glycerol-based deep eutectic solvent (DES) is used as the membrane phase.
Pollutant generation depends strongly on the firing temperature and the reaction rates of the gaseous reactants in the gas turbine combustion chamber. An experimental study was conducted on a two-shaft T200D micro gas turbine engine to evaluate the impact of injecting ethanol directly into the compressor inlet air on exhaust emissions. The study was carried out under constant-speed and constant-load engine tests. Overall, the results showed that when ethanol was added at a concentration of 20% by volume of the fuel flow, NOx emissions were reduced by half, while CO and UHC emissions were almost doubled relative to their levels when burning conventional LPG fuel alone.
It has increasingly been recognised that future developments in geospatial data handling will centre on geospatial data on the web: Volunteered Geographic Information (VGI). The evaluation of VGI data quality, including positional and shape similarity, has become a recurrent subject in the scientific literature over the last ten years. The OpenStreetMap (OSM) project is the most popular of the leading VGI platforms: an online geospatial database that produces and supplies free, editable geospatial datasets worldwide. The goal of this paper is to present a comprehensive overview of the quality assurance of OSM data. In addition, the credibility of open-source geospatial data is discussed, highlighting ...
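As one concrete example of the positional-quality measures recurring in this literature (illustrative only, not a method taken from this paper), the horizontal root-mean-square error between matched OSM features and reference survey points can be computed as below; the matched coordinate arrays `osm_xy` and `ref_xy` are assumed inputs prepared beforehand.

```python
import numpy as np

def positional_rmse(osm_xy, ref_xy):
    """Horizontal RMSE between matched OSM points and reference points.

    osm_xy, ref_xy : (n, 2) arrays of projected coordinates in metres;
                     row i of each array is assumed to be the same feature.
    """
    osm_xy = np.asarray(osm_xy, dtype=float)
    ref_xy = np.asarray(ref_xy, dtype=float)
    offsets = np.linalg.norm(osm_xy - ref_xy, axis=1)  # per-feature displacement
    return float(np.sqrt(np.mean(offsets ** 2)))

# Example with three hypothetical matched point pairs.
print(positional_rmse([[0, 0], [10, 5], [20, 8]],
                      [[1, 1], [11, 4], [19, 9]]))
```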
Purpose – Cloud computing (CC) and its services have enabled the information centers of organizations to adapt their information technology infrastructure, making it more suitable for developing flexible information systems that respond to the informational and knowledge needs of their users. In this context, cloud data governance has become more complex and dynamic, requiring an in-depth understanding of the data management strategy at these centers in terms of organizational structure and regulations, people, technology, processes, and roles and responsibilities. Therefore, our paper discusses these dimensions as challenges facing information centers with respect to their data governance and the impact ...
Machine learning methods, one of the most promising branches of artificial intelligence, are of great importance in all sciences, such as engineering and medicine, and have recently become widely involved in statistics and its various branches, including survival analysis, where they can be considered a new approach to estimating survival alongside the parametric, nonparametric, and semi-parametric methods widely used in statistical research. In this paper, survival estimation based on medical images of breast cancer patients receiving treatment in Iraqi hospitals is discussed. Three feature-extraction algorithms are explained: the first is principal component analysis ...
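Since the abstract names principal component analysis as the first feature extractor, a minimal sketch of that step is given below; the image size, sample count, and number of retained components are placeholder assumptions, not details from the paper.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical input: 120 grayscale scans, already resized to 64x64 pixels.
images = np.random.rand(120, 64, 64)       # placeholder for real patient images
flat = images.reshape(len(images), -1)     # one row of pixel values per image

pca = PCA(n_components=20)                 # component count is an assumption
features = pca.fit_transform(flat)         # (120, 20) feature matrix

print(features.shape, pca.explained_variance_ratio_.sum())
```

The resulting low-dimensional feature matrix is what a downstream survival model would then be fitted on.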
Ground-based active optical sensors (GBAOS) have been used successfully in agriculture to predict crop yield potential (YP) early in the season and to adjust N rates for optimal crop yield. However, the models were found to be weak or inconsistent under environmental variation, especially in rainfall. The objectives of the study were to evaluate whether GBAOS could predict YP across multiple locations, soil types, cultivation systems, and rainfall regimes. The study was carried out from 2011 to 2013 on corn (Zea mays L.) in North Dakota, and in 2017 on potatoes in Maine. Six N rates were used on 50 sites in North Dakota and 12 N rates on two sites, one dryland and one irrigated, in Maine. The two active GBAOS used for this study were the GreenSeeker and Holl ...
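The abstract does not state the form of the YP model; a common choice in the GBAOS literature, assumed here purely for illustration, regresses yield on an in-season estimate of yield (INSEY = NDVI divided by days from planting to sensing) with an exponential fit. All numbers below are placeholder calibration data.

```python
import numpy as np

# Hypothetical calibration data: NDVI readings, days from planting, observed yield.
ndvi = np.array([0.45, 0.52, 0.61, 0.70, 0.78])
days = np.array([40, 42, 45, 44, 46])
yield_obs = np.array([5.1, 6.0, 7.4, 8.9, 10.2])   # Mg/ha, placeholder values

insey = ndvi / days                                 # in-season estimate of yield
b, ln_a = np.polyfit(insey, np.log(yield_obs), 1)   # fit ln(YP) = ln(a) + b * INSEY

def yield_potential(ndvi_reading, days_from_planting):
    """Predict yield potential from a GBAOS NDVI reading (exponential INSEY model)."""
    return np.exp(ln_a) * np.exp(b * ndvi_reading / days_from_planting)

print(yield_potential(0.65, 43))
```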
In data mining, classification is a form of data analysis that can be used to extract models describing important data classes. Two well-known algorithms used in data mining classification are the Backpropagation Neural Network (BNN) and Naïve Bayesian (NB). This paper investigates the performance of these two classification methods on the Car Evaluation dataset. Models were built for both algorithms and the results were compared. Our experimental results indicated that the BNN classifier yields higher accuracy than the NB classifier, but it is less efficient because it is time-consuming and difficult to analyze due to its black-box implementation.
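A minimal sketch of such a comparison, assuming the UCI Car Evaluation file `car.data` is available locally and using ordinal encoding, a small backpropagation-trained network, and a categorical Naïve Bayes model (the paper's exact preprocessing and network settings are not given):

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import OrdinalEncoder
from sklearn.naive_bayes import CategoricalNB
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

cols = ["buying", "maint", "doors", "persons", "lug_boot", "safety", "class"]
df = pd.read_csv("car.data", names=cols)           # UCI Car Evaluation file, assumed local

X = OrdinalEncoder().fit_transform(df[cols[:-1]])  # encode the six categorical attributes
y = df["class"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Hidden-layer size and iteration budget are assumptions, not the paper's settings.
for name, clf in [("BNN", MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000)),
                  ("NB", CategoricalNB())]:
    clf.fit(X_tr, y_tr)
    print(name, accuracy_score(y_te, clf.predict(X_te)))
```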
Reliable data transfer and energy efficiency are essential considerations for network performance in resource-constrained underwater environments. One efficient approach to data routing in underwater wireless sensor networks (UWSNs) is clustering, in which data packets are transferred from sensor nodes to a cluster head (CH). Data packets are then forwarded to a sink node in a single hop or over multiple hops, which can increase the energy depletion of the CH compared with other nodes. While several mechanisms have been proposed for cluster formation and CH selection to ensure efficient delivery of data packets, less attention has been given to massive data co ...
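The abstract leaves the CH-selection mechanism unspecified; the sketch below is an illustrative energy-aware rule, not the paper's scheme, that picks within each cluster the node with the best trade-off between residual energy and proximity to the sink. The node records, positions, and the weight `ALPHA` are hypothetical.

```python
import math

# Hypothetical node records: id, residual energy (J), and (x, y, depth) position.
nodes = [
    {"id": 1, "energy": 4.8, "pos": (10.0, 20.0, 80.0)},
    {"id": 2, "energy": 3.1, "pos": (12.0, 18.0, 60.0)},
    {"id": 3, "energy": 4.2, "pos": (15.0, 25.0, 40.0)},
]
SINK = (0.0, 0.0, 0.0)
ALPHA = 0.7  # weight on residual energy vs. proximity to the sink (assumed value)

def select_cluster_head(cluster):
    """Pick the CH with the best energy/proximity trade-off within one cluster."""
    e_max = max(n["energy"] for n in cluster)
    d_max = max(math.dist(n["pos"], SINK) for n in cluster)
    def score(n):
        # Normalized residual energy, plus normalized closeness to the sink.
        return (ALPHA * n["energy"] / e_max
                + (1 - ALPHA) * (1 - math.dist(n["pos"], SINK) / d_max))
    return max(cluster, key=score)

print(select_cluster_head(nodes)["id"])
```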