Predicting permeability is a cornerstone of petroleum reservoir engineering, playing a vital role in optimizing hydrocarbon recovery strategies. This paper explores the application of neural networks to predicting permeability in oil reservoirs, underscoring their growing importance in addressing the shortcomings of traditional prediction methods. Conventional techniques often struggle with the complexities of subsurface conditions, making innovative approaches essential. Neural networks, with their ability to uncover complex patterns within large datasets, emerge as a powerful alternative. The Quanti-Elan model was used in this study to combine several well logs for the estimation of mineral volumes, porosity and water saturation. This model goes beyond simply predicting lithology to provide a detailed quantification of primary minerals (e.g., calcite and dolomite) as well as secondary ones (e.g., shale and anhydrite). The results show significant lithological contrasts, with high-porosity layers correlating with possible reservoir zones. The richness of Quanti-Elan's interpretations goes beyond what log analysis alone can reveal. The methodology is described in depth, covering the approaches used to train the neural networks (e.g., data processing and network architecture). A case study is presented in which neural network predictions of permeability in a particular oil well are compared with core measurements. The results show close agreement between predicted and actual values, further emphasizing the power of this approach. An extrapolated neural network model using lithology (dolomite and limestone) and porosity as inputs confirms the close match between predicted and observed carbonate reservoir permeability. This case study demonstrates the ability of neural networks to accurately characterize and predict permeability in complex carbonate systems. The results therefore confirm that neural networks are a reliable and transformative tool for oil reservoir management, one that can make future predictive methodologies, and hence hydrocarbon recovery operations, more efficient.
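As a rough illustration of the kind of model this abstract describes, the sketch below trains a small feed-forward network on porosity and lithology fractions to predict log-permeability. It uses scikit-learn's MLPRegressor; the synthetic data, feature construction and hyperparameters are assumptions for illustration, not the study's actual configuration.

```python
# Minimal sketch of a permeability-prediction network of the kind described
# above. The synthetic "core" data and the hyperparameters are illustrative,
# not the study's actual configuration.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
porosity = rng.uniform(0.05, 0.30, n)        # fractional porosity
v_dolomite = rng.uniform(0.0, 1.0, n)        # dolomite volume fraction
v_limestone = 1.0 - v_dolomite               # limestone volume fraction

# Synthetic core permeability (mD): log-linear in porosity, lithology-dependent.
log_k = -2.0 + 18.0 * porosity + 0.5 * v_dolomite + rng.normal(0, 0.2, n)

X = np.column_stack([porosity, v_dolomite, v_limestone])
y = log_k                                    # train on log-permeability

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_train)

model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
model.fit(scaler.transform(X_train), y_train)
print("R^2 on held-out data:", model.score(scaler.transform(X_test), y_test))
```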
The synthesis and structural investigation of three new Mannich bases are reported. The compounds were prepared via a multicomponent one-pot reaction using CaCl2 as a catalyst. The reaction of benzaldehyde, m-bromoaniline and cyclohexanone or 4-methylcyclohexanone gave L1 and L3, respectively, while L2 was obtained by mixing benzaldehyde, o-bromoaniline and cyclohexanone. The isolated compounds were characterised using a range of analytical and spectroscopic techniques, including NMR (1H and 13C), ESMS, FTIR, electronic spectroscopy, microanalysis and melting points. The NMR data for L1 and L2 indicated the presence of a single isomer in solution on the NMR timescale.
Given a matrix, we consider the Consecutive Ones Submatrix (C1S) problem, which aims to find a permutation of the columns that maximizes the number of columns that together have only one block of consecutive ones in each row. A heuristic approach is suggested for solving the problem. We also consider the related Consecutive Blocks Minimization (CBM) problem, and propose a new procedure to improve the column-insertion approach. Real-world and random matrices from the set covering problem are then evaluated and the computational results highlighted.
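To make the objective concrete, the sketch below counts the blocks of consecutive ones per row under a given column ordering, which is the quantity CBM minimizes; the brute-force search is a toy stand-in for the paper's column-insertion heuristic and is only viable on tiny instances (the matrix shown is made up).

```python
# Illustrative objective for the problems above: given a 0/1 matrix and a
# column ordering, count the maximal runs ("blocks") of consecutive ones in
# each row. CBM seeks the permutation minimizing the total count.
from itertools import permutations
import numpy as np

def total_blocks(matrix: np.ndarray, order: tuple) -> int:
    """Total number of maximal runs of ones over all rows under `order`."""
    m = matrix[:, list(order)]
    # A block starts at column 0, or wherever a 1 follows a 0.
    starts = (m[:, 1:] == 1) & (m[:, :-1] == 0)
    return int(m[:, 0].sum() + starts.sum())

A = np.array([[1, 0, 1],
              [0, 1, 1]])

best = min(permutations(range(A.shape[1])), key=lambda p: total_blocks(A, p))
print(best, total_blocks(A, best))   # ordering (0, 2, 1) yields 2 blocks
```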
This paper presents a study of the application of gas lift (GL) to improve oil production in a Middle East field that has been experiencing a rapid decline in production due to falling reservoir pressure. GL is a widely used artificial lift technique that increases oil production by reducing the hydrostatic pressure in the wellbore. The study used a full-field model to simulate the effects of GL on production, run under different production scenarios with varying water cut and reservoir pressure values. The results showed that GL can significantly increase oil production under all scenarios. The study also found that most wells in the field will soon be closed due to high water cuts.
This paper deals with the prediction of spatially random data involving two properties, the first called the primary variable and the second the secondary variable. The technique used for prediction with this type of data is co-kriging. The method is typically applied when the primary variable to be predicted has been measured at only a few locations (because of the cost or difficulty of obtaining it), while the secondary variable has been measured at many locations and is highly correlated with the primary variable.
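A minimal simple co-kriging sketch under an assumed linear model of coregionalization (exponential covariances with a shared range) is shown below; the 1-D sample locations, sills, range and means are all made-up illustration, not values from the paper.

```python
# Minimal simple co-kriging sketch: estimate a sparsely sampled primary
# variable at a new location using a densely sampled, correlated secondary
# variable. The covariance model and every number here are assumptions.
import numpy as np

def cov(h, sill, a):
    """Isotropic exponential covariance with practical range `a`."""
    return sill * np.exp(-3.0 * h / a)

x1 = np.array([0.0, 4.0, 9.0])            # primary: few, costly measurements
z1 = np.array([2.1, 2.9, 2.4])
x2 = np.array([0.0, 2.0, 4.0, 6.0, 9.0])  # secondary: abundant measurements
z2 = np.array([5.0, 5.6, 6.1, 5.8, 5.2])
n1, n2 = len(x1), len(x2)
m1, m2 = z1.mean(), z2.mean()             # means assumed known (simple co-kriging)

s11, s22, s12, a = 1.0, 1.5, 0.9, 8.0     # sills, cross-sill, shared range
x0 = 5.0                                  # prediction location

xs = np.concatenate([x1, x2])
sills = np.block([[np.full((n1, n1), s11), np.full((n1, n2), s12)],
                  [np.full((n2, n1), s12), np.full((n2, n2), s22)]])
C = cov(np.abs(xs[:, None] - xs[None, :]), sills, a)   # joint data covariance
c0 = cov(np.abs(xs - x0),
         np.concatenate([np.full(n1, s11), np.full(n2, s12)]), a)

w = np.linalg.solve(C, c0)                # co-kriging weights
z1_hat = m1 + w @ np.concatenate([z1 - m1, z2 - m2])
print("co-kriged estimate of the primary variable at x0:", round(z1_hat, 3))
```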
Gas lift is one of the artificial lift techniques frequently implemented to raise oil production. Conventionally, oil wells produce using the energy of reservoir pressure and solution gas, which declines with continued production, so many oil wells eventually become unable to lift oil to the surface. Continued production then requires gas lift, which decreases the average fluid density in the tubing by injecting gas through the annulus into the tubing. This paper aims to achieve maximum oil production from a giant Iraqi oil field at the optimum injected gas rate. The field is located in the south of Iraq.
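The mechanism described, a lighter mixed fluid column exerting less hydrostatic back-pressure on the formation, can be illustrated with a back-of-the-envelope calculation; the depth, densities and gas fractions below are assumed values, not field data.

```python
# Illustrative calculation of the gas-lift mechanism described above:
# injecting gas lowers the average density of the tubing fluid column,
# reducing the hydrostatic back-pressure. All values are assumed.
G = 9.81              # m/s^2
depth = 2500.0        # injection depth, m
rho_oil = 850.0       # kg/m^3
rho_gas = 80.0        # kg/m^3 at average tubing conditions (assumed)

def hydrostatic(rho_mix, h):
    """Hydrostatic pressure of a fluid column, Pa."""
    return rho_mix * G * h

for gas_fraction in (0.0, 0.2, 0.4):           # in-situ gas volume fraction
    rho_mix = (1 - gas_fraction) * rho_oil + gas_fraction * rho_gas
    p = hydrostatic(rho_mix, depth) / 1e5      # bar
    print(f"gas fraction {gas_fraction:.0%}: column pressure ~ {p:.0f} bar")
```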
The aim of the research is to use Data Envelopment Analysis (DEA) to evaluate the performance efficiency of eight branches of the General Tax Authority located in Baghdad: Karrada, Karkh Atraf, Karkh Center, Dora, Bayaa, Kadhimiya, New Baghdad and Rusafa. The inputs were defined as the numbers of non-accountable taxpayers across the categories of professions and commercial business, deduction, transfer of property ownership, real estate, and tenders. The outputs were determined by a checklist of nine dimensions for assessing how efficiently the investigated branches invest their available resources.
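For reference, DEA efficiency scores are obtained by solving one small linear program per branch; the sketch below implements the input-oriented CCR envelopment model with scipy, where the input/output matrix is made up rather than the study's taxpayer counts and checklist scores.

```python
# Minimal input-oriented CCR DEA sketch: one LP per decision-making unit
# (branch). The data below is made-up illustration only.
import numpy as np
from scipy.optimize import linprog

X = np.array([[20.0, 30.0, 25.0, 40.0],   # inputs, shape (n_inputs, n_units)
              [5.0,  8.0,  6.0,  9.0]])
Y = np.array([[60.0, 70.0, 80.0, 75.0]])  # outputs, shape (n_outputs, n_units)
n_units = X.shape[1]

for k in range(n_units):
    # Variables: z = [theta, lambda_1..lambda_n]; minimize theta.
    c = np.r_[1.0, np.zeros(n_units)]
    # Inputs:  sum_j lambda_j * x_ij - theta * x_ik <= 0
    A_in = np.c_[-X[:, [k]], X]
    # Outputs: -sum_j lambda_j * y_rj <= -y_rk
    A_out = np.c_[np.zeros((Y.shape[0], 1)), -Y]
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(X.shape[0]), -Y[:, k]],
                  bounds=[(None, None)] + [(0, None)] * n_units)
    print(f"branch {k}: efficiency = {res.x[0]:.3f}")
```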
This work aims to identify positive and negative association rules in the Apriori algorithm using cosine correlation analysis. Both the default and the modified Association Rule Mining algorithms were run against the mushroom database to compare their results. The experiments showed that the modified algorithm can generate negative association rules, and that adding cosine correlation analysis returns a smaller set of association rules than the default algorithm. Among the top ten association rules, the default and the modified Apriori algorithms also produce different rules.
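The cosine measure behind this filtering is sup(A∪B)/√(sup(A)·sup(B)); the sketch below computes it over a toy transaction set (the mushroom-style items and the 0.5 positive/negative split are assumptions for illustration).

```python
# Sketch of the cosine interest measure used to separate positive from
# negative association rules: cosine(A, B) = sup(A∪B) / sqrt(sup(A)·sup(B)).
# Values near 1 suggest positive association, near 0 negative.
from math import sqrt

transactions = [
    {"cap", "gill", "edible"},
    {"cap", "gill", "poisonous"},
    {"cap", "edible"},
    {"gill", "poisonous"},
    {"cap", "gill", "edible"},
]
n = len(transactions)

def support(itemset: frozenset) -> float:
    return sum(itemset <= t for t in transactions) / n

def cosine(a: frozenset, b: frozenset) -> float:
    return support(a | b) / sqrt(support(a) * support(b))

a, b = frozenset({"cap"}), frozenset({"edible"})
score = cosine(a, b)
kind = "positive" if score >= 0.5 else "negative"   # assumed threshold
print(f"cosine = {score:.2f} -> {kind} association")
```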
Data-Driven Requirements Engineering (DDRE) represents a vision for a shift from static, traditional requirements engineering methods to dynamic, data-driven, user-centered ones. Given the data now available, and the increasingly complex requirements of software whose functions must adapt to changing needs in order to retain users' trust, an approach embedded in a continuous software engineering process is needed. This need drives new challenges in the requirements engineering discipline. The problem addressed in this study was data discrepancies that hampered the needs elicitation process, so that the developed software ultimately could not meet user needs.
Data-centric techniques, such as data aggregation via a modified fuzzy clustering algorithm with a Voronoi diagram, called the modified Voronoi Fuzzy Clustering Algorithm (VFCA), are presented in this paper. In the modified algorithm, the sensed area is divided into a number of Voronoi cells by applying a Voronoi diagram, and these cells are clustered by the fuzzy C-means (FCM) method to reduce the transmission distance. An appropriate cluster head (CH) is then elected for each cluster using three parameters: energy, the distance between the CH and its neighboring sensors, and packet-loss values. Furthermore, data aggregation is employed in each CH to reduce the amount of data transmitted.
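A minimal fuzzy C-means iteration of the kind used in the clustering step is sketched below; the Voronoi partitioning and the energy/distance/packet-loss CH election are omitted, and the sensor positions, fuzzifier and cluster count are assumptions.

```python
# Minimal fuzzy C-means (FCM) sketch for the clustering step described above.
# Fuzzifier m, cluster count, and the random sensor positions are assumed.
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(0, 100, size=(60, 2))     # sensor coordinates, 100x100 field
c, m, iters = 4, 2.0, 50                  # clusters, fuzzifier, iterations

U = rng.random((c, X.shape[0]))
U /= U.sum(axis=0)                        # memberships sum to 1 per sensor

for _ in range(iters):
    W = U ** m
    centers = (W @ X) / W.sum(axis=1, keepdims=True)
    d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-12
    U = d ** (-2.0 / (m - 1))
    U /= U.sum(axis=0)                    # standard FCM membership update

labels = U.argmax(axis=0)                 # hard assignment for CH election
print(centers.round(1))
```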
Thyroid disease is a common condition affecting millions worldwide. Early diagnosis and treatment can help prevent more serious complications and improve long-term health outcomes. However, diagnosing thyroid disease can be challenging due to its variable symptoms and limited diagnostic tests. By processing enormous amounts of data and detecting trends that may not be immediately evident to human doctors, Machine Learning (ML) algorithms may be able to increase the accuracy with which thyroid disease is diagnosed. This study seeks to survey the most recent ML-based, data-driven developments and strategies for diagnosing thyroid disease, while considering the challenges associated with imbalanced thyroid disease data.
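One standard remedy for the imbalance issue the abstract highlights is class weighting combined with imbalance-aware metrics; the sketch below shows this with scikit-learn on a synthetic dataset that stands in for real thyroid records (an assumption).

```python
# Sketch of one common remedy for class imbalance: a class-weighted
# classifier evaluated with imbalance-aware metrics. The synthetic data
# stands in for real thyroid records.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import balanced_accuracy_score, f1_score
from sklearn.model_selection import train_test_split

# ~5% positive class, mimicking the rarity of disease in screening data.
X, y = make_classification(n_samples=2000, n_features=10, weights=[0.95],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

clf = RandomForestClassifier(class_weight="balanced", random_state=0)
clf.fit(X_tr, y_tr)
pred = clf.predict(X_te)
print("balanced accuracy:", balanced_accuracy_score(y_te, pred))
print("F1 (disease class):", f1_score(y_te, pred))
```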