Predicting permeability is a cornerstone of petroleum reservoir engineering, playing a vital role in optimizing hydrocarbon recovery strategies. This paper explores the application of neural networks to predict permeability in oil reservoirs, underscoring their growing importance in addressing traditional prediction challenges. Conventional techniques often struggle with the complexities of subsurface conditions, making innovative approaches essential. Neural networks, with their ability to uncover complex patterns within large datasets, emerge as a powerful alternative. The Quanti-Elan model was used in this study to combine several well logs for the estimation of mineral volumes, porosity and water saturation. This model goes beyond simply predicting lithology to provide a detailed quantification of primary minerals (e.g., calcite and dolomite) as well as secondary ones (e.g., shale and anhydrite). The results show a pronounced lithological contrast, with the high-porosity layers correlating with possible reservoir zones. The richness of the Quanti-Elan interpretation goes beyond what log analysis alone can reveal. The methodology is described in depth, covering the approaches used to train the neural networks (e.g., data processing and network architecture). In a case study, neural network predictions of permeability in a particular oil well are compared with core measurements. The results indicate exceptional closeness between predicted and actual values, further emphasizing the power of this approach. An extrapolated neural network model using lithology (dolomite and limestone) and porosity as inputs highlights the close match between predicted and observed carbonate reservoir permeability. This case study demonstrates the ability of neural networks to accurately characterize and predict permeability in complex carbonate systems. The results therefore confirm that neural networks are a reliable and transformative tool for oil reservoir management, one that can make future predictive methodologies, and thus hydrocarbon recovery operations, more efficient.
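As a concrete illustration of the kind of model the abstract describes, the sketch below trains a small feed-forward network that maps porosity and lithology volume fractions (dolomite and calcite/limestone) to log-permeability. The feature order, network size and synthetic data are assumptions made for illustration; they are not the study's well-log or core dataset.

```python
# Minimal sketch: a feed-forward neural network mapping porosity and
# lithology volume fractions to log-permeability. The synthetic trend
# below is a made-up stand-in for real well-log/core data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 500
phi   = rng.uniform(0.02, 0.30, n)           # porosity (fraction)
v_dol = rng.uniform(0.0, 1.0, n)             # dolomite volume fraction
v_cal = 1.0 - v_dol                          # calcite (limestone) fraction
# Hypothetical porosity-permeability trend used only to generate toy data.
log_k = -2.0 + 12.0 * phi + 0.5 * v_dol + rng.normal(0, 0.2, n)

X = np.column_stack([phi, v_dol, v_cal])
y = log_k                                    # predict log10(k) for stability

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=5000, random_state=0),
)
model.fit(X_tr, y_tr)
print("R^2 on held-out samples:", model.score(X_te, y_te))
```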
Understanding human facial expressions is a long-standing problem: it is a key component of understanding emotions and has broad applications in human-computer interaction (HCI). In this paper, we shed light on the use of a deep convolutional neural network (DCNN) for facial emotion recognition from videos, implemented with Google's TensorFlow machine-learning library. The work was applied to the ten emotions of the Amsterdam Dynamic Facial Expression Set-Bath Intensity Variations (ADFES-BIV) dataset and tested on two datasets.
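A minimal sketch of such a DCNN classifier, built with TensorFlow's Keras API, is given below. The 48×48 grayscale input, layer sizes and training call are illustrative assumptions and do not reproduce the architecture or ADFES-BIV preprocessing used in the paper.

```python
# Minimal sketch of a DCNN classifier for 10 facial-expression classes,
# built with the Keras API of TensorFlow. Input size and layer widths
# are illustrative assumptions, not the paper's exact architecture.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_emotion_cnn(num_classes: int = 10, input_shape=(48, 48, 1)):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu", padding="same"),
        layers.GlobalAveragePooling2D(),
        layers.Dropout(0.3),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_emotion_cnn()
model.summary()
# Training on labelled frames would then be:
# model.fit(frames, labels, epochs=..., validation_split=0.2)
```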
In information security, fingerprint verification is one of the most common recent approaches for verifying human identity through a distinctive pattern. The verification process works by comparing a pair of fingerprint templates and quantifying the similarity between them. Several research studies have used different techniques for the matching process, such as fuzzy vault and image-filtering approaches. Yet these approaches still suffer from imprecise articulation of the interesting patterns in the biometrics. Deep learning architectures such as the Convolutional Neural Network (CNN) have since been used extensively for image processing and object detection tasks, showing outstanding performance compared
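The sketch below illustrates one common CNN-based verification scheme: a shared encoder maps each fingerprint image to an embedding, and a pair is declared a match when the embedding distance falls below a threshold. The input size, encoder depth and threshold are illustrative assumptions rather than the approach evaluated in the paper.

```python
# Minimal sketch of CNN-based fingerprint verification: a shared encoder
# produces embeddings, and a pair matches when the distance between the
# embeddings is below a threshold. Sizes and threshold are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_encoder(input_shape=(96, 96, 1), embedding_dim=64):
    return models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(embedding_dim),
    ])

encoder = build_encoder()

def verify(img_a, img_b, threshold=10.0):
    """Return (match?, distance) for two preprocessed fingerprint images."""
    emb = encoder(tf.stack([img_a, img_b]), training=False)
    distance = tf.norm(emb[0] - emb[1])
    return bool(distance < threshold), float(distance)

# Example with random tensors standing in for preprocessed fingerprints;
# a real system would first train the encoder on matching/non-matching pairs.
a = tf.random.uniform((96, 96, 1))
b = tf.random.uniform((96, 96, 1))
print(verify(a, b))
```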
In the current digitalized world, cloud computing has become a feasible solution for the virtualization of computing resources. Although cloud computing offers many advantages for outsourcing an organization's information, strong security is its main concern. Identity-authentication theft has become a central issue in the protection of cloud data: intruders violate the security protocols and attack the data of organizations or users. Such disclosure of cloud data leaves cloud users feeling insecure while using the cloud platform, and traditional cryptographic techniques are not able to stop these kinds of attacks. The BB84 protocol is the first quantum cryptography protocol.
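For orientation, the following sketch simulates the key-sifting step of BB84 in an idealized, noise-free setting: Alice encodes random bits in random bases, Bob measures in his own random bases, and only positions where the bases agree are kept. It is a toy illustration of the protocol's core idea, not the full authenticated protocol with eavesdropping checks.

```python
# Minimal, idealized (noise-free, no eavesdropper) simulation of BB84
# key sifting: bases are compared publicly and only agreeing positions
# contribute to the shared key.
import secrets

def random_bits(n):
    return [secrets.randbelow(2) for _ in range(n)]

n = 32
alice_bits  = random_bits(n)        # raw key bits
alice_bases = random_bits(n)        # 0 = rectilinear, 1 = diagonal
bob_bases   = random_bits(n)

# With no eavesdropping or channel noise, Bob's measurement equals Alice's
# bit whenever the bases match; otherwise his outcome is random.
bob_results = [
    a_bit if a_basis == b_basis else secrets.randbelow(2)
    for a_bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases)
]

# Sifting: keep only the positions where the bases agree.
sifted_key = [bob_results[i] for i in range(n) if alice_bases[i] == bob_bases[i]]
print("sifted key length:", len(sifted_key))
print("sifted key:", sifted_key)
```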
The biggest problem for structural materials in a fusion reactor is the damage caused to them by the fusion-product neutrons; if this problem is overcome, an important milestone in fusion energy will have been passed. One of the key issues is that nuclei in the structural material, interacting with fusion neutrons, are transmuted into stable or radioactive nuclei via (n, x) reactions (x: alpha, proton, gamma, etc.). In particular, the concentration of helium gas in the structural material increases through deuteron-tritium (D-T) and (n, α) reactions, and this increase significantly changes the microstructure and properties of the structural materials.
This paper examines some significant performance measures (PMs) of a bulk-arrival queueing system with constant batch size b in which the arrival and service rates are fuzzy parameters. In a bulk-arrival queueing system, customers arrive in groups of constant size b and are then admitted to service individually. Treating the rates as fuzzy leads to a new tool, obtained with the aid of generating-function methods, that makes the corresponding classical bulk queueing model more convenient to use under an uncertain environment. The α-cut approach is applied together with Zadeh's extension principle (ZEP) to transform the fuzzy queues, described by triangular membership functions (Mem. Fs), into a family of conventional bulk queues.
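A minimal sketch of the α-cut idea is shown below for a bulk-arrival queue with constant batch size b and triangular fuzzy arrival and service rates. It assumes the textbook mean-system-size expression for an M[X]/M/1 queue with a constant batch, L = ρ(b+1)/(2(1−ρ)) with ρ = λb/μ, and uses illustrative fuzzy numbers rather than the paper's data.

```python
# Alpha-cut sketch for a bulk-arrival queue with constant batch size b and
# triangular fuzzy arrival/service rates. L = rho*(b+1)/(2*(1-rho)) with
# rho = lambda*b/mu is the standard constant-batch M[X]/M/1 result; the
# triangular numbers below are illustrative, not the paper's values.
def tri_cut(a, m, c, alpha):
    """alpha-cut [lower, upper] of a triangular fuzzy number (a, m, c)."""
    return a + alpha * (m - a), c - alpha * (c - m)

def mean_system_size(lam, mu, b):
    rho = lam * b / mu
    if rho >= 1.0:
        raise ValueError("unstable: rho >= 1")
    return rho * (b + 1) / (2.0 * (1.0 - rho))

b = 3
lam_tri = (0.8, 1.0, 1.2)    # fuzzy batch arrival rate
mu_tri  = (4.0, 5.0, 6.0)    # fuzzy service rate

for alpha in (0.0, 0.25, 0.5, 0.75, 1.0):
    lam_lo, lam_hi = tri_cut(*lam_tri, alpha)
    mu_lo,  mu_hi  = tri_cut(*mu_tri,  alpha)
    # L is increasing in lambda and decreasing in mu, so the interval
    # endpoints of each alpha-cut give the bounds (Zadeh's extension principle).
    L_lo = mean_system_size(lam_lo, mu_hi, b)
    L_hi = mean_system_size(lam_hi, mu_lo, b)
    print(f"alpha={alpha:.2f}  L in [{L_lo:.3f}, {L_hi:.3f}]")
```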
Cointegration is one of the important concepts in applied macroeconomics. The idea is due to Granger (1981) and was set out in detail by Engle and Granger in Econometrica (1987). The introduction of cointegration analysis into econometrics in the mid-1980s is one of the most important developments in empirical modelling, and its advantage is simplicity of computation and use: applying it requires little more than familiarity with ordinary least squares.
Cointegration describes equilibrium relations among time series in the long run, even if all of the individual series contain trends.
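The two-step idea can be sketched with nothing more than OLS and a unit-root test on the residuals, as below. The simulated series are illustrative, and in practice the residual test should use Engle-Granger (MacKinnon) critical values, e.g. via statsmodels' coint().

```python
# Minimal sketch of the two-step Engle-Granger procedure: (1) estimate the
# long-run relation by OLS, (2) test the residuals for a unit root.
# Simulated series are illustrative only.
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(1)
T = 400
x = np.cumsum(rng.normal(size=T))            # a random walk (I(1) series)
y = 2.0 + 0.5 * x + rng.normal(size=T)       # shares x's stochastic trend

# Step 1: estimate the long-run (equilibrium) relation by OLS.
ols = sm.OLS(y, sm.add_constant(x)).fit()
residuals = ols.resid

# Step 2: test the residuals for stationarity; stationary residuals indicate
# cointegration. (Strictly, Engle-Granger critical values should be used.)
adf_stat, p_value, *_ = adfuller(residuals)
print("long-run coefficients:", ols.params)
print("ADF statistic on residuals:", round(adf_stat, 3), " p-value:", round(p_value, 4))
```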
The logistic regression model is regarded as one of the important regression models and has been among the most interesting subjects of recent studies, owing to its increasingly advanced role in statistical analysis.
Ordinary estimation methods fail when the data contain outlier values, which have an undesirable effect on the results.
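The sketch below illustrates the problem rather than the paper's robust estimators: an ordinary maximum-likelihood logistic fit is computed on data containing a few deliberately mislabelled points, and observations with large Pearson residuals are flagged as potential outliers. The simulated data and the 2.5 cut-off are illustrative assumptions.

```python
# Minimal sketch: ordinary maximum-likelihood logistic regression on data
# with a few contaminated responses, flagging large Pearson residuals.
# The simulated data and the cut-off of 2.5 are illustrative only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 200
x = rng.normal(size=n)
p = 1.0 / (1.0 + np.exp(-(0.5 + 2.0 * x)))
y = rng.binomial(1, p)
y[:5] = 1 - y[:5]                 # contaminate a few responses (outliers)

model = sm.GLM(y, sm.add_constant(x), family=sm.families.Binomial()).fit()
pearson = model.resid_pearson     # residuals standardized by the fitted variance

flagged = np.where(np.abs(pearson) > 2.5)[0]
print("fitted coefficients:", model.params)
print("observations flagged as potential outliers:", flagged)
```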
In recent years, social media has grown widely and visibly as a medium through which users express their emotions and feelings, in thousands of posts and comments related to tourism companies. As a consequence, it has become difficult for tourists to read all the comments in order to determine whether these opinions are positive or negative and so assess the success of a tourism company. In this paper, a modest model is proposed to assess e-tourism companies using Iraqi-dialect reviews collected from Facebook. The reviews are analyzed using text-mining techniques for sentiment classification. The generated sentiment words are classified into positive, negative and neutral comments by utilizing Rough Set Theory, Naïve Bayes and K-Nearest Neighbor classifiers.
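A minimal sketch of the classification step is given below, using bag-of-words features with Naïve Bayes and k-NN from scikit-learn on a tiny toy corpus. The English placeholder reviews stand in for the Iraqi-dialect Facebook comments, and the Rough Set classifier is not reproduced.

```python
# Minimal sketch of the sentiment-classification step: bag-of-words features
# feeding Naive Bayes and k-NN classifiers on a tiny toy corpus. The reviews
# below are placeholders for the Iraqi-dialect Facebook data.
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.neighbors import KNeighborsClassifier

reviews = [
    "excellent service and friendly staff",
    "the trip was wonderful and well organised",
    "terrible experience, the hotel was dirty",
    "very bad company, they cancelled our booking",
    "it was ok, nothing special",
    "average prices and average service",
]
labels = ["positive", "positive", "negative", "negative", "neutral", "neutral"]

for name, clf in [("Naive Bayes", MultinomialNB()),
                  ("k-NN", KNeighborsClassifier(n_neighbors=3))]:
    pipe = make_pipeline(CountVectorizer(), clf)
    pipe.fit(reviews, labels)
    print(name, "->", pipe.predict(["the staff were wonderful"]))
```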
The two parameters of the Exponential-Rayleigh distribution were estimated using the maximum likelihood estimation (MLE) method for progressively censored data. Estimates of the two scale parameters were obtained from real COVID-19 data taken from the Iraqi Ministry of Health and Environment, AL-Karkh General Hospital. A Chi-square test was then used to determine whether the sample data conformed to the Exponential-Rayleigh (ER) distribution. A nonlinear membership function (s-function) was employed to construct fuzzy numbers for these parameter estimators, and a ranking function was then used to transform the fuzzy numbers into crisp numbers. Finally, the mean square error (MSE) was used to compare the outcomes of the survival
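The estimation step can be sketched as numerical maximization of the progressively Type-II censored log-likelihood, Σ[log f(x_i) + R_i log S(x_i)]. The Exponential-Rayleigh form assumed below, S(t) = exp(a(1 − e^{λt²/2})), is one common parameterization chosen for illustration and may differ from the paper's; the failure times and removal counts R_i are made-up toy values, not the COVID-19 data.

```python
# Minimal sketch of MLE under progressive Type-II censoring. The assumed
# Exponential-Rayleigh survival function S(t) = exp(a*(1 - exp(l*t^2/2)))
# is an illustrative parameterization; times and removal counts are toy values.
import numpy as np
from scipy.optimize import minimize

times = np.array([0.4, 0.9, 1.1, 1.6, 2.0, 2.3, 2.9, 3.4])   # ordered failure times
R     = np.array([1,   0,   2,   0,   1,   0,   0,   3])     # units removed at each failure

def neg_log_likelihood(params):
    a, l = params
    if a <= 0 or l <= 0:
        return np.inf
    z = np.exp(l * times**2 / 2.0)
    log_f = np.log(a) + np.log(l) + np.log(times) + l * times**2 / 2.0 + a * (1.0 - z)
    log_S = a * (1.0 - z)
    return -np.sum(log_f + R * log_S)

res = minimize(neg_log_likelihood, x0=[0.5, 0.5], method="Nelder-Mead")
a_hat, l_hat = res.x
print("MLEs:", a_hat, l_hat)
print("fitted survival at t=2:", np.exp(a_hat * (1.0 - np.exp(l_hat * 2.0**2 / 2.0))))
```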
Precision is one of the main elements that control the quality of a geodetic network; it is defined as a measure of the network's efficiency in propagating random errors. This research aims to solve the ZOD (zero-order design) and FOD (first-order design) problems for a geodetic network using the Rosenbrock method, implemented in the MATLAB programming language, in order to find the optimal network design with high precision. The ZOD problem was applied to a case-study network consisting of 19 points and 58 designed distances with an a priori standard deviation of 5 mm, to determine the best points in the network to serve as control points. The results showed that P55 and P73 have the minimum error ellipses and were therefore chosen as control points. The FOD problem was applied
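The FOD idea can be sketched as choosing observation weights that minimize the trace of the coordinate covariance matrix (AᵀPA)⁻¹ under a fixed weight budget. In the sketch below, a tiny levelling network with one fixed benchmark and two unknown heights stands in for the 19-point network of the study, and scipy's Nelder-Mead search is used as a simple derivative-free stand-in for the Rosenbrock method applied in the paper.

```python
# Minimal sketch of first-order design (FOD): choose observation weights that
# minimize trace((A^T P A)^-1) for a fixed weight budget. Nelder-Mead is used
# here as a derivative-free stand-in for the Rosenbrock search; the tiny
# levelling network is an illustrative toy example.
import numpy as np
from scipy.optimize import minimize

# Design matrix for observations BM->P1, P1->P2, BM->P2; unknowns (H_P1, H_P2).
A = np.array([[ 1.0, 0.0],
              [-1.0, 1.0],
              [ 0.0, 1.0]])

def trace_of_covariance(w):
    w = np.abs(w)
    w = w / w.sum() * 3.0              # normalize to a total weight budget of 3
    P = np.diag(w)
    N = A.T @ P @ A                    # normal-equation matrix
    return np.trace(np.linalg.inv(N))  # A-optimality (trace) criterion

res = minimize(trace_of_covariance, x0=np.array([1.0, 1.0, 1.0]),
               method="Nelder-Mead")
w_opt = np.abs(res.x) / np.abs(res.x).sum() * 3.0
print("optimal relative weights:", np.round(w_opt, 3))
print("minimum trace of Q_x:", round(res.fun, 4))
```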