Predicting vertical stress is useful for managing geomechanical problems, since it allows the pore pressure of a formation to be computed and fault regimes to be classified. This study provides an in-depth examination of vertical stress prediction using several approaches implemented in the Techlog 2015 software. Gardner's method yields incorrect vertical stress values because it does not start from the surface and relies only on sonic log data. The Amoco, Wendt non-acoustic, Traugott, and average techniques require only the density log as input, but they represent the observed density as a straight line, which is inaccurate for computing vertical stress; their results show that they effectively replace the real density with an average. The extrapolated density method has a much better gradient at shallow depths, while the Miller density method fits the real density very well at great depths. Calculating vertical stress has been important for the past 40 years because pore pressure calculations and geomechanical model building both use it as an input, and bulk density is the strongest predictor of vertical stress. According to these results, the Miller and extrapolated techniques are the two best methods for determining vertical stress; still, the extrapolated method has a better gradient at shallow depths than the Miller method. The extrapolated density approach produces satisfactory vertical stress results, whereas the Miller values are lower than those obtained by extrapolation, likely because of that method's poor gradient at shallow depths. Gardner's approach incorrectly shows minimum values of about 4,000 psi at great depths, whereas the other methods give values similar to one another because they use a constant bulk density from the surface down to the target depth, which is also incorrect.
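The core calculation behind all of these methods is the same: vertical (overburden) stress is the integral of bulk density times gravity from the surface to the depth of interest. A minimal sketch of that integration, with a purely illustrative depth/density profile (the values are placeholders, not the study's well data):

```python
import numpy as np

# Hypothetical depth/density profile (illustrative values only,
# not taken from the study's well data).
depth_m = np.array([0.0, 500.0, 1000.0, 1500.0, 2000.0])   # depth below surface, m
rho_gcc = np.array([1.9, 2.1, 2.25, 2.4, 2.5])             # bulk density, g/cc

def vertical_stress_psi(depth_m, rho_gcc):
    """Integrate bulk density from the surface downward:
    sigma_v(z) = integral of rho(z') * g dz', converted from Pa to psi."""
    g = 9.81                                  # m/s^2
    rho_si = rho_gcc * 1000.0                 # g/cc -> kg/m^3
    dz = np.diff(depth_m)
    # cumulative trapezoidal integration, in Pa
    incr = 0.5 * (rho_si[:-1] + rho_si[1:]) * g * dz
    sigma_pa = np.concatenate([[0.0], np.cumsum(incr)])
    return sigma_pa / 6894.76                 # Pa -> psi

sv = vertical_stress_psi(depth_m, rho_gcc)
```

This also makes the paper's point concrete: a method whose density curve does not start at the surface (Gardner) is missing the shallow part of the integral, which biases the stress downward at every depth below.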
The need to create an optimal water quality management process has motivated researchers to pursue the development of prediction models. One widely used forecasting model is the seasonal autoregressive integrated moving average (SARIMA) model. In the present study, a SARIMA model was developed in R software to fit time series data of monthly fluoride content collected from six stations on the Tigris River for the period from 2004 to 2014. The adequate SARIMA model, having the least Akaike's information criterion (AIC) and mean squared error (MSE), was found to be SARIMA (2,0,0)(0,1,1). The model parameters were identified and diagnosed to derive the forecasting equations at each selected location. The correlation coefficient …
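For monthly data with seasonal period 12, the SARIMA (2,0,0)(0,1,1) model expands to (1 − φ₁B − φ₂B²)(1 − B¹²)yₜ = (1 + Θ₁B¹²)εₜ, which gives the one-step forecast equation sketched below. The coefficient values in the usage example are placeholders, not the study's estimates:

```python
def sarima_forecast(y, eps, phi1, phi2, Theta1, t):
    """One-step forecast of SARIMA(2,0,0)(0,1,1) with seasonal period 12,
    expanded from (1 - phi1*B - phi2*B^2)(1 - B^12) y_t = (1 + Theta1*B^12) e_t.
    Needs at least 14 past observations and 12 past shocks."""
    return (phi1 * y[t - 1] + phi2 * y[t - 2]
            + y[t - 12] - phi1 * y[t - 13] - phi2 * y[t - 14]
            + Theta1 * eps[t - 12])

# Sanity check: a constant history with zero shocks forecasts the same constant.
history = [3.0] * 20
shocks = [0.0] * 20
forecast = sarima_forecast(history, shocks, 0.5, 0.2, -0.4, 15)
```

In practice the coefficients would come from the fitted model (in R, e.g. via `arima` with a seasonal order); the expansion just shows how the seasonal difference and seasonal MA term enter the forecasting equation.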
Peak ground acceleration (PGA) is one of the critical factors that affect the determination of earthquake intensity. PGA is generally used to describe ground motion in a particular zone and can efficiently predict site ground motion parameters for the design of engineering structures. Therefore, novel models are developed to forecast PGA for the Iraqi database using the particle swarm optimization (PSO) approach. A data set of 187 historical ground-motion recordings in Iraq's tectonic regions was used to build the explicit proposed models. The proposed PGA models relate to different seismic parameters, including the magnitude of the earthquake (Mw), average shear-wave velocity (VS30), and focal depth (FD) …
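A minimal sketch of how PSO can calibrate the coefficients of an explicit PGA model against recorded data. The functional form, coefficient values, and synthetic records below are all assumptions for illustration; they are not the paper's final model or the 187 Iraqi recordings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative functional form (not the paper's published equation):
# ln(PGA) = c0 + c1*Mw + c2*ln(VS30) + c3*FD
def model(c, X):
    return c[0] + c[1] * X[:, 0] + c[2] * np.log(X[:, 1]) + c[3] * X[:, 2]

def pso(loss, dim, n=30, iters=200, w=0.7, c1=1.5, c2=1.5, bound=5.0):
    """Basic global-best particle swarm optimization."""
    pos = rng.uniform(-bound, bound, (n, dim))
    vel = np.zeros((n, dim))
    pbest = pos.copy()
    pbest_f = np.array([loss(p) for p in pos])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        f = np.array([loss(p) for p in pos])
        better = f < pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# Synthetic records standing in for the ground-motion database: Mw, VS30, FD
X = np.column_stack([rng.uniform(4, 7, 100),
                     rng.uniform(200, 800, 100),
                     rng.uniform(5, 30, 100)])
y = model(np.array([-2.0, 0.9, -0.3, -0.02]), X)   # synthetic "observed" ln(PGA)
best, err = pso(lambda c: np.mean((model(c, X) - y) ** 2), dim=4)
```

The swarm searches the coefficient space directly, which is why PSO suits explicit attenuation-style models whose form is fixed but whose coefficients must be fitted to regional data.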
Intelligent home load prediction based on deep convolutional neural networks is a method of predicting the electricity load of a home using deep learning techniques. This method uses convolutional neural networks to analyze data from various sources, such as weather, time of day, and other factors, to accurately predict the electricity load of a home, with the aim of optimizing energy usage and reducing energy costs. The article proposes a deep learning-based approach for non-permanent residential electrical energy load forecasting that employs temporal convolutional networks (TCN) to model historical load collections with time-series traits and to learn highly dynamic patterns of variation among attribute par…
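The building block of a TCN is the dilated causal convolution: each output depends only on current and past inputs, and stacking layers with growing dilation widens the receptive field exponentially. A small numpy sketch (layer widths and weights are illustrative, not from the article):

```python
import numpy as np

def causal_dilated_conv(x, w, dilation):
    """y[t] = sum_i w[i] * x[t - i*dilation]; depends only on past/current x."""
    k = len(w)
    pad = dilation * (k - 1)
    xp = np.concatenate([np.zeros(pad), np.asarray(x, dtype=float)])
    return np.array([sum(w[i] * xp[t + pad - i * dilation] for i in range(k))
                     for t in range(len(x))])

# Stack three layers with dilations 1, 2, 4: receptive field of 8 time steps.
x = np.array([0., 0., 0., 1., 0., 0., 0., 0.])        # impulse in the load series
h1 = causal_dilated_conv(x, [0.5, 0.5], dilation=1)
h2 = causal_dilated_conv(h1, [0.5, 0.5], dilation=2)
h3 = causal_dilated_conv(h2, [0.5, 0.5], dilation=4)
```

Causality is what makes this suitable for forecasting: the impulse at step 3 can only influence outputs at step 3 and later, never earlier ones, so the network never "sees" future load values during training.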
In this paper, we investigate the automatic recognition of emotion in text. We perform experiments with a new classification method based on the PPM character-based text compression scheme. These experiments involve both coarse-grained classification (whether a text is emotional or not) and fine-grained classification, such as recognising Ekman's six basic emotions (Anger, Disgust, Fear, Happiness, Sadness, Surprise). Experimental results with three datasets show that the new method significantly outperforms traditional word-based text classification methods. The results show that the PPM compression-based classification method is able to distinguish between emotional and non-emotional text with high accuracy, between texts invo…
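The idea behind compression-based classification is that a document compresses best against the class corpus it statistically resembles, since the extra compressed bits approximate the cross-entropy of the document under that class's model. A minimal sketch of the principle using zlib as a stand-in compressor (the paper uses PPM, which is not in the Python standard library; corpora and labels below are toy examples):

```python
import zlib

def compressed_size(text):
    return len(zlib.compress(text.encode("utf-8"), 9))

def classify(doc, class_corpora):
    """Assign doc to the class whose corpus it compresses best against:
    the smallest increase in compressed size is a proxy for the lowest
    cross-entropy under that class's statistical model."""
    best_label, best_cost = None, float("inf")
    for label, corpus in class_corpora.items():
        cost = compressed_size(corpus + " " + doc) - compressed_size(corpus)
        if cost < best_cost:
            best_label, best_cost = label, cost
    return best_label

corpora = {
    "happy": "joy joy happy glad smile delight " * 20,
    "angry": "rage furious angry mad shout " * 20,
}
label = classify("happy glad joy", corpora)
```

Because the model works on character statistics rather than tokenized words, it needs no feature engineering, which is part of why it can outperform word-based classifiers on noisy emotional text.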
... Show MoreIn this study, genetic algorithm was used to predict the reaction kinetics of Iraqi heavy naphtha catalytic reforming process located in Al-Doura refinery in Baghdad. One-dimensional steady state model was derived to describe commercial catalytic reforming unit consisting of four catalytic reforming reactors in series process.
The experimental information (reformate composition and outlet temperature) for each of the four reactors, collected at different operating conditions, was used to estimate the parameters of the proposed kinetic model. The kinetic model involves 24 components, with 1 to 11 carbon atoms for paraffins and 6 to 11 carbon atoms for naphthenes and aromatics, and 71 reactions. The pre-exponential Arrhenius constants and a…
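A genetic algorithm estimates kinetic parameters by evolving a population of candidate parameter sets toward the best fit between predicted and observed data. A minimal sketch for a single Arrhenius rate constant k = A·exp(−Ea/RT); the temperatures, "true" parameters, and GA settings are illustrative, not the plant data or the study's 71-reaction model:

```python
import numpy as np

rng = np.random.default_rng(1)
R = 8.314                                   # gas constant, J/(mol*K)

# Synthetic "observed" rate constants (illustrative, not the refinery data)
T = np.linspace(700.0, 800.0, 8)            # reactor temperatures, K
A_true, Ea_true = 1e5, 120e3                # pre-exponential factor, activation energy
k_obs = A_true * np.exp(-Ea_true / (R * T))

def loss(p):
    """Misfit between predicted and observed rate constants, in log space."""
    lnA, Ea = p
    ln_k = lnA - Ea / (R * T)
    return np.mean((ln_k - np.log(k_obs)) ** 2)

def ga(loss, bounds, pop=60, gens=200):
    """Simple real-coded GA: elitist selection, averaging crossover,
    Gaussian mutation with a decaying scale."""
    lo, hi = np.array(bounds, dtype=float).T
    P = rng.uniform(lo, hi, (pop, len(lo)))
    for g in range(gens):
        f = np.array([loss(p) for p in P])
        elite = P[np.argsort(f)][: pop // 2]          # keep the fitter half
        pa = elite[rng.integers(0, len(elite), pop // 2)]
        pb = elite[rng.integers(0, len(elite), pop // 2)]
        sigma = 0.1 * (hi - lo) * 0.97 ** g           # decaying mutation scale
        P = np.vstack([elite, 0.5 * (pa + pb) + rng.normal(0, sigma, pa.shape)])
    f = np.array([loss(p) for p in P])
    return P[f.argmin()], f.min()

best_params, err = ga(loss, bounds=[(0.0, 20.0), (50e3, 200e3)])
```

The same machinery scales to the full model by extending the parameter vector to all pre-exponential constants and activation energies and replacing the loss with the reactor model's misfit against the measured reformate compositions and outlet temperatures.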
Lung cancer is one of the most serious and prevalent diseases, causing many deaths each year. Though CT scan images are mostly used in the diagnosis of cancer, the assessment of scans is an error-prone and time-consuming task. Machine learning and AI-based models can identify and classify types of lung cancer quite accurately, which helps in the early-stage detection of lung cancer and can increase the survival rate. In this paper, a Convolutional Neural Network (CNN) is used to classify adenocarcinoma, squamous cell carcinoma, and normal-case CT scan images from the Chest CT Scan Images Dataset using different combinations of hidden layers and parameters in CNN models. The proposed model was trained on 1000 CT scan images of cancerous and non-cancerous …
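The hidden layers a CNN classifier stacks are combinations of three primitive operations: convolution, a nonlinearity, and pooling. A numpy sketch of one such layer applied to a stand-in grayscale slice (the input, kernel, and sizes are illustrative, not the paper's architecture or dataset):

```python
import numpy as np

rng = np.random.default_rng(2)

def conv2d(img, kern):
    """Valid-mode 2-D correlation, the core CNN operation."""
    kh, kw = kern.shape
    H, W = img.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kern)
    return out

def relu(x):
    return np.maximum(x, 0.0)               # nonlinearity

def maxpool2(x):
    """2x2 max pooling: halves each spatial dimension."""
    H, W = x.shape
    return x[: H // 2 * 2, : W // 2 * 2].reshape(H // 2, 2, W // 2, 2).max(axis=(1, 3))

slice_img = rng.random((64, 64))            # stand-in for one grayscale CT slice
edge_kernel = np.array([[1., 0., -1.],
                        [2., 0., -2.],
                        [1., 0., -1.]])      # Sobel-style vertical-edge detector
fmap = maxpool2(relu(conv2d(slice_img, edge_kernel)))
```

Varying how many such layers are stacked and how many kernels each layer learns is exactly the "different combinations of hidden layers and parameters" the paper compares; the final feature maps are flattened and fed to a classifier head that outputs the three class scores.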
The research aims to identify the possibility of applying environmental fines to commercial shops and restaurants to reduce the environmental pollution represented by the wastes they generate. The research sample was divided into two groups: the first included (20) commercial shops (meat and slaughter shops, fruits and vegetables, legumes and accessories) and the second (30) restaurants in the city of Baghdad on both the Karkh and Rusafa sides. The waste was classified into cardboard, plastic, aluminum, glass, paper, cork, and food waste. The study revealed the possibility of applying environmental fines to restaurants and shops to reduce the waste generated from them throughout the year and to apply continuous monitoring …