The shear strength of soil is one of the most important soil properties to identify before any foundation design, and the presence of gypseous soil exacerbates foundation problems. In this research, an approach to forecasting the shear strength parameters of gypseous soils from basic soil properties was developed using Artificial Neural Networks. Two models were built, one to forecast cohesion and one to forecast the angle of internal friction. Nine basic soil properties were used as inputs to both models, since they were considered to have the most significant impact on soil shear strength: depth, gypsum content, percentage passing sieve No. 200, liquid limit, plastic limit, plasticity index, water content, dry unit weight, and initial void ratio. A multi-layer perceptron trained with the backpropagation algorithm was used to create each network. Both models were found to predict the shear strength parameters of gypseous soils with good reliability. Sensitivity analysis of the first model indicated that dry unit weight and plasticity index have the most significant effect on the predicted cohesion, while in the second model the results indicated that gypsum content and plasticity index have the most significant effect on the predicted angle of internal friction.
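A multi-layer perceptron trained by backpropagation, as described above, can be sketched as follows. This is a minimal illustration, not the study's model: the 9 "soil property" inputs and the target values are synthetic placeholders, and the hidden-layer size and learning rate are assumed.

```python
import numpy as np

# Synthetic stand-ins for the 9 basic soil properties and a cohesion target.
rng = np.random.default_rng(0)
X = rng.random((200, 9))                 # 9 normalized input features
w_true = rng.random(9)
y = (X @ w_true).reshape(-1, 1)          # placeholder target values

W1 = rng.normal(0, 0.5, (9, 8)); b1 = np.zeros(8)   # hidden layer (8 units)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)   # linear output layer

def forward(X):
    h = np.tanh(X @ W1 + b1)             # hidden activations
    return h, h @ W2 + b2                # linear output for regression

mse0 = float(((forward(X)[1] - y) ** 2).mean())      # error before training

lr = 0.05
for _ in range(2000):
    h, pred = forward(X)
    err = pred - y                                   # output-layer error
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)                 # backprop through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2                   # gradient-descent update
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(((forward(X)[1] - y) ** 2).mean())       # error after training
```

The training loop shrinks the mean squared error substantially on this toy data; the paper's actual models would be trained on the measured soil dataset.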
In this study, gamma-ray spectrometry with an HPGe detector was used to measure the specific activity concentrations of 226Ra, 232Th, and 40K in soil samples collected from IT1 oil reservoirs in Kirkuk city, northeast Iraq. The "spectral line Gp" gamma analysis software package was used to analyze the spectral data. The 226Ra specific activity varies from 9 ± 0.34 Bq·kg⁻¹ to 17 ± 0.47 Bq·kg⁻¹, the 232Th specific activity from 6.2 ± 0.08 Bq·kg⁻¹ to 18 ± 0.2 Bq·kg⁻¹, and the 40K specific activity from 25 ± 0.19 Bq·kg⁻¹ to 118 ± 0.41 Bq·kg⁻¹. The radiological hazard due to the radiation emitted from natural r
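The sentence above is cut off before the hazard assessment; one commonly used single index for such data is the radium-equivalent activity. The sketch below applies it to the extreme concentrations reported above (pairing the minima, and the maxima, across radionuclides is only illustrative, not an actual sample).

```python
def radium_equivalent(c_ra, c_th, c_k):
    """Ra-eq (Bq/kg): weights 232Th and 40K against 226Ra by dose contribution."""
    return c_ra + 1.43 * c_th + 0.077 * c_k

ra_eq_low = radium_equivalent(9, 6.2, 25)     # lowest reported activities
ra_eq_high = radium_equivalent(17, 18, 118)   # highest reported activities
# Both fall far below the 370 Bq/kg criterion commonly used with this index.
```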
Permeability estimation is a vital step in reservoir engineering because it affects reservoir characterization, planning for perforations, and the economic efficiency of a reservoir. Core and well-logging data are the main sources for measuring and calculating permeability, respectively. There are multiple methods to predict permeability, such as classical, empirical, and geostatistical methods. In this research, two statistical approaches were applied and compared for permeability prediction, Multiple Linear Regression and Random Forest, using the (M) reservoir interval in the (BH) Oil Field in the northern part of Iraq. The dataset was separated into two subsets, training and testing, in order to cross-validate the accuracy
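The Multiple Linear Regression half of such a comparison, with a train/test split, can be sketched as below. The four log-derived features and the permeability proxy are synthetic placeholders, not the (M) reservoir data, and the 90/30 split is an assumed choice.

```python
import numpy as np

# Synthetic stand-ins for log-derived features and a permeability proxy.
rng = np.random.default_rng(1)
X = rng.random((120, 4))                          # e.g. porosity and log readings
coef_true = np.array([3.0, -1.0, 0.5, 2.0])
y = X @ coef_true + 0.1 + rng.normal(0, 0.05, 120)

split = 90                                        # 90 training / 30 testing rows
Xtr, Xte, ytr, yte = X[:split], X[split:], y[:split], y[split:]

A = np.c_[Xtr, np.ones(len(Xtr))]                 # design matrix with intercept
coef, *_ = np.linalg.lstsq(A, ytr, rcond=None)    # least-squares fit

pred = np.c_[Xte, np.ones(len(Xte))] @ coef       # predict on held-out rows
r2 = 1 - ((yte - pred) ** 2).sum() / ((yte - yte.mean()) ** 2).sum()
```

The held-out R² score is the kind of accuracy measure a train/test split supports; the Random Forest half of the comparison would be scored the same way.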
Permeability is the most important parameter indicating how efficiently reservoir fluids flow through the rock pores to the wellbore. Well-log evaluation and core measurement techniques are typically used to estimate it. In this paper, permeability has been predicted using the classical method and the Flow Zone Indicator (FZI) method. A comparison between the two methods shows the superiority of the FZI method correlations; these correlations can be used to estimate permeability in un-cored wells with a good approximation.
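The FZI workflow can be sketched as follows: compute FZI from cored intervals using the standard Reservoir Quality Index formulation, then invert the relation to estimate permeability where no core exists. The sample core values below are illustrative, not from the paper.

```python
import numpy as np

def fzi_from_core(k_md, phi):
    """FZI (µm) from core permeability (mD) and porosity (fraction)."""
    rqi = 0.0314 * np.sqrt(k_md / phi)   # Reservoir Quality Index, µm
    phi_z = phi / (1.0 - phi)            # normalized porosity (pore/grain ratio)
    return rqi / phi_z

def permeability_from_fzi(fzi, phi):
    """Permeability (mD) from an assigned rock-type FZI and log-derived porosity."""
    return 1014.24 * fzi ** 2 * phi ** 3 / (1.0 - phi) ** 2

k_core, phi = 150.0, 0.22                # illustrative cored-plug measurement
fzi = fzi_from_core(k_core, phi)
k_est = permeability_from_fzi(fzi, phi)  # round-trips back to ~150 mD
```

In practice, an average FZI per hydraulic flow unit is assigned to un-cored intervals, and the second function converts log porosity to permeability.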
The open-hole well log data (resistivity, sonic, and gamma ray) of well X in the Euphrates subzone within the Mesopotamian basin were applied to detect the total organic carbon (TOC) of the Zubair Formation in the southern part of Iraq. Mathematical interpretation of the log parameters helped in detecting the TOC and the source rock productivity, and quantitative interpretation of the log data led to identifying the organic content and the source rock intervals. The response of the logs to increasing TOC can be detected through the log parameters: in this way, TOC can be predicted from an increase in the gamma-ray, sonic, neutron, and resistivity readings, together with a decrease in the density log.
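One widely used way to turn the resistivity/sonic response into a TOC estimate is Passey's ΔlogR method; the abstract does not name the exact scheme used, so this is only an illustrative sketch, and the baseline readings and maturity level (LOM) below are assumed values.

```python
import math

def delta_log_r(rt, dt, rt_base, dt_base):
    """Curve separation between deep resistivity (ohm·m) and sonic (µs/ft),
    relative to a non-source baseline interval."""
    return math.log10(rt / rt_base) + 0.02 * (dt - dt_base)

def toc_passey(dlr, lom):
    """TOC (wt%) from ΔlogR and level of organic maturity (LOM)."""
    return dlr * 10 ** (2.297 - 0.1688 * lom)

# Illustrative readings: resistivity and sonic both elevated over baseline.
dlr = delta_log_r(rt=20.0, dt=95.0, rt_base=5.0, dt_base=80.0)
toc = toc_passey(dlr, lom=9.0)
```

The curve separation grows with organic richness, which matches the log responses described above.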
This article proposes a new technique for secret image transmission. First, a generative adversarial network (GAN) parallel processing technique is constructed and trained using real and secret images. Then, after the model stabilizes, the real image is passed to the generator. Finally, the generator creates an image that is visually similar to the secret image, thus achieving the same effect as transmitting the secret image. Experimental results show that this technique secures the transmission of secret information and increases information-hiding capacity. A signal-to-noise metric and the structural similarity index measure were used to determine the success of colour image-hiding t
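The two quality metrics named above can be sketched as follows, here as PSNR and a single-window SSIM applied to synthetic 8-bit images; the paper's exact metric implementations are not specified, so this is an assumed formulation.

```python
import numpy as np

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB between two images."""
    mse = np.mean((a.astype(float) - b.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(peak ** 2 / mse)

def ssim_global(a, b, peak=255.0):
    """SSIM over the whole image (no sliding window), standard constants."""
    c1, c2 = (0.01 * peak) ** 2, (0.03 * peak) ** 2
    a, b = a.astype(float), b.astype(float)
    ma, mb = a.mean(), b.mean()
    cov = ((a - ma) * (b - mb)).mean()
    return ((2 * ma * mb + c1) * (2 * cov + c2)) / (
        (ma ** 2 + mb ** 2 + c1) * (a.var() + b.var() + c2))

# Synthetic 8-bit grayscale image and a lightly degraded copy.
rng = np.random.default_rng(2)
img = rng.integers(0, 256, (64, 64))
noisy = np.clip(img + rng.normal(0, 5, (64, 64)), 0, 255)
```

Higher PSNR and SSIM values indicate that the generated cover image is closer to its reference, which is how hiding quality is typically judged.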
Research on the automated extraction of essential data from electrocardiography (ECG) recordings has been a significant topic for a long time. The main focus of digital processing is to measure the fiducial points that determine the beginning and end of the P, QRS, and T waves based on their waveform properties. The presence of unavoidable noise during ECG data collection and inherent physiological differences among individuals make it challenging to accurately identify these reference points, resulting in suboptimal performance. In this work, this is addressed through several primary stages based on preliminary processing of the ECG electrical signal through a set of steps (preparing raw data and converting them into files tha
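A minimal sketch of the fiducial-point idea for R-peaks is to square a differenced signal, smooth its energy, and threshold rising edges; this is a generic illustration, not the paper's pipeline, and the "ECG" below is a synthetic spike train on baseline wander rather than a real recording.

```python
import numpy as np

fs = 250                                   # sampling rate, Hz (assumed)
t = np.arange(0, 4, 1 / fs)
ecg = 0.1 * np.sin(2 * np.pi * t)          # slow baseline wander
true_peaks = np.arange(100, len(t), 250)   # one simulated beat per second
ecg[true_peaks] += 1.0                     # add sharp R-spikes

energy = np.convolve(np.diff(ecg) ** 2,            # emphasize sharp slopes ...
                     np.ones(10) / 10, mode="same")  # ... then smooth energy
thr = 0.3 * energy.max()                   # simple adaptive threshold
above = energy > thr
onsets = np.flatnonzero(above[1:] & ~above[:-1])   # rising edges = beats
n_beats = len(onsets)
```

Real detectors add band-pass filtering and refractory logic, but the derivative–square–smooth–threshold core shown here is the standard starting point.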
When optimizing the performance of neural-network-based chatbots, the choice of optimizer is one of the most important decisions. Optimizers control the adjustment of model parameters, such as weights and biases, to minimize a loss function during training. Adaptive optimizers such as ADAM have become a standard choice and are widely used because the magnitudes of their parameter updates are invariant to variations in gradient scale, but they often pose generalization problems. Alternatively, Stochastic Gradient Descent (SGD) with Momentum and the extension of ADAM, ADAMW, offer several advantages. This study aims to compare and examine the effects of these optimizers on the chatbot CST dataset. The effectiveness of each optimizer is evaluat
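The three update rules being compared can be sketched on the same toy problem, a 1-D quadratic loss L(w) = (w − 3)², so their behavior is easy to check; hyperparameters here are illustrative, not the study's settings. Note the difference between ADAM (L2 penalty folded into the gradient) and ADAMW (weight decay applied directly to the parameter).

```python
def grad(w):
    return 2.0 * (w - 3.0)             # dL/dw for L(w) = (w - 3)^2

def run_sgd_momentum(steps=200, lr=0.1, beta=0.9):
    w = v = 0.0
    for _ in range(steps):
        v = beta * v + grad(w)         # accumulate velocity
        w -= lr * v
    return w

def run_adam(steps=400, lr=0.05, b1=0.9, b2=0.999, eps=1e-8,
             wd=0.0, decoupled=False):
    w = m = v = 0.0
    for t in range(1, steps + 1):
        g = grad(w) + (0.0 if decoupled else wd * w)  # ADAM folds L2 into g
        m = b1 * m + (1 - b1) * g                     # first-moment estimate
        v = b2 * v + (1 - b2) * g * g                 # second-moment estimate
        mhat, vhat = m / (1 - b1 ** t), v / (1 - b2 ** t)  # bias correction
        w -= lr * mhat / (vhat ** 0.5 + eps)
        if decoupled:                  # ADAMW decays the weight directly
            w -= lr * wd * w
    return w

w_sgd = run_sgd_momentum()
w_adam = run_adam()
w_adamw = run_adam(wd=0.01, decoupled=True)
```

All three runs approach the minimum at w = 3; on real chatbot training the differences show up in generalization rather than on a convex toy loss.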
In this work, the authors design efficient neural networks based on modified Levenberg-Marquardt (LM) training algorithms to solve non-linear fourth-order three-dimensional partial differential equations of two kinds, periodic and non-periodic.

Software reliability growth models are essential tools for monitoring and evaluating the evolution of software reliability. Software defect detection events that occur during testing and operation are often treated as counting processes in many current models. However, when working with large software systems, the error detection process should be viewed as a random process with a continuous state space, since the number of faults found during testin
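The core of a Levenberg-Marquardt iteration is the damped Gauss-Newton step delta = (JᵀJ + λI)⁻¹ Jᵀr with adaptive damping λ. The sketch below applies it to a small curve-fitting problem, y = a·exp(b·x), as a stand-in; the paper's modified LM variants for PDE training are not reproduced here.

```python
import numpy as np

# Noiseless synthetic data with ground truth a = 2.0, b = 1.5.
x = np.linspace(0, 1, 50)
y = 2.0 * np.exp(1.5 * x)

def residuals(p):
    a, b = p
    return y - a * np.exp(b * x)

def jacobian(p):
    a, b = p
    e = np.exp(b * x)
    return np.column_stack([-e, -a * x * e])   # d(residual)/d(a, b)

p = np.array([1.0, 1.0])               # initial parameter guess
lam = 1e-2                             # damping factor λ
for _ in range(50):
    r, J = residuals(p), jacobian(p)
    step = np.linalg.solve(J.T @ J + lam * np.eye(2), J.T @ r)
    p_new = p - step
    if (residuals(p_new) ** 2).sum() < (r ** 2).sum():
        p, lam = p_new, lam * 0.5      # accept the step, reduce damping
    else:
        lam *= 2.0                     # reject the step, increase damping
```

Large λ makes the step behave like gradient descent; small λ recovers fast Gauss-Newton convergence near the solution.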
In this paper, a cognitive system based on a nonlinear neural controller and an intelligent algorithm is proposed to guide an autonomous mobile robot during continuous path-tracking and to navigate around solid obstacles. The goal of the proposed structure is to plan and track the reference path equation for the autonomous mobile robot in a mining environment, avoiding obstacles and reaching the target position by using intelligent optimization algorithms. Particle Swarm Optimization (PSO) and Artificial Bee Colony (ABC) algorithms are used to find solutions to the mobile robot navigation problems in the mine by searching for optimal paths and finding the reference path equation of the optimal
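A minimal PSO sketch of the search idea above: the swarm looks for the 2-D point minimizing a simple cost surface (squared distance to a goal). A real planner would score path-equation parameters against obstacle and length terms instead; the cost function, goal, and swarm settings here are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
goal = np.array([3.0, -2.0])

def cost(p):                           # stand-in for a path-cost function
    return np.sum((p - goal) ** 2, axis=-1)

n, dims, iters = 20, 2, 100
pos = rng.uniform(-5, 5, (n, dims))    # initial particle positions
vel = np.zeros((n, dims))
pbest = pos.copy()                     # each particle's best-so-far position
gbest = pbest[cost(pbest).argmin()]    # swarm's best-so-far position

w, c1, c2 = 0.7, 1.5, 1.5              # inertia and acceleration weights
for _ in range(iters):
    r1, r2 = rng.random((n, dims)), rng.random((n, dims))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    better = cost(pos) < cost(pbest)   # update personal bests
    pbest[better] = pos[better]
    gbest = pbest[cost(pbest).argmin()]  # update the global best
```

The ABC algorithm mentioned above would explore the same cost surface with employed/onlooker/scout bees instead of velocity updates.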