Machine learning offers significant advantages for many problems in the oil and gas industry, especially for resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. The workflow methodology is clarified and comprehensive models are presented in this study. The purpose of this study is to provide a more robust technique for predicting permeability; previous studies on the Bazirgan field have attempted to do so, but their estimates have been vague, and the methods they present are outdated and lack the rigor required for reliable permeability computation.
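As a minimal sketch of this kind of workflow, a tree-ensemble regressor can map conventional log responses to core-measured permeability. The feature names (GR, RHOB, NPHI, DT) and the synthetic data below are placeholder assumptions, not the Bazirgan dataset or the study's actual model.

```python
# Hedged sketch: regressing permeability from conventional log curves with
# a tree ensemble. Feature names (GR, RHOB, NPHI, DT) and the synthetic
# data are placeholders, not the study's data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 4))                 # stand-ins for GR, RHOB, NPHI, DT
k = np.exp(1.5 * X[:, 2] - 0.8 * X[:, 1] + rng.normal(0, 0.3, n))  # permeability, mD

X_tr, X_te, k_tr, k_te = train_test_split(X, k, test_size=0.2, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X_tr, np.log(k_tr))               # permeability is typically log-distributed
pred = np.exp(model.predict(X_te))
print("held-out R^2 on log-permeability:", r2_score(np.log(k_te), np.log(pred)))
```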
This study aimed to determine the degree of Iraqi women's contribution to development from the perspective of faculty members of the Jadriya complex at the University of Baghdad, and the impact of demographic variables (gender, college, place of residence, academic level) on that contribution. The study population consisted of some faculty members of the Jadriya complex at the University of Baghdad for the academic year 2012/2013. The study followed a descriptive field approach together with the analytical method; the sample consisted of (250) teaching staff from the College of Science for Girls and the College of Education for Girls. To achieve the objective of the study, a questionnaire was developed and its validity was verified.
This paper investigated the treatment of textile wastewater polluted with aniline blue (AB) by an electrocoagulation process using stainless steel mesh electrodes in a horizontal arrangement. The experimental design applied response surface methodology (RSM) to derive a mathematical model by adjusting the current density (4-20 mA/cm2), distance between electrodes (0.5-3 cm), salt concentration (50-600 mg/l), initial dye concentration (50-250 mg/l), pH value (2-12) and experimental time (5-20 min). The results showed that time is the most important parameter affecting the performance of the electrocoagulation system. Maximum removal efficiency (96%) was obtained at a current density of 20 mA/cm2 and a distance between electrodes of …
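To illustrate the RSM step, a second-order (quadratic) response-surface model can be fit to the coded design factors. The design matrix and responses below are synthetic placeholders, not the paper's measurements.

```python
# Hedged sketch: fitting a quadratic response-surface model for removal
# efficiency as a function of the six design factors. Data are synthetic.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
# Hypothetical design matrix: 30 runs x 6 factors, coded to [-1, 1]
# (current density, electrode gap, salt conc., dye conc., pH, time)
X = rng.uniform(-1, 1, size=(30, 6))
# Placeholder responses; a real study would use the measured removal (%)
y = 80 + 10 * X[:, 5] - 5 * X[:, 1] ** 2 + rng.normal(0, 1, 30)

quad = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(quad.fit_transform(X), y)
# The fitted coefficients define the usual RSM quadratic surface, from
# which factor significance and optima can be examined.
print(model.intercept_, model.coef_[:6])
```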
A skip list data structure is essentially a probabilistic simulation of a binary search tree. Skip list algorithms are simpler, faster and use less space; conceptually, the data structure uses parallel sorted linked lists. Searching in a skip list is faster than searching in a regular sorted linked list. Because a skip list is a two-dimensional data structure, it is implemented using a two-dimensional network of nodes with four pointers. The search, insert and delete operations each take expected time of up to O(log n). The skip list can be modified to implement the order-statistic operations RANK and SEARCH-BY-RANK while maintaining the same expected time. Keywords: skip list, parallel linked list, randomized algorithm, rank.
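A minimal sketch of a classic randomized skip list (search and insert only), with expected O(log n) time per operation; this is a generic illustration, not the paper's exact four-pointer implementation.

```python
# Hedged sketch of a randomized skip list: each node is promoted one level
# with probability P, so higher levels form progressively sparser
# "express lanes" over the base sorted list.
import random

MAX_LEVEL = 16
P = 0.5  # promotion probability

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)  # one pointer per level

class SkipList:
    def __init__(self):
        self.head = Node(None, MAX_LEVEL)
        self.level = 0  # highest level currently in use

    def _random_level(self):
        lvl = 0
        while random.random() < P and lvl < MAX_LEVEL:
            lvl += 1
        return lvl

    def search(self, key):
        node = self.head
        for i in range(self.level, -1, -1):        # descend level by level
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]             # skip ahead on this level
        node = node.forward[0]
        return node is not None and node.key == key

    def insert(self, key):
        update = [self.head] * (MAX_LEVEL + 1)     # rightmost node per level
        node = self.head
        for i in range(self.level, -1, -1):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = Node(key, lvl)
        for i in range(lvl + 1):                   # splice into each level
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new
```

For example, after `sl = SkipList()` and inserting a few keys, `sl.search(key)` reports membership in expected O(log n) time.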
The intensity distribution of comet ISON C/2013 is studied by taking its histogram. This distribution reveals four distinct regions related to the background, tail, coma and nucleus. One-dimensional temperature-distribution fitting is achieved using two mathematical equations related to the coordinates of the comet's center. The quiver plot of the comet's intensity gradient shows clearly that the arrows point toward the maximum intensity of the comet.
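A minimal sketch of the two image-analysis steps described, using a synthetic Gaussian "coma" on a noisy background as a stand-in since the real comet frame is not available here.

```python
# Hedged sketch: intensity histogram (to separate background/tail/coma/
# nucleus by thresholds between its modes) and a quiver plot of the
# intensity gradient. The image is a synthetic placeholder.
import numpy as np
import matplotlib.pyplot as plt

yy, xx = np.mgrid[0:256, 0:256]
img = 200 * np.exp(-((xx - 128) ** 2 + (yy - 128) ** 2) / (2 * 20 ** 2))
img += np.random.default_rng(0).normal(10, 2, img.shape)

plt.figure()
plt.hist(img.ravel(), bins=256)
plt.xlabel("intensity"); plt.ylabel("pixel count")

# Gradient field: arrows point "uphill", i.e. toward the intensity
# maximum, as the abstract describes.
gy, gx = np.gradient(img)
s = 12  # thin the arrow field for readability
Y, X = np.mgrid[0:256:s, 0:256:s]
plt.figure()
plt.quiver(X, Y, gx[::s, ::s], gy[::s, ::s])
plt.gca().invert_yaxis()  # image coordinates
plt.show()
```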
This paper proposes two hybrid feature subset selection approaches based on the combination (union or intersection) of supervised and unsupervised filter approaches before using a wrapper, aiming to obtain low-dimensional features with high accuracy, good interpretability and low time consumption. Experiments with the proposed hybrid approaches were conducted on seven high-dimensional feature datasets. The classifiers adopted are support vector machine (SVM), linear discriminant analysis (LDA), and K-nearest neighbour (KNN). Experimental results demonstrate the advantages and usefulness of the proposed methods for feature subset selection in high-dimensional space in terms of the number of selected features and time spent.
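A minimal sketch of the hybrid scheme: the union (or intersection) of a supervised filter and an unsupervised filter feeds a wrapper stage. The specific filters here (ANOVA F-score, variance threshold) and the dataset are illustrative assumptions, not necessarily the paper's choices.

```python
# Hedged sketch: combine a supervised filter (ANOVA F-score) and an
# unsupervised filter (variance threshold) by union, then run a wrapper
# (sequential forward selection with a linear SVM) on the reduced space.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import (SelectKBest, f_classif,
                                       VarianceThreshold,
                                       SequentialFeatureSelector)
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

sup = SelectKBest(f_classif, k=15).fit(X, y)       # supervised filter
unsup = VarianceThreshold(threshold=0.1).fit(X)    # unsupervised filter
idx_sup = set(np.where(sup.get_support())[0])
idx_unsup = set(np.where(unsup.get_support())[0])

union = sorted(idx_sup | idx_unsup)                # use & for intersection
X_filtered = X[:, union]

wrapper = SequentialFeatureSelector(SVC(kernel="linear"),
                                    n_features_to_select=5,
                                    direction="forward")
wrapper.fit(X_filtered, y)
selected = [union[i] for i in np.where(wrapper.get_support())[0]]
print("final feature indices:", selected)
```

The union favours recall of potentially useful features before the expensive wrapper stage, while the intersection variant trades coverage for a smaller, faster wrapper search.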
The gravity method measures relatively small variations in the Earth's gravitational field caused by lateral variations in rock density. In the current research, a new technique is applied to the previous Bouguer map of gravity surveys (conducted during 1940-1950) by selecting certain areas in the south-western desert of Iraqi territory within the administrative boundaries of the Najaf and Anbar provinces. Based on the theory of gravity inversion, in which gravity values reflect density-contrast variations with depth, gravity data inversion can be utilized to calculate models of density and velocity from four selected depth slices: 9.63 km, 1.1 km, 0.682 km and 0.407 km.
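For context on how a gravity anomaly relates to a density contrast at depth, the infinite Bouguer slab approximation is the standard first-order relation, g = 2πGΔρh. The sketch below is a generic illustration of that relation, not the inversion scheme used in the study.

```python
# Hedged sketch: infinite-slab (Bouguer) approximation of the gravity
# effect of a density contrast. Generic textbook relation only.
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def bouguer_slab_anomaly(density_contrast_kg_m3, thickness_m):
    """Gravity effect (in mGal) of an infinite horizontal slab."""
    g_si = 2 * math.pi * G * density_contrast_kg_m3 * thickness_m  # m/s^2
    return g_si * 1e5  # 1 mGal = 1e-5 m/s^2

# Example: a 400 m thick slab with a +200 kg/m^3 contrast
print(bouguer_slab_anomaly(200.0, 400.0))  # ~3.35 mGal
```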
This study examined some of the criteria used to determine the form of river basins and exposed the need to modify some of their limitations. In particular, the generalization of the elongation and circularity ratio criterion, which is defined on a range of (0-1), was modified: that range by itself only characterizes a basin as broadly elongated or rounded, so the ratio was refined to be more detailed and accurate in assigning a specific form to the basin rather than only a general characteristic. Accordingly, a standard was reached for each basin form based on the results of the elongation and circularity ratios: circular is (1-0.8), square is (0.8-0.6), and the blade or oval form is (0.6-0…
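The two ratios underlying this scheme have standard definitions: Schumm's elongation ratio Re = (2/L)·sqrt(A/π) and Miller's circularity ratio Rc = 4πA/P². A minimal sketch applying the ranges quoted above follows; the classification stops where the source text is truncated, and the example basin values are hypothetical.

```python
# Hedged sketch: standard elongation and circularity ratios, classified
# with the ranges quoted in the abstract. The scale below the square
# class is truncated in the source, so it is left open here.
import math

def elongation_ratio(area_km2, basin_length_km):
    """Schumm's Re: diameter of the equal-area circle over basin length."""
    return (2.0 / basin_length_km) * math.sqrt(area_km2 / math.pi)

def circularity_ratio(area_km2, perimeter_km):
    """Miller's Rc: basin area over the area of the equal-perimeter circle."""
    return 4.0 * math.pi * area_km2 / perimeter_km ** 2

def classify(ratio):
    if ratio > 0.8:
        return "circular"             # (1-0.8)
    if ratio > 0.6:
        return "square"               # (0.8-0.6)
    return "blade/oval or below"      # range truncated in the source

print(classify(circularity_ratio(120.0, 52.0)))  # hypothetical basin
```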