Permeability data are of major importance and must be handled carefully in all reservoir simulation studies. Their importance increases in mature oil and gas fields because specific improved-recovery methods are sensitive to permeability. However, the industry holds a huge store of air permeability measurements against only a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation to convert the air permeability data conventionally measured during laboratory core analysis into liquid permeability. This correlation provides a feasible estimate in cases of loose and poorly consolidated formations, or when old cores are unavailable for liquid permeability measurement. Moreover, the conversion formula makes better use of the large amount of historical air permeability data obtained through routine core analysis for further use in reservoir and geological modeling studies.
A comparative analysis shows that the suggested conversion formula gives highly accurate and consistent results over a wide range of permeability values.
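The study's own correlation is not reproduced here. As a hedged illustration of the general idea, the classic Klinkenberg relation k_g = k_L(1 + b/p_m) links air permeability measured at mean pore pressure p_m to the equivalent liquid permeability k_L; plotting k_g against 1/p_m and extrapolating to infinite pressure recovers k_L. A minimal sketch, assuming Klinkenberg-type behavior and entirely hypothetical measurements:

```python
import numpy as np

def klinkenberg_liquid_perm(k_gas_md, p_mean_atm):
    """Estimate equivalent liquid permeability (mD) from air-permeability
    measurements at several mean pore pressures, using the classic
    Klinkenberg relation k_g = k_L * (1 + b / p_m).

    k_g is linear in 1/p_m; the intercept at 1/p_m -> 0 (infinite
    pressure, no gas slippage) is the liquid permeability k_L.
    """
    inv_p = 1.0 / np.asarray(p_mean_atm, dtype=float)
    slope, intercept = np.polyfit(inv_p, np.asarray(k_gas_md, dtype=float), 1)
    k_liquid = intercept        # mD, slippage-free permeability
    b = slope / k_liquid        # Klinkenberg slip factor (atm)
    return k_liquid, b

# Hypothetical air-permeability readings (mD) at mean pressures (atm):
k_gas = [45.6, 34.8, 31.2, 28.32]
p_mean = [1.0, 2.0, 3.0, 5.0]
k_L, b = klinkenberg_liquid_perm(k_gas, p_mean)
```

The straight-line fit is the standard laboratory workflow; the study's proposed correlation would replace this step with its own formula.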
An experiment was carried out in the fields of the College of Agriculture, University of Baghdad, during the spring and autumn of 2015, using a randomized complete block design with three replications. In the first season, hybridization was established among three pure cultivars of cowpea (Vigna unguiculata L.): Ramshorn, California Blackeye, and Rahawya, in a full diallel cross according to Griffing's Method 1, fixed model (3 parents + 3 diallel hybrids + 3 reciprocal hybrids), and a comparison experiment was conducted in the autumn season. The statistical analysis showed significant differences among the parents and their hybrids for all the studied characters. Parent 1 was the highest for root-nodule number, leaf number, pod …
The Gumbel distribution has been treated with great care by researchers and statisticians. There are traditional methods to estimate its two parameters, known as maximum likelihood, the method of moments, and, more recently, the resampling method called the jackknife. However, these methods suffer from some mathematical difficulties when solved analytically. Accordingly, there are other non-traditional methods, like the principle of nearest neighbors used in computer science, especially artificial intelligence algorithms, including the genetic algorithm, the artificial neural network algorithm, and others that may be classified as meta-heuristic methods. Moreover, this principle of nearest neighbors has useful statistical featu …
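As a sketch of the classical baseline the abstract contrasts against, the method-of-moments estimators for the Gumbel location mu and scale beta follow directly from the sample mean and variance: Var = (pi^2/6) beta^2 and mean = mu + gamma*beta, with gamma the Euler-Mascheroni constant. The simulated data below are illustrative only:

```python
import math
import random

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def gumbel_moments(sample):
    """Method-of-moments estimates (mu, beta) for a Gumbel (type-I
    extreme value) distribution, from the sample mean and variance."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi
    mu = mean - EULER_GAMMA * beta
    return mu, beta

# Illustrative check: simulate Gumbel(mu=10, beta=2) via the inverse CDF
random.seed(0)
data = [10.0 - 2.0 * math.log(-math.log(random.random())) for _ in range(5000)]
mu_hat, beta_hat = gumbel_moments(data)
```

Maximum likelihood, by contrast, has no closed form for this distribution, which is the analytical difficulty the abstract refers to.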
In recent years, many researchers have developed methods to estimate the self-similarity and long-memory parameter best known as the Hurst parameter. In this paper, we set up a comparison between nine different methods. Most of them use the slope of deviations to estimate the Hurst parameter, like rescaled range (R/S), Aggregate Variance (AV), and Absolute Moments (AM), while some depend on filtration techniques, like Discrete Variations (DV), Variance versus level using wavelets (VVL), and Second-Order Discrete Derivative using Wavelets (SODDW). The comparison was set up as a simulation study to find the most efficient method through MASE. The results of the simulation experiments showed that the performance of the meth …
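Several of the estimators named above fit a slope on a log-log plot. A minimal sketch of the rescaled-range (R/S) variant, with illustrative window sizes and a simple non-overlapping averaging scheme (these choices are assumptions, not the paper's exact setup):

```python
import math
import random

def rs_statistic(series):
    """Rescaled range R/S of one window: range of cumulative
    mean-adjusted deviations, divided by the standard deviation."""
    n = len(series)
    mean = sum(series) / n
    cum, cums = 0.0, []
    for x in series:
        cum += x - mean
        cums.append(cum)
    r = max(cums) - min(cums)
    s = math.sqrt(sum((x - mean) ** 2 for x in series) / n)
    return r / s if s > 0 else 0.0

def hurst_rs(series, window_sizes=(16, 32, 64, 128, 256)):
    """Estimate the Hurst exponent H as the slope of log(mean R/S)
    versus log(window size), over non-overlapping windows."""
    xs, ys = [], []
    for w in window_sizes:
        chunks = [series[i:i + w] for i in range(0, len(series) - w + 1, w)]
        rs = [rs_statistic(c) for c in chunks]
        xs.append(math.log(w))
        ys.append(math.log(sum(rs) / len(rs)))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Sanity check on white noise, whose theoretical H is 0.5
# (small-sample R/S estimates are known to be biased slightly upward):
random.seed(1)
noise = [random.gauss(0, 1) for _ in range(4096)]
H = hurst_rs(noise)
```

The wavelet-based estimators (VVL, SODDW) replace the raw deviations with filtered coefficients but keep the same slope-fitting structure.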
Abstract:
Research Topic: Ruling on the sale of big data
Its objectives: a statement of what big data is, its importance, its sources, and its ruling.
Methodology: inductive, comparative, and critical.
Among the most important results: big data is valuable property that may not be infringed upon, and its sale is permissible so long as it does not include the data of users who have not consented to its sale.
Recommendation: follow-up studies dealing with the rulings on this issue.
Subject Terms
Ruling, Sale, Big Data, Jurists' Sayings
Background: The synthesis and characterization of novel liquid crystalline compounds have garnered significant attention due to their potential applications in biomedical sciences, including drug delivery systems, biosensing, and diagnostic tools. This study focuses on synthesizing and characterizing new thiazolothiadiazole-based liquid crystals and evaluating their mesophase properties. Methods: A series of novel compounds containing 5H-thiazolo[4,3-b][1,3,4]thiadiazole units were synthesized via multi-step chemical reactions. The synthesis involved the reaction of chloroethyl acetate with 4-hydroxybenzaldehyde to yield an aldehyde intermediate, followed by subsequent transformations using hydrazine hydrate, ethyl acetoacetate, and 1,2 …
The research aims to measure the efficiency of health service quality in the province of Karbala in 2006 using Data Envelopment Analysis (DEA) models. According to these models, the degree of efficiency ranges between zero and one. We estimate scale efficiency for two orientations: input-oriented and output-oriented.
The results showed that, according to the input-oriented index, the average level of scale efficiency in the province of Karbala is (0.975), while the output-oriented efficiency index averages (0.946).
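The abstract does not specify which DEA formulation was used; as a hedged sketch, the standard input-oriented CCR envelopment model solves one small linear program per decision-making unit (DMU): minimize theta subject to the composite peer using no more than theta times the DMU's inputs while producing at least its outputs. The data below are hypothetical, not the Karbala figures:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_input_efficiency(X, Y):
    """Input-oriented CCR (constant returns to scale) DEA efficiencies.
    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs). Returns theta per DMU."""
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    n, m = X.shape
    s = Y.shape[1]
    thetas = []
    for o in range(n):
        # decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.zeros(n + 1)
        c[0] = 1.0                                  # minimize theta
        A_ub, b_ub = [], []
        for i in range(m):                          # sum_j lam_j x_ij <= theta x_io
            A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
            b_ub.append(0.0)
        for r in range(s):                          # sum_j lam_j y_rj >= y_ro
            A_ub.append(np.concatenate(([0.0], -Y[:, r])))
            b_ub.append(-Y[o, r])
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, None)] * (n + 1))
        thetas.append(res.x[0])
    return thetas

# Hypothetical single-input, single-output units (e.g. staff -> patients served)
X = [[2.0], [4.0], [8.0]]
Y = [[2.0], [2.0], [4.0]]
eff = ccr_input_efficiency(X, Y)
```

A theta of 1.0 marks a unit on the efficient frontier; smaller values show the proportional input reduction needed to reach it, which is how averages like 0.975 are interpreted.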
Survival analysis is widely applied to data describing the lifetime of an item until the occurrence of an event of interest, such as death or another event under study. The purpose of this paper is to use a dynamic approach in the deep-learning neural network method, in which a dynamic neural network is built to suit the nature of discrete survival data with time-varying effects. This neural network is trained with the Levenberg-Marquardt (L-M) algorithm, and the method is called the Proposed Dynamic Artificial Neural Network (PDANN). A comparison was then made with another method that depends entirely on the Bayesian methodology, called the Maximum A Posteriori (MAP) method. This method was carried out using numerical algorithms re …
In this study, SnO2 nanoparticles were prepared from low-cost tin chloride (SnCl2.2H2O) and ethanol, with the addition of an ammonia solution, by the sol-gel method, one of the lowest-cost and simplest techniques. The SnO2 nanoparticles were dried in a drying oven at 70°C for 7 hours and then fired in an oven at 200°C for 24 hours. The structural, material, morphological, and optical properties of the synthesized SnO2 nanoparticles were studied using X-ray diffraction. The Scherrer expression was used to compute the nanoparticle sizes from the X-ray diffraction data, and the results needed closer scrutiny. The micro-strain indicates the broadening of the diffraction peaks for nano …
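The Scherrer expression mentioned above relates X-ray peak broadening to crystallite size, D = K·lambda / (beta·cos theta), where beta is the peak's full width at half maximum in radians. A minimal sketch with the conventional shape factor K = 0.9 and Cu-Kalpha wavelength; the reflection angle and FWHM below are hypothetical, not the study's measured values:

```python
import math

def scherrer_size_nm(fwhm_deg, two_theta_deg, wavelength_nm=0.15406, K=0.9):
    """Crystallite size D = K * lambda / (beta * cos(theta)), with the
    FWHM beta converted from degrees to radians and theta = 2theta / 2."""
    beta = math.radians(fwhm_deg)
    theta = math.radians(two_theta_deg / 2.0)
    return K * wavelength_nm / (beta * math.cos(theta))

# Hypothetical SnO2-like reflection: 2theta = 26.6 deg, FWHM = 0.8 deg
D = scherrer_size_nm(0.8, 26.6)   # crystallite size in nm
```

Note that Scherrer sizes ignore micro-strain broadening, which is why the abstract flags the strain contribution as needing separate treatment (e.g. a Williamson-Hall analysis).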
Air pollution refers to the release of pollutants into the air that are detrimental to human health and to the planet as a whole. In this research, concentrations of air pollutants such as Total Suspended Particles (TSP), carbon monoxide (CO), and carbon dioxide (CO2), along with meteorological parameters including temperature (T), relative humidity (RH), and wind speed and direction, were measured in Baghdad city at (22) measuring stations located in different regions, classified into industrial, commercial, and residential stations. Using the ArcGIS program (spatial analysis), different maps were prepared for the distribution of the different pollutant …