Net pay is one of the most important parameters used in determining the initial oil in place of a reservoir. It can be delineated using limiting values of the reservoir's petrophysical properties; these limiting values are known as cutoffs. This paper provides insight into the application of the regression-line method for estimating porosity, clay-volume, and water-saturation cutoff values in the Mishrif reservoir of the Missan oil fields. The study included 29 wells distributed over seven oil fields: Halfaya, Buzurgan, Dujaila, Noor, Fauqi, Amara, and Kumait.
This study was carried out by applying two types of linear regression: least squares and reduced major axis (RMA) regression.
The Mishrif formation was
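As a rough illustration of the two regression types mentioned above, the following sketch fits both a least-squares line and a reduced major axis line to synthetic data; the variable names and values are illustrative placeholders, not data from the Mishrif study.

```python
import numpy as np

def ols_slope_intercept(x, y):
    """Ordinary least squares fit of y on x."""
    slope = np.cov(x, y, bias=True)[0, 1] / np.var(x)
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

def rma_slope_intercept(x, y):
    """Reduced major axis regression: slope is the ratio of standard
    deviations, signed by the correlation between x and y."""
    r = np.corrcoef(x, y)[0, 1]
    slope = np.sign(r) * y.std() / x.std()
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

# Illustrative (synthetic) data standing in for, e.g., core vs. log porosity
rng = np.random.default_rng(0)
x = rng.uniform(0.05, 0.30, 50)                  # e.g. core porosity, v/v
y = 0.9 * x + 0.01 + rng.normal(0, 0.02, 50)     # e.g. log porosity with scatter

print("OLS :", ols_slope_intercept(x, y))
print("RMA :", rma_slope_intercept(x, y))
```

Because RMA treats the scatter in both variables symmetrically, its slope is typically steeper than the OLS slope when both variables carry measurement error, which is why the two lines yield different cutoff values.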
Variation in the numbers of pectoral fin spines and rays, pelvic fin rays, gill rakers on the first gill arch, anal fin rays, and vertebrae of Silurus triostegus Heckel was examined in specimens from 16 localities spanning its entire distribution range in the Tigris, Euphrates, and Shatt al-Arab rivers in Iraq. The mean number of each of the six meristic traits increases toward higher latitudes, with maximum values in the north of Iraq and minimum values in the south. Based on cluster analysis and PCA, the Mesopotamian river samples were clearly separated into three distinct groups. The upper Tigris populations were isolated from the middle and southern populations of this river and from those of
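The abstract does not detail the multivariate workflow; the sketch below shows one conventional way such an analysis can be set up (standardize the locality-by-trait matrix, run PCA, then hierarchically cluster the component scores). The data here are synthetic placeholders.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Hypothetical matrix: rows = 16 localities, columns = mean counts of the six
# meristic traits (pectoral spines/rays, pelvic rays, gill rakers, anal rays,
# vertebrae). Values are illustrative only.
rng = np.random.default_rng(1)
X = rng.normal(loc=[1.0, 14.0, 12.0, 14.0, 80.0, 58.0], scale=0.5, size=(16, 6))

# Standardize, then PCA via SVD of the centered/scaled matrix
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
scores = U * s                      # principal-component scores per locality

# Hierarchical clustering (UPGMA) on the first two component scores
tree = linkage(pdist(scores[:, :2]), method="average")
groups = fcluster(tree, t=3, criterion="maxclust")
print(groups)                       # cluster membership for each locality
```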
The Gumbel distribution has been treated with great care by researchers and statisticians. Traditional methods for estimating its two parameters include maximum likelihood, the method of moments and, more recently, the jackknife resampling method. However, these methods suffer from mathematical difficulties when solved analytically. Accordingly, there are other, non-traditional methods, such as the nearest-neighbors principle used in computer science, especially in artificial intelligence algorithms, including the genetic algorithm, the artificial neural network algorithm, and others that may be classified as meta-heuristic methods. Moreover, this nearest-neighbors principle has useful statistical features
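As a brief illustration of two of the traditional estimators named above, the sketch below computes method-of-moments and numerical maximum-likelihood estimates of the two Gumbel parameters on a simulated sample; the sample itself is illustrative.

```python
import numpy as np
from scipy import stats

EULER_GAMMA = 0.5772156649

def gumbel_moments(x):
    """Method-of-moments estimates of the Gumbel (maximum) location mu and
    scale beta, using Var = pi^2 beta^2 / 6 and Mean = mu + gamma * beta."""
    beta = np.std(x, ddof=1) * np.sqrt(6) / np.pi
    mu = np.mean(x) - EULER_GAMMA * beta
    return mu, beta

# Illustrative sample drawn from a Gumbel distribution
rng = np.random.default_rng(2)
x = rng.gumbel(loc=10.0, scale=2.0, size=200)

print("Moments:", gumbel_moments(x))
print("MLE    :", stats.gumbel_r.fit(x))   # numerical maximum likelihood (loc, scale)
```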
A new approach to baud time (or baud rate) estimation for a random binary signal is presented. This approach uses the spectrum of the signal after nonlinear processing, so that the estimation error can be reduced simply by increasing the number of processed samples rather than increasing the sampling rate. The spectrum of the new signal is shown to give an accurate estimate of the baud time when there is no a priori information or any restrictive assumptions. The performance of the estimator for random binary square waves perturbed by white Gaussian noise and ISI is evaluated and compared with that of the conventional zero-crossing-detector estimator.
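The abstract does not specify the nonlinearity; the sketch below uses one common choice (differentiate and rectify), which produces spectral lines at the baud rate and its harmonics, and then reads off the lowest strong line. It is a generic illustration, not necessarily the authors' exact processing.

```python
import numpy as np

def estimate_baud(signal, fs):
    """Estimate the baud rate of a random binary waveform.
    Nonlinear step: differentiate and rectify, which turns symbol
    transitions into a pulse train with spectral lines at the baud
    rate and its harmonics; the lowest strong line is returned."""
    edges = np.abs(np.diff(signal))
    spec = np.abs(np.fft.rfft(edges - edges.mean()))
    freqs = np.fft.rfftfreq(edges.size, d=1.0 / fs)
    strong = np.flatnonzero(spec > 0.6 * spec[1:].max())
    strong = strong[strong > 0]               # ignore the DC bin
    return freqs[strong[0]]

# Synthetic NRZ wave: 10 samples per symbol at fs = 1000 Hz -> 100 baud
rng = np.random.default_rng(3)
bits = rng.integers(0, 2, 500)
wave = np.repeat(2.0 * bits - 1.0, 10)
wave += rng.normal(0.0, 0.3, wave.size)       # additive white Gaussian noise

print("Estimated baud rate (Hz):", estimate_baud(wave, fs=1000.0))  # ~100
```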
In this paper, we introduce three robust fuzzy estimators of a location parameter based on Buckley's approach in the presence of outliers. These estimates were compared using the variance of fuzzy numbers as the criterion, and all of them outperformed Buckley's estimate. Among them, the fuzzy median was best for small and medium sample sizes, while the fuzzy trimmed mean was best for large sample sizes.
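Buckley's fuzzy-number machinery is not reproduced here; as a crisp (non-fuzzy) analogue, the sketch below simply contrasts the sample mean with the median and the trimmed mean on contaminated data, to show the robustness idea the abstract relies on.

```python
import numpy as np
from scipy import stats

def trimmed_mean(x, prop=0.1):
    """Trimmed mean: drop the lowest and highest prop fraction, then average."""
    return stats.trim_mean(x, proportiontocut=prop)

# Illustrative contaminated sample: N(5, 1) with a few gross outliers
rng = np.random.default_rng(4)
x = np.concatenate([rng.normal(5.0, 1.0, 95), rng.normal(30.0, 1.0, 5)])

print("mean         :", np.mean(x))        # pulled toward the outliers
print("median       :", np.median(x))      # robust
print("trimmed mean :", trimmed_mean(x))   # robust
```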
The currency auction is one of the monetary policy tools created after 2003 in order to keep pace with the changes that monetary and financial policies would witness from financial openness and expectations of high levels of liquidity following the international economic restrictions. It is necessary to re-evaluate the work of the currency auction from time to time and to observe its efficiency in adjusting the exchange rate and its reflection on the general level of prices, as one of the objectives of its inception. The analytical part of the study confirmed that the currency auction for selling currency played a major role in adjusting the exchange rate and controlling inflation levels, owing to the market's dependence
Semi-parametric regression models have been studied in a variety of applications and scientific fields because of their high flexibility in dealing with problematic data: they combine ease of interpretation of the parametric part with the flexibility of the non-parametric part. The response variable or the explanatory variables can contain outliers, and the OLS approach is sensitive to outliers. To address this issue, robust (resistant) methods were used, which are less sensitive to the presence of outlying values in the data. This study aims to estimate the partial regression model using a robust estimation method with the wavelet
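The wavelet-based non-parametric part is beyond a short sketch; as an assumption-laden illustration of the robust idea for the parametric part only, the code below contrasts OLS with Huber M-estimation (statsmodels RLM) on data containing outliers.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic data: a linear parametric part with outliers injected into y;
# the wavelet-estimated non-parametric part of the model is omitted here.
rng = np.random.default_rng(5)
n = 200
x = rng.normal(size=n)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.5, n)
y[rng.choice(n, 10, replace=False)] += 10.0           # gross outliers

X = sm.add_constant(x)
ols = sm.OLS(y, X).fit()                               # sensitive to outliers
rlm = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit()   # robust M-estimation

print("OLS slope   :", ols.params[1])
print("Robust slope:", rlm.params[1])
```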
In this article, the probability density function of the Rayleigh distribution is derived and estimated using the ordinary least squares estimator method and the ranked set estimator method. An interval is then constructed for the scale parameter of the Rayleigh distribution, and a new method is used for the fuzzy scale parameter. After that, the survival and hazard functions are constructed for two ranking functions to show which one is best.
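As a sketch of how an ordinary-least-squares estimator of the Rayleigh scale can be built from the linearized CDF (an assumption about the exact setup; the ranked set and fuzzy interval parts are not reproduced here):

```python
import numpy as np

def rayleigh_scale_ols(x):
    """OLS estimate of the Rayleigh scale sigma from the linearized CDF:
    -ln(1 - F(x)) = x^2 / (2 sigma^2), using median-rank plotting positions."""
    xs = np.sort(x)
    n = xs.size
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # median-rank approximation
    y = -np.log(1.0 - F)
    slope = np.sum(xs**2 * y) / np.sum(xs**4)     # no-intercept least squares
    return 1.0 / np.sqrt(2.0 * slope)

rng = np.random.default_rng(6)
sample = rng.rayleigh(scale=3.0, size=300)
print("OLS estimate of sigma:", rayleigh_scale_ols(sample))   # close to 3.0
```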
In this paper, the method of estimating the variation of the Zenith Path Delay (ZPD) is illustrated and evaluated using the Real Time Kinematic Differential Global Positioning System (RTK-DGPS). GPS provides a relative method to remotely sense atmospheric water vapor in any weather condition, and the GPS signal delay in the atmosphere can be expressed as the ZPD. To evaluate the results, four rover points were chosen in the University of Baghdad campus, together with a fixed base point. For each rover position, 155 days of coordinate measurements were collected to support the results. Several models and mathematical calculations were used to extract the ZPD in the Matlab environment. The results show that the ZPD value
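The specific delay models are not named in the abstract; as an assumption, the sketch below shows the widely used Saastamoinen formula for the zenith hydrostatic delay, one common ingredient of ZPD calculations.

```python
import numpy as np

def saastamoinen_zhd(pressure_hpa, lat_deg, height_m):
    """Zenith hydrostatic delay (metres) from the Saastamoinen model.
    pressure_hpa: surface pressure; lat_deg: latitude; height_m: station height."""
    f = 1.0 - 0.00266 * np.cos(2.0 * np.radians(lat_deg)) - 0.00000028 * height_m
    return 0.0022768 * pressure_hpa / f

# Example: Baghdad-like latitude (~33.3 N), 35 m height, 1010 hPa
print(round(saastamoinen_zhd(1010.0, 33.3, 35.0), 4), "m")  # roughly 2.3 m
```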
In this research, we assume that the number of particles emitted by time t follows a Poisson distribution with parameter θt, where θ > 0 is the intensity of radiation. We conclude that the time of the first emission is exponentially distributed with parameter θ, while the time of the k-th emission (k = 2, 3, 4, …) is gamma distributed with parameters (k, θ). Real data were used to show that the Bayes estimator θ* of θ is more efficient than θ̂, the maximum likelihood estimator of θ, using the derived variances of both estimators as a statistical indicator of efficiency.
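A small simulation can illustrate this efficiency comparison; the gamma prior and squared-error loss below are assumptions (the abstract does not state them), so this is a sketch of the idea rather than the paper's exact derivation.

```python
import numpy as np

rng = np.random.default_rng(7)
theta_true, n, reps = 2.0, 20, 5000
a, b = 2.0, 1.0        # assumed Gamma(a, b) prior on theta (conjugate choice)

mle = np.empty(reps)
bayes = np.empty(reps)
for r in range(reps):
    t = rng.exponential(scale=1.0 / theta_true, size=n)  # inter-emission times
    mle[r] = n / t.sum()                                 # maximum likelihood
    bayes[r] = (n + a) / (t.sum() + b)                   # posterior mean

print("Var(MLE)  :", mle.var())
print("Var(Bayes):", bayes.var())    # typically smaller, shrunk by the prior
```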