Many carbonate reservoirs around the world show a tilted original oil-water contact (OOWC), which requires special consideration in the selection of capillary pressure curves and an understanding of the reservoir fluid distribution when initializing reservoir simulation models.
An analytical model for predicting the capillary pressure across the interface separating two immiscible fluids was derived from reservoir pressure transient analysis. The model captured the full interaction between the reservoir and aquifer fluids and the rock properties measured under downhole reservoir conditions.
This model retained the natural coupling of the oil reservoir with the aquifer zone and treated them as a composite system with explicitly defined regions; thus the exact solutions of the diffusivity equation could be applied explicitly to each region. The reservoir and aquifer zones were linked by a capillary transition zone that reflected the pressure difference across the free water level.
The superposition theorem was applied to perform this link across the free water level and to estimate the reflected aquifer pressure-drop behavior that holds the fluid contacts in their equilibrium positions.
The original oil-water contact positions generated by the proposed model were compared with data obtained from a carbonate oil field; the model results showed full agreement with the actual field data.
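The equilibrium described above, in which the capillary pressure across the free water level is balanced by the water-oil density contrast, can be sketched numerically. A minimal illustration of the standard relation Pc = (rho_w - rho_o) g h, with purely illustrative density and pressure values that are not taken from the field data in the paper:

```python
# Sketch: height of a fluid contact above the free water level (FWL)
# from capillary equilibrium, Pc = (rho_w - rho_o) * g * h.
# All numeric values below are illustrative assumptions, not field data.

G = 9.81  # gravitational acceleration, m/s^2

def contact_height(pc_pa, rho_w, rho_o):
    """Height (m) above the FWL at which capillary pressure pc_pa (Pa)
    is balanced by the water-oil density difference (kg/m^3)."""
    return pc_pa / ((rho_w - rho_o) * G)

# e.g. 5 kPa entry pressure, water 1050 kg/m^3, oil 850 kg/m^3
h = contact_height(pc_pa=5000.0, rho_w=1050.0, rho_o=850.0)
```

A laterally varying aquifer pressure drop shifts this balance point from place to place, which is how a tilted contact arises from the same capillary curve.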
This paper aims to determine the best estimation methods for the parameters of the Gumbel type-I distribution under the type-II censoring scheme. For this purpose, classical and Bayesian parameter estimation procedures are considered. Maximum likelihood estimators are used for the classical procedure, and their asymptotic distributions are derived. Since explicit solutions for the Bayesian estimators cannot be obtained, Markov chain Monte Carlo and Lindley techniques are employed to estimate the unknown parameters. In Bayesian analysis, it is very important to determine an appropriate combination of a prior distribution and a loss function. Therefore, two different
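The classical side of the abstract, maximum likelihood under type-II censoring, can be sketched as follows. This is a hedged illustration, not the paper's implementation: the sample is synthetic, the parameter grid and its bounds are arbitrary choices, and the censored log-likelihood uses the standard form in which only the r smallest of n observations are seen.

```python
import numpy as np

# Hedged sketch: ML estimation of the Gumbel (type-I) location mu and
# scale sigma under type-II censoring (only the r smallest of n order
# statistics observed). Synthetic data; grid search stands in for a
# proper numerical optimizer.

def neg_log_lik(params, x_obs, n):
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    r = len(x_obs)
    z = (np.sort(x_obs) - mu) / sigma
    # observed part: log f(x_i) = -log(sigma) - z_i - exp(-z_i)
    ll = np.sum(-np.log(sigma) - z - np.exp(-z))
    # censored part: the n - r unobserved values exceed x_(r)
    ll += (n - r) * np.log1p(-np.exp(-np.exp(-z[-1])))
    return -ll

rng = np.random.default_rng(0)
n = 100
full = rng.gumbel(loc=2.0, scale=1.5, size=n)
x_obs = np.sort(full)[:80]          # type-II censoring: keep smallest r = 80

# crude grid search over (mu, sigma); a real analysis would use Newton steps
mus = np.linspace(0.0, 4.0, 81)
sigmas = np.linspace(0.5, 3.0, 51)
best = min((neg_log_lik((m, s), x_obs, n), m, s)
           for m in mus for s in sigmas)
_, mu_hat, sigma_hat = best
```

With the true parameters (2.0, 1.5), the grid-search MLE lands close to them; the asymptotic distributions mentioned in the abstract would be obtained from the observed information at this maximum.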
Objective: To evaluate the functional outcomes after extended curettage and reconstruction using a combination of bone graft and bone cement (sandwich technique). Methodology: In this prospective case series, 16 skeletally mature patients with primary giant cell tumor around the knee were included. Patients with previous surgical treatment, malignant transformation, degenerative knee changes, or a pathological fracture at presentation were excluded. The tumor was excised, with bone graft filling the space beneath the articular cartilage, and a block of gel foam was placed over the cortical surface of the packed bone graft. The remaining cavity was filled with polymethylmethacrylate cement (sandwich) with or without internal fixation. The functional evaluation
In this paper, an algorithm for binary codebook design is used in the vector quantization technique to improve the acceptability of the absolute moment block truncation coding (AMBTC) method. Vector quantization (VQ) is used to compress the bitmap output of the first stage (AMBTC). The binary codebook is generated for many images by randomly choosing code vectors from a set of binary image vectors, and this codebook is then used to compress the bitmaps of all these images. The bitmap of an image is selected for compression with this codebook based on the criterion of the average bitmap replacement error (ABPRE). This approach is suitable for reducing bit rates
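The AMBTC stage that produces the bitmaps mentioned above can be sketched for a single block. This is a minimal illustration of standard AMBTC, not the paper's full pipeline: each block is reduced to two means plus a 1-bit-per-pixel bitmap, and it is that bitmap which the paper then compresses further with VQ.

```python
import numpy as np

# Minimal AMBTC sketch for one block (illustrative): encode as
# (bitmap, high mean, low mean), decode by substituting the two means.

def ambtc_block(block):
    m = block.mean()
    bitmap = block >= m                      # 1 bit per pixel
    hi = block[bitmap].mean() if bitmap.any() else m
    lo = block[~bitmap].mean() if (~bitmap).any() else m
    return bitmap, hi, lo

def ambtc_decode(bitmap, hi, lo):
    return np.where(bitmap, hi, lo)

block = np.array([[10, 12, 200, 210],
                  [11, 13, 205, 208],
                  [ 9, 14, 199, 207],
                  [12, 10, 201, 206]], dtype=float)
bitmap, hi, lo = ambtc_block(block)
rec = ambtc_decode(bitmap, hi, lo)
```

Note that the reconstruction preserves the block mean exactly; the perceptual loss lives entirely in the bitmap, which is why bitmap quality (and hence the ABPRE criterion) matters when a shared codebook replaces the per-block bitmap.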
In this research, some robust nonparametric methods were used to estimate the semiparametric regression model, and these methods were then compared using the MSE criterion over different sample sizes, variance levels, pollution (contamination) rates, and three different models. The methods are (S-LLS) S-estimation with local linear smoothing, (M-LLS) M-estimation with local linear smoothing, (S-NW) S-estimation with Nadaraya-Watson smoothing, and (M-NW) M-estimation with Nadaraya-Watson smoothing.
The results for the first model showed that the (S-LLS) method was the best for large sample sizes, while for small sample sizes the
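The Nadaraya-Watson smoother named in the method list above can be sketched briefly. This is a hedged illustration of the plain (non-robust) smoother on synthetic data; the Gaussian kernel, the bandwidth, and the test function are illustrative choices, not the paper's settings, and the paper's S- and M-estimation variants would replace the weighted mean with a robust fit.

```python
import numpy as np

# Hedged sketch: Nadaraya-Watson kernel regression,
# m_hat(x) = sum_i K_h(x - x_i) y_i / sum_i K_h(x - x_i).
# Synthetic data; Gaussian kernel and bandwidth are illustrative.

def nadaraya_watson(x_train, y_train, x_eval, h=0.3):
    d = (x_eval[:, None] - x_train[None, :]) / h
    w = np.exp(-0.5 * d**2)                  # Gaussian kernel weights
    return (w * y_train).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, 2.0 * np.pi, 200))
y = np.sin(x) + rng.normal(scale=0.2, size=x.size)
m_hat = nadaraya_watson(x, y, x)             # recovers sin(x) away from the edges
```

Contaminating y with outliers is what degrades this weighted mean and motivates the robust S-NW and M-NW variants compared in the abstract.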
Motivated by the lack of statistical research on phenomena in which (p) exogenous input variables act as causes in a time series and yield (q) output variables as results, a conceptual analogue of classical linear regression relating a dependent variable to explanatory variables, this research provides a full analysis of this kind of phenomenon, important to consumer price inflation in Iraq. Several influential variables with a direct connection to the phenomenon were taken and analyzed after treating the problem of outliers in the observations by the (EM) approach and expanding the sample size (n = 36) to
In this work, the effect of chlorinated rubber (additive I), zeolite 3A with chlorinated rubber (additive II), zeolite 4A with chlorinated rubber (additive III), and zeolite 5A with chlorinated rubber (additive IV) on the flammability of epoxy resin was studied at weight ratios of (2, 4, 7, 10, and 12%), by preparing films of (130x130x3) mm in dimensions. Three standard test methods were used to measure flame retardation: ASTM D-2863, ASTM D-635, and ASTM D-3014. The results obtained from these tests indicated that all additives are effective and that additive IV has the highest efficiency as a flame retardant.
The aim of this work is to evaluate the one-electron expectation value from the radial electronic density function D(r1) for different wave functions of the 2S state of the Be atom. The wave functions used were published in 1960, 1974, and 1993, respectively. Using the Hartree-Fock wave function as a Slater determinant, the partitioning technique was applied to analyze the shell structure of the Be (1s2 2s2) state; the Be atom was analyzed in terms of six electronic pairs, two of which are intra-shell (K, L) and the rest inter-shell (KL). The results were obtained numerically using computer programs (Mathcad).
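The quantity being evaluated, a one-electron expectation value from a radial density, can be sketched numerically. This is an illustration only: a hydrogen-like 1s radial density stands in for the published Be wave functions (which are not reproduced here), so that the numerical integral can be checked against the known analytic value <r> = 3/(2Z).

```python
import numpy as np

# Illustrative sketch: <r^k> = ∫ D(r) r^k dr / ∫ D(r) dr, evaluated by
# the composite trapezoidal rule. The hydrogenic 1s density below is a
# stand-in assumption, not one of the paper's Be wave functions.

def trapz(f, x):
    """Composite trapezoidal rule."""
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x)))

def expectation_r(density, r, k=1):
    return trapz(density * r**k, r) / trapz(density, r)

Z = 1.0
r = np.linspace(1e-6, 40.0, 20001)
D = 4.0 * Z**3 * r**2 * np.exp(-2.0 * Z * r)   # hydrogenic 1s radial density
mean_r = expectation_r(D, r, k=1)               # analytic value: 3/(2Z) = 1.5
```

For the actual Be analysis, D(r1) would be built from the intra- and inter-shell pair densities mentioned above and the same integral evaluated per shell.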
AlPO4 catalysts supported with WO3 were prepared by impregnation with ammonium metatungstate. The catalysts were characterized by X-ray diffraction (XRD), AFM, and SEM, and elemental analysis was carried out by energy-dispersive X-ray spectroscopy (EDX). Finally, N2 adsorption-desorption was used to measure the pore volume and surface area of the catalyst. The prepared catalyst has a surface area of 185.83 m2/g and a pore volume of 0.645 cm3/g at a calcination temperature of 500°C for 3 hrs, with an average AlPO4 particle size of 35.36 nm. Transesterification of edible oil using WO3/AlPO4 was performed; it was observed that WO3/AlPO4 catalysts give a high conversion of edible oil, which is attributed to the high surface area, smaller particle size, and the
The Ground Penetrating Radar (GPR) is frequently used in pavement engineering for road pavement inspection. The main objective of this work is to validate nondestructive, quick, and powerful GPR measurements for the assessment of subgrade and asphalt/concrete conditions. In the present study, two different antennas (250 and 500 MHz) were used. The case study presented was carried out at the University of Baghdad over about 100 m of paved road. After data acquisition and radargram collection, the data were processed using RadExplorer V1.4 software, implementing different filters with the most effective ones (time-zero adjustment and DC removal) in addition to other interpretation tool parameters. The interpretation
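The two filters singled out above, DC removal and time-zero adjustment, can be sketched on a synthetic trace. This is a hedged illustration of the generic operations, not of RadExplorer's implementation: the threshold used to pick the first arrival is an arbitrary assumption.

```python
import numpy as np

# Hedged sketch of two basic GPR trace filters: DC removal (subtract each
# trace's mean) and time-zero adjustment (shift each trace so the first
# strong arrival sits at sample 0). Synthetic trace; threshold is illustrative.

def dc_removal(traces):
    """Remove the constant (DC) offset from each trace (rows = traces)."""
    return traces - traces.mean(axis=1, keepdims=True)

def time_zero_align(traces, threshold=0.5):
    """Shift each trace so its first sample above threshold*max moves to index 0."""
    out = np.zeros_like(traces)
    for i, tr in enumerate(traces):
        first = int(np.argmax(np.abs(tr) >= threshold * np.abs(tr).max()))
        out[i, : tr.size - first] = tr[first:]
    return out

traces = np.full((1, 100), 0.3)       # constant DC offset of 0.3
traces[0, 20] += 1.0                  # synthetic first arrival at sample 20
clean = dc_removal(traces)
aligned = time_zero_align(clean)      # arrival now at sample 0
```

Real processing chains apply these per trace across the whole radargram before gain and migration steps.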
The long-term monitoring of land movements represents the most successful application of the Global Navigation Satellite System (GNSS), particularly the Global Positioning System. However, this application depends on the availability of homogeneous and consistent daily position time series of stations over a period of time. Such time series can be produced very efficiently using Precise Point Positioning and Double Difference techniques based on sophisticated GNSS processing software packages. Nonetheless, these rely on the availability of GNSS products, namely precise satellite orbits and clocks and Earth orientation parameters. Unfortunately, several changes and modifications have been made
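Once a homogeneous daily position time series exists, the land-movement signal is commonly summarized as a linear velocity. A minimal sketch of that step on synthetic data (the station, noise level, and 12 mm/yr rate below are illustrative assumptions, not GNSS output):

```python
import numpy as np

# Hedged sketch: estimate a station velocity (mm/yr) from a daily position
# time series by ordinary least squares. Synthetic series; real series also
# need offset, seasonal, and noise modeling.

def fit_velocity(t_days, pos_mm):
    """Return (intercept_mm, velocity_mm_per_year) from a least-squares line."""
    A = np.column_stack([np.ones_like(t_days), t_days / 365.25])
    coef, *_ = np.linalg.lstsq(A, pos_mm, rcond=None)
    return float(coef[0]), float(coef[1])

rng = np.random.default_rng(2)
t = np.arange(0.0, 5 * 365.25)                       # five years, daily
pos = 3.0 + 12.0 * t / 365.25 + rng.normal(scale=2.0, size=t.size)
b0, vel = fit_velocity(t, pos)                       # vel near 12 mm/yr
```

Discontinuities from the product changes mentioned in the abstract would appear as offsets in pos and bias this fit, which is why series homogeneity matters.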