Permeability data are of major importance and must be handled carefully in all reservoir simulation studies. Their importance increases in mature oil and gas fields because of their sensitivity to the requirements of specific improved-recovery processes. However, the industry holds a huge store of air permeability measurements against a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation to convert the air permeability data that are conventionally measured during laboratory core analysis into liquid permeability. This correlation offers a feasible estimate in cases of data loss, of poorly consolidated formations, or of the unavailability of old cores on which to carry out liquid permeability measurements. Moreover, the conversion formula makes better use of the large amount of old air permeability data obtained through routine core analysis for further use in reservoir and geological modeling studies.
The comparative analysis shows that the suggested conversion formula gives highly accurate and more consistent results over a wide range of permeability values.
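The conversion formula itself is not reproduced in this excerpt. As a rough sketch of the kind of gas-to-liquid conversion involved, the classical Klinkenberg slip correction is shown below; the function name and the sample slip factor `b` and mean pore pressure are hypothetical illustrations, not values from the study.

```python
def klinkenberg_liquid_perm(k_air, p_mean, b):
    """Estimate equivalent liquid permeability from an air measurement.

    Klinkenberg model (assumed here for illustration):
        k_air = k_liq * (1 + b / p_mean)  ->  k_liq = k_air / (1 + b / p_mean)

    k_air  : measured air permeability (mD)
    p_mean : mean pore pressure during the test (atm)
    b      : gas-slippage factor (atm), rock- and gas-specific
    """
    return k_air / (1.0 + b / p_mean)


# Hypothetical example: 100 mD air permeability at 2 atm, slip factor 0.5 atm
k_liq = klinkenberg_liquid_perm(100.0, 2.0, 0.5)
```

The study's own correlation may take a different functional form; this sketch only illustrates the standard slip-correction idea.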
Porosity and permeability are among the most difficult properties to determine in subsurface reservoir characterization. The difficulty arises from the fact that porosity and permeability may vary significantly over the reservoir volume and can only be sampled at well locations. Moreover, porosity values are commonly evaluated from well log data, which are usually available for most wells in a reservoir, whereas permeability values, which are generally determined from core analysis, are not. The aim of this study is, first, to develop correlations between core and well log data that can be used to estimate permeability in uncored wells; these correlations make it possible to estimate reservoir permeability …
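A common way to realize such core-log correlations (sketched here as an assumption; the study's actual regression is not shown in this excerpt) is an ordinary least-squares fit of log10 permeability against porosity, which can then predict permeability in uncored wells from log-derived porosity:

```python
import math

def fit_log_perm(porosity, perm):
    """OLS fit of log10(k) = a + b * phi to core data (illustrative sketch)."""
    y = [math.log10(k) for k in perm]
    n = len(porosity)
    mx, my = sum(porosity) / n, sum(y) / n
    b = (sum((x - mx) * (yi - my) for x, yi in zip(porosity, y))
         / sum((x - mx) ** 2 for x in porosity))
    a = my - b * mx
    return a, b

def predict_perm(a, b, phi):
    """Predict permeability (mD) from log-derived porosity (fraction)."""
    return 10.0 ** (a + b * phi)

# Hypothetical core data: porosity (fraction) vs. permeability (mD)
a, b = fit_log_perm([0.10, 0.20, 0.30], [1.0, 10.0, 100.0])
```

The semilog form is only one conventional choice; the paper may use a different transform or additional log variables.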
The current research presents an overall comparative analysis of the estimation of Meixner process parameters via the wavelet packet transform. Of particular relevance, it compares the moment method and the wavelet packet estimator for the four parameters of the Meixner process. The research focuses on finding the best threshold value using the square-root-log and modified square-root-log methods with wavelet packets in the presence of noise, to enhance the efficiency and effectiveness of the denoising process for the financial asset market signal. In this regard, a simulation study compares the performance of moment estimation and wavelet packets for different sample sizes. The results show that the wavelet packet …
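As a minimal sketch of square-root-log thresholding (assuming the universal rule λ = σ√(2 ln n); a single Haar level stands in for the full wavelet packet transform, and the paper's "modified" rule is not reproduced):

```python
import math

def haar_step(x):
    """One Haar level: (approximation, detail) coefficients of an even-length signal."""
    a = [(x[2*i] + x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
    d = [(x[2*i] - x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
    return a, d

def soft(coeffs, lam):
    """Soft thresholding: shrink each coefficient toward zero by lam."""
    return [math.copysign(max(abs(c) - lam, 0.0), c) for c in coeffs]

def denoise(x, sigma):
    """Denoise by soft-thresholding Haar details at lam = sigma*sqrt(2 ln n)."""
    a, d = haar_step(x)
    lam = sigma * math.sqrt(2.0 * math.log(len(x)))  # square-root-log rule
    d = soft(d, lam)
    out = []
    for ai, di in zip(a, d):        # inverse Haar step
        out += [(ai + di) / math.sqrt(2), (ai - di) / math.sqrt(2)]
    return out
```

A full wavelet packet estimator would recurse on both the approximation and detail branches and choose a best basis; this one-level sketch only shows the thresholding mechanics.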
An optical fiber chemical sensor based on surface plasmon resonance for sensing and measuring the refractive index and concentration of acetic acid was designed and implemented in this work. Optical-grade plastic optical fiber with a diameter of 1000 μm (a 980 μm core and 20 μm cladding) was used. The sensor was fabricated by embedding a small part (10 mm) of the middle of the fiber in a resin block and polishing it; a gold layer about 40 nm thick was then deposited, and the acetic acid was placed on the sensing probe.
This research addresses the estimation of the mass transfer coefficient in a batch packed-bed distillation column as a function of physical properties: the liquid-to-vapour molar rate ratio (L/V), the relative volatility (α), the ratio of vapour to liquid diffusivities (DV/DL), the ratio of vapour to liquid densities (ρV/ρL), and the ratio of vapour to liquid viscosities (μV/μL).
The experiments were carried out on binary systems: ethanol/water, methanol/water, methanol/ethanol, benzene/hexane, and benzene/toluene. Multiple regression analysis was used to estimate the overall mass transfer coefficients of the vapour and liquid phases (KOV and KOL) in a correlation that represented the data fairly well.
KOV = 3.3 × 10⁻¹⁰ …
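The full correlation is truncated above. As a hedged sketch of how such a power-law correlation can be fitted by multiple regression in log space (the group values and exponents below are synthetic, chosen only so that the fit is checkable; they are not the study's data):

```python
import numpy as np

# Hypothetical operating data for five runs (illustrative values only)
L_over_V = np.array([0.8, 0.9, 1.0, 1.1, 1.2])   # liquid/vapour molar ratio
alpha    = np.array([1.5, 2.0, 2.5, 3.0, 3.5])   # relative volatility

# Synthetic K_OV generated from an assumed power law so the fit is verifiable
C_true, a_true, b_true = 3.3e-10, 0.5, -1.2
Kov = C_true * L_over_V**a_true * alpha**b_true

# Power-law model K_OV = C * (L/V)^a * alpha^b is linear in log space:
#   ln K_OV = ln C + a * ln(L/V) + b * ln(alpha)
X = np.column_stack([np.ones_like(L_over_V), np.log(L_over_V), np.log(alpha)])
coef, *_ = np.linalg.lstsq(X, np.log(Kov), rcond=None)
C, a, b = np.exp(coef[0]), coef[1], coef[2]
```

The actual correlation also includes the diffusivity, density, and viscosity ratios listed above; they would enter as further log-transformed columns of `X` in exactly the same way.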
Cost estimation is one of the important tasks in construction project management. The precision of the construction cost estimate affects the success and quality of a construction project. Elemental estimation is a very important stage for the project team because it represents one of the key project elements; it helps in formulating the basis for strategies and execution plans for construction and engineering. Elemental estimation, at the early stage, estimates construction costs from minimal project details, so it gives an indication for the initial design stage of a project. This paper studies the factors that affect elemental cost estimation as well as the rela…
Kidney tumors are of different types, have different characteristics, and remain challenging in the field of biomedicine. It is very important to detect a tumor and classify it at an early stage so that appropriate treatment can be planned. Accurate estimation of kidney tumor volume is essential for clinical diagnoses and therapeutic decisions related to renal diseases. The main objective of this research is to use Computer-Aided Diagnosis (CAD) algorithms to help the early detection of kidney tumors while addressing the challenges of accurate tumor volume estimation caused by extensive variations in kidney shape, size, and orientation across subjects.
In this paper, we have tried to implement an automated segmentation …
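Once a segmentation mask is available, the volume estimation itself reduces to counting foreground voxels and scaling by the physical voxel size. A minimal sketch (the function name and spacing values are illustrative, not from the paper):

```python
import numpy as np

def tumor_volume_ml(mask, spacing_mm):
    """Volume of a binary segmentation mask in millilitres.

    mask       : 3-D array of 0/1 (or bool), e.g. shaped (slices, rows, cols)
    spacing_mm : physical voxel size per axis in mm, e.g. (z, y, x)
    """
    voxel_mm3 = spacing_mm[0] * spacing_mm[1] * spacing_mm[2]
    return float(mask.sum()) * voxel_mm3 / 1000.0  # mm^3 -> mL

# Hypothetical example: a 2x2x2-voxel lesion at 1 mm isotropic spacing
mask = np.zeros((10, 10, 10), dtype=np.uint8)
mask[2:4, 2:4, 2:4] = 1
vol = tumor_volume_ml(mask, (1.0, 1.0, 1.0))
```

The hard part, of course, is producing the mask; this sketch only covers the volume arithmetic that follows segmentation.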
A rapid high-performance liquid chromatography method for the determination of sphinganine (Sa) and sphingosine (So) in urine samples using a silica-based monolithic column is described. The samples were first extracted with ethyl acetate and derivatized with ortho-phthaldialdehyde in the presence of 2-mercaptoethanol. C20 sphinganine was used as the internal standard. Under the optimized conditions, separation was achieved using a methanol:water mixture (93:7, v/v), a column temperature of 30 °C, a flow rate of 1 mL min⁻¹, and an injection volume of 10 μL. Good linearity was obtained for Sa and So over the concentration range 20–500 ng mL⁻¹ (correlation coefficients ≥ 0.9978). The detection limits were 0.45 ng mL⁻¹ for Sa and …
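A hedged sketch of the calibration arithmetic behind such figures, assuming a straight-line fit of signal versus concentration and the common 3.3σ/slope convention for the detection limit (the paper's exact LOD procedure is not stated in this excerpt):

```python
def fit_line(x, y):
    """Ordinary least-squares fit y = slope * x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

def detection_limit(sd_blank, slope):
    """LOD = 3.3 * sigma / slope (same concentration units as x); ICH-style convention."""
    return 3.3 * sd_blank / slope

# Hypothetical calibration data: concentration (ng/mL) vs. detector response
slope, intercept = fit_line([20.0, 100.0, 250.0, 500.0],
                            [45.0, 205.0, 505.0, 1005.0])
lod = detection_limit(0.3, slope)  # 0.3 = assumed blank standard deviation
```

All numbers here are placeholders; only the arithmetic (straight-line calibration and σ-based LOD) is the point of the sketch.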
Iris research focuses on developing techniques for identifying and locating relevant biometric features, with accurate segmentation and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system when used as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method is based on a number of steps, starting from converting the grayscale eye image to a bit-plane representation and selecting the most significant bit planes, followed by a parameterization of the iris location, resulting in an accurate segmentation of the iris from the origin…
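The bit-plane conversion step can be sketched as follows (a minimal illustration; the paper's plane-selection criterion and parameterization steps are not reproduced):

```python
import numpy as np

def bit_planes(img):
    """Split an 8-bit grayscale image into its 8 binary bit planes.

    Returns a list of 0/1 arrays; index 0 is the least significant plane,
    index 7 the most significant. The most significant planes carry most of
    the coarse structure used for segmentation.
    """
    return [((img >> k) & 1).astype(np.uint8) for k in range(8)]

# Tiny hypothetical example: one pixel with value 200 = 0b11001000
img = np.array([[200]], dtype=np.uint8)
planes = bit_planes(img)
```

Selecting, say, planes 5–7 and discarding the noisy low-order planes is one typical way to expose the dark pupil/iris region before locating its boundary.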
The present paper addresses the cultivation of Chlorella vulgaris microalgae in an airlift photobioreactor sparged with 5% CO2/air. The experimental data were compared with those obtained from a bioreactor aerated with air and from an unsparged bioreactor. The results showed a biomass concentration of 0.36 g L⁻¹ in the bioreactor sparged with CO2/air, whereas the biomass concentration reached only 0.069 g L⁻¹ in the unsparged bioreactor. They also showed that the bioreactor aerated with CO2/air gives more biomass production than the bioreactor aerated with air alone. This study proved that applying a sparging system to the cultivation of Chlorella vulgaris microalgae, using either a CO2/air mixture or air, has a significant …
Big data analysis has important applications in many areas such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms such as …
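A minimal sketch of entropy-based binary discretization, the building block named above (assumed single-cut form; the paper's multi-resolution, summarization-backed variant is not shown in this excerpt):

```python
import math

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    h = 0.0
    for c in set(labels):
        p = labels.count(c) / n
        h -= p * math.log2(p)
    return h

def best_split(values, labels):
    """Find the cut point that minimizes class-weighted entropy.

    Returns (cut, weighted_entropy); the cut is the midpoint between the two
    adjacent sorted values on either side of the best boundary.
    """
    pairs = sorted(zip(values, labels))
    n = len(pairs)
    best_cut, best_h = None, float("inf")
    for i in range(1, n):
        left = [lab for _, lab in pairs[:i]]
        right = [lab for _, lab in pairs[i:]]
        h = (len(left) * entropy(left) + len(right) * entropy(right)) / n
        if h < best_h:
            best_cut = (pairs[i - 1][0] + pairs[i][0]) / 2
            best_h = h
    return best_cut, best_h
```

Recursing on each side of the cut (with a stopping rule such as MDL) yields the usual multi-interval entropy discretization used inside many learning algorithms, including naive Bayes.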