Several oil reservoirs have suffered a sudden or gradual decline in production due to asphaltene precipitation inside them. Asphaltene deposition inside oil reservoirs damages permeability, increases the skin factor, alters reservoir wettability, and requires greater drawdown pressure. These adverse changes reduce the flow rate, so the economic profit drops. The aim of this study is to use local solvents (reformate, heavy naphtha, and a binary mixture of the two) to dissolve asphaltene precipitated inside the oil reservoir. Three sand-pack samples were prepared and mixed with a known amount of asphaltene. The permeability of these samples was calculated before and after mixing with asphaltene, and then again after the solvents were injected into the porous media. All the permeability values were then converted to an average permeability damage relative to the pure samples. The results show that the average permeability damage of the samples mixed with 20 g of asphaltene was 24%, which fell to 14% after reformate was injected. After heavy naphtha was injected into the porous media, the average permeability damage fell only to 17%. The binary solvent, prepared by mixing reformate with heavy naphtha, gave the best result, dropping the average permeability damage to 10%.
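The damage percentages above follow directly from the before/after permeability values. A minimal sketch of that calculation, using illustrative numbers chosen to match the reported averages (not the paper's raw data):

```python
# Hedged sketch: computing "permeability damage" as the percentage drop
# from the undamaged (pure-sample) permeability. Values are illustrative.

def permeability_damage(k_initial, k_current):
    """Permeability damage as a percentage drop from the undamaged value."""
    return 100.0 * (k_initial - k_current) / k_initial

# Illustrative permeabilities (mD), loosely matching the reported averages:
k_clean = 100.0          # pure sand-pack sample
k_asphaltene = 76.0      # after mixing with 20 g asphaltene -> 24 % damage
k_after_binary = 90.0    # after binary-solvent injection    -> 10 % damage

damage_before = permeability_damage(k_clean, k_asphaltene)
damage_after = permeability_damage(k_clean, k_after_binary)
```

The same formula applies to each sample; the paper reports the average of the per-sample damages.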
In this paper, a method for estimating the variation of the Zenith Path Delay (ZPD) is illustrated and evaluated using a Real-Time Kinematic Differential Global Positioning System (RTK-DGPS). GPS provides a relative method to remotely sense atmospheric water vapor in any weather condition; the GPS signal delay in the atmosphere can be expressed as the ZPD. To evaluate the results, four points in the University of Baghdad campus were chosen as rover points, together with a fixed base point. For each rover position, 155 days of coordinate measurements were collected to support the results. Several models and mathematical calculations were used to extract the ZPD in the Matlab environment. The result shows that the ZPD valu
Image denoising is a technique for removing an unwanted signal, called noise, that couples with the original signal during transmission; many denoising methods are used to remove the noise from the original signal. In this paper, the Multiwavelet Transform (MWT) is used to denoise the corrupted image by choosing the HH coefficients for processing, based on two different filters: the Tri-State Median filter and the Switching Median filter. With each filter, various shrinkage rules are used, such as NormalShrink, SureShrink, VisuShrink, and Bivariate Shrink. The proposed algorithm is applied to salt-and-pepper noise at different levels for grayscale test images. The quality of the denoised image is evaluated by usi
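As a concrete example of one of the shrinkage rules named above, the VisuShrink universal threshold with soft thresholding can be sketched as follows. This is a generic illustration with NumPy; the coefficient array stands in for a transform sub-band such as the MWT HH coefficients:

```python
import numpy as np

def visu_shrink_threshold(coeffs):
    """VisuShrink universal threshold: sigma * sqrt(2 ln N), with the noise
    level sigma estimated from the median absolute deviation of the
    detail coefficients."""
    sigma = np.median(np.abs(coeffs)) / 0.6745
    return sigma * np.sqrt(2.0 * np.log(coeffs.size))

def soft_threshold(coeffs, t):
    """Shrink coefficients toward zero; values below t are set to zero."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

# Usage on a toy coefficient vector: small (noise-like) values vanish,
# large (signal-like) values are shrunk but kept.
noisy = np.array([0.1, -0.2, 5.0, -4.0, 0.05, 0.3])
denoised = soft_threshold(noisy, visu_shrink_threshold(noisy))
```

The other rules (SureShrink, NormalShrink, Bivariate Shrink) differ only in how the threshold is chosen, not in the thresholding step itself.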
In this work, the antibacterial effectiveness of polypropylene face masks against the pathogens Candida albicans and Pseudomonas aeruginosa was improved by soaking them in a gold nanoparticle suspension prepared by a one-step precipitation method. The fabricated nanoparticles at different concentrations were characterized by UV-visible absorption and showed a broad surface plasmon band at around 520 nm. The FE-SEM images showed the polypropylene fibres densely decorated with spherical AuNPs of about 25 nm diameter over the surfaces of the soaked fibres. Fourier Transform Infrared Spectroscopy (FTIR) of the pure and AuNP-treated face masks conformed to the characteristic polypropylene bands. There are some differences
The main aim of this paper is to study how different estimators of the two unknown parameters (shape and scale) of a generalized exponential distribution behave for different sample sizes and different parameter values. In particular,
Maximum Likelihood, Percentile, and Ordinary Least Squares estimators were implemented for different sample sizes (small, medium, and large), with several contrasting initial values assumed for the two parameters. Two performance indicators, Mean Square Error and Mean Percentile Error, were used, and the comparisons between the estimation methods were carried out using the Monte Carlo simulation technique. It was obse
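The Monte Carlo comparison described above can be illustrated with a minimal sketch. Assuming the common generalized exponential CDF F(x) = (1 - e^(-lam*x))^alpha, and using the closed-form maximum-likelihood estimator of the shape parameter when the scale is known (a deliberate simplification of the paper's full two-parameter study):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_gen_exp(n, alpha, lam, rng):
    """Inverse-CDF sampling from F(x) = (1 - exp(-lam*x))**alpha."""
    u = rng.random(n)
    return -np.log(1.0 - u ** (1.0 / alpha)) / lam

def mle_shape(x, lam):
    """Closed-form MLE of the shape parameter when the scale lam is known:
    alpha_hat = -n / sum(log(1 - exp(-lam*x)))."""
    return -x.size / np.sum(np.log1p(-np.exp(-lam * x)))

def monte_carlo_mse(alpha, lam, n, reps, rng):
    """Mean Square Error of the shape estimator over repeated samples."""
    est = np.array([mle_shape(sample_gen_exp(n, alpha, lam, rng), lam)
                    for _ in range(reps)])
    return float(np.mean((est - alpha) ** 2))

# Larger samples should give a smaller MSE for a consistent estimator.
mse_small = monte_carlo_mse(1.5, 1.0, 20, 500, rng)
mse_large = monte_carlo_mse(1.5, 1.0, 200, 500, rng)
```

The Percentile and Ordinary Least Squares estimators would be compared by computing the same MSE (and Mean Percentile Error) with their own estimating equations in place of `mle_shape`.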
Most intrusion detection systems are signature-based: they work similarly to anti-virus software, but they are unable to detect zero-day attacks. The importance of anomaly-based IDSs has risen because of their ability to deal with unknown attacks. However, smart attacks have appeared that compromise the detection ability of anomaly-based IDSs. Considering these weak points, the proposed system is developed to overcome them. The proposed system is a development of the well-known payload anomaly detector (PAYL). By combining two stages with the PAYL detector, it gives good detection ability and an acceptable false-positive ratio. The proposed system improves the model recognition ability of the PAYL detector for a filtered unencrypt
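PAYL-style detectors model the byte-frequency distribution of normal payloads and score new payloads by a simplified Mahalanobis-like distance. A minimal sketch of that underlying idea (not the proposed two-stage system itself; function names and the smoothing constant are illustrative):

```python
import numpy as np

def byte_freq(payload: bytes) -> np.ndarray:
    """Relative frequency of each of the 256 possible byte values."""
    counts = np.bincount(np.frombuffer(payload, dtype=np.uint8),
                         minlength=256)
    return counts / max(len(payload), 1)

def train_model(payloads):
    """Per-byte mean and standard deviation over normal training payloads."""
    freqs = np.stack([byte_freq(p) for p in payloads])
    return freqs.mean(axis=0), freqs.std(axis=0)

def anomaly_score(payload, mean, std, smooth=0.001):
    """Simplified Mahalanobis-style distance used by PAYL-like detectors;
    `smooth` avoids division by zero for never-seen byte values."""
    f = byte_freq(payload)
    return float(np.sum(np.abs(f - mean) / (std + smooth)))

# Train on normal HTTP-like traffic; a NOP-sled-like payload scores high.
normal = [b"GET /index.html HTTP/1.1", b"GET /home.html HTTP/1.1"]
mean, std = train_model(normal)
```

A payload is flagged when its score exceeds a threshold calibrated on held-out normal traffic.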
The use of essential services in modern construction, such as pipes and ducts, has become important; these pipes and ducts are placed underneath the soffit of the beam. They form a ceiling sandwich, which reduces the floor height, so providing openings in the beam saves floor height. In this paper, the response of simply supported rectangular reinforced concrete beams with square web openings is investigated, including the number of web openings (two, four, and eight), in addition to strengthening the member at the openings (when the beam is planned before casting, internal deformed steel bars are used, and in case the opening exists in the b
Aspect-based sentiment analysis is an important research topic concerned with extracting and categorizing aspect terms from online reviews. Recent efforts have shown that topic modelling is widely used for this task. In this paper, we integrated word embeddings into collapsed Gibbs sampling in Latent Dirichlet Allocation (LDA). Specifically, the conditional distribution in the topic model is improved using a word-embedding model trained on the (customer review) training dataset. Semantic similarity (the cosine measure) was leveraged to assign aspect terms to their related aspect categories. The experiment was conducted to extract and categorize the aspect terms from the SemEval 2014 dataset.
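The cosine-based aspect-category assignment can be sketched as follows, with toy 3-dimensional vectors standing in for real word embeddings (all names and values are hypothetical):

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def assign_category(term_vec, category_vecs):
    """Assign an aspect term to the category whose representative vector
    (e.g. an averaged word embedding) it is most similar to."""
    return max(category_vecs, key=lambda c: cosine(term_vec, category_vecs[c]))

# Toy 3-d "embeddings" for two aspect categories:
cats = {"food": np.array([1.0, 0.1, 0.0]),
        "service": np.array([0.0, 1.0, 0.2])}
term = np.array([0.9, 0.2, 0.0])   # e.g. an embedding for "pizza"
best = assign_category(term, cats)
```

In practice the category vectors and term vectors would come from the embedding model trained on the review corpus, and the assignment feeds back into the improved LDA conditional distribution.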
The purpose of this research is to show the gap between a normal cost system and Resource Consumption Accounting as applied in AL-Rafidin Bank.
The research explores how idle capacity can be determined under Resource Consumption Accounting and discusses the possibility of employing these capacities. The research also shows how costs can be separated into committed and attributable costs. Resource Consumption Accounting assists managers in pricing services or products based on what these services or products consume from each resource.
This research has been proven
In this paper, an approximate solution of a nonlinear two-point boundary variational problem is presented. Boubaker polynomials are utilized to reduce these problems to a quadratic programming problem. The convergence of these polynomials has been verified, and several numerical examples are given to show the applicability and validity of the method.