Several oil reservoirs have suffered a sudden or gradual decline in production due to asphaltene precipitation. Asphaltene deposition inside a reservoir damages permeability, increases the skin factor, alters reservoir wettability, and demands greater drawdown pressure; these adverse changes reduce the flow rate, so economic profit drops. The aim of this study is to use local solvents (reformate, heavy naphtha, and a binary mixture of the two) to dissolve asphaltene precipitated inside the oil reservoir. Three sand-pack samples were prepared and mixed with a known amount of asphaltene. The permeability of each sample was calculated before and after mixing with asphaltene, and again after solvent injection into the porous medium. All permeability values were then converted to an average permeability damage relative to the pure samples. The results show that the average permeability damage of samples mixed with 20 g of asphaltene was 24%; after reformate injection it fell to 14%, while heavy-naphtha injection reduced it only to 17%. The binary solvent, prepared by mixing reformate with heavy naphtha, gave the best results, dropping the average permeability damage to 10%.
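The permeability-damage percentages quoted in the abstract can be sketched as the relative loss against the clean sand pack. The formula below is the standard definition of formation damage and an assumption here, since the abstract does not state it; the permeability values are illustrative, not the study's measurements.

```python
def permeability_damage(k_clean: float, k_current: float) -> float:
    """Percent permeability damage relative to the clean sand-pack permeability."""
    return (k_clean - k_current) / k_clean * 100.0

# Illustrative values only (millidarcy), chosen to reproduce the quoted percentages:
k_clean = 1000.0           # clean sand pack
k_asphaltene = 760.0       # after mixing with asphaltene  -> 24% damage
k_after_reformate = 860.0  # after reformate injection     -> 14% damage

print(permeability_damage(k_clean, k_asphaltene))       # 24.0
print(permeability_damage(k_clean, k_after_reformate))  # 14.0
```

A solvent is then judged by how far it pulls this figure back toward zero, which is how the 24% → 14% (reformate) and 24% → 10% (binary solvent) comparisons in the abstract are read.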
The university course timetabling problem (UCTP) is a combinatorial optimization problem. Manually producing a usable timetable requires many days of effort, and the results are still unsatisfactory. Various state-of-the-art methods (heuristic and meta-heuristic) are used to solve UCTP satisfactorily; however, these approaches typically yield instance-specific solutions. The hyper-heuristic framework addresses this complex problem more generally. This research proposes a Particle Swarm Optimization-based hyper-heuristic (HH PSO) to solve UCTP efficiently. PSO is used as the higher-level method that selects a sequence of low-level heuristics (LLHs), which in turn generates an optimal solution. The proposed …
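The selection hyper-heuristic idea described above can be sketched as follows: a PSO particle encodes a sequence of LLH indices, and its fitness is the conflict count after applying that sequence to a timetable. Everything below (the toy conflict data, the three LLHs, and the PSO settings) is an assumption for illustration, not the paper's actual setup.

```python
import random

random.seed(0)

N_EVENTS, N_SLOTS, SEQ_LEN = 8, 4, 12
# Toy constraint set: pairs of events that must not share a timeslot.
CONFLICTS = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (6, 7), (0, 7)]
INITIAL = [0] * N_EVENTS  # every event in slot 0: maximally conflicted start

def cost(tt):
    return sum(1 for a, b in CONFLICTS if tt[a] == tt[b])

# Three toy low-level heuristics; each returns a modified copy of the timetable.
def llh_move(tt):
    tt = tt[:]; tt[random.randrange(N_EVENTS)] = random.randrange(N_SLOTS); return tt

def llh_swap(tt):
    tt = tt[:]; i, j = random.sample(range(N_EVENTS), 2); tt[i], tt[j] = tt[j], tt[i]; return tt

def llh_repair(tt):
    tt = tt[:]
    for a, b in CONFLICTS:
        if tt[a] == tt[b]:
            tt[b] = random.randrange(N_SLOTS)
    return tt

LLHS = [llh_move, llh_swap, llh_repair]

def fitness(keys):
    """Decode continuous keys in [0, 1) to an LLH sequence and apply it."""
    tt = INITIAL
    for k in keys:
        tt = LLHS[min(int(k * len(LLHS)), len(LLHS) - 1)](tt)
    return cost(tt)

# Plain global-best PSO over the continuous key vectors (the higher level).
swarm = [[random.random() for _ in range(SEQ_LEN)] for _ in range(10)]
vel = [[0.0] * SEQ_LEN for _ in range(10)]
pbest = [p[:] for p in swarm]
pcost = [fitness(p) for p in swarm]
g = pcost.index(min(pcost))
gbest, gcost = pbest[g][:], pcost[g]

for _ in range(30):
    for i, p in enumerate(swarm):
        for d in range(SEQ_LEN):
            vel[i][d] = (0.7 * vel[i][d]
                         + 1.5 * random.random() * (pbest[i][d] - p[d])
                         + 1.5 * random.random() * (gbest[d] - p[d]))
            p[d] = min(max(p[d] + vel[i][d], 0.0), 0.999)
        c = fitness(p)
        if c < pcost[i]:
            pbest[i], pcost[i] = p[:], c
            if c < gcost:
                gbest, gcost = p[:], c

print(gcost)  # best conflict count found (the initial timetable has 8)
```

The key point of the hyper-heuristic layer is that PSO never touches the timetable directly; it only searches over *which heuristic to apply next*, so the same higher-level search can be reused across problem instances.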
The improvement of effluent quality discharged from wastewater treatment plants is essential to maintain the environment and healthy water resources. This study evaluated the possibility of intermittent slow sand filtration as a promising tertiary treatment method for sequencing batch reactor (SBR) effluent. A laboratory-scale slow sand filter (SSF) with a uniformity coefficient of 1.5 and a filtration rate of 0.1 m/h was used to study process performance. It was found that the SSF is very efficient in oxidizing organic matter, with COD removal efficiency up to 95%; it is also capable of removing considerable amounts of phosphate and turbidity, with removal efficiencies of 76% and 87%, respectively. The slow sand filter efficiently reduced the mass of suspended …
The prepared nanostructured SiO2 thin films were densified by two techniques: conventional (oven) heating and a diode-pumped solid-state (DPSS) laser (532 nm). X-ray diffraction (XRD), field-emission scanning electron microscopy (FESEM), and atomic force microscopy (AFM) were used to analyze the samples. XRD results showed that the structure of the SiO2 thin films was amorphous for both oven and laser densification. FESEM and AFM images revealed that the nano-silica particles are spherical, with sizes in the nano range. The smallest particle size of the SiO2 thin film densified by the DPSS laser was 26 nm, while the smallest particle size of the film densified by the oven was 111 nm.
Many production companies suffer big losses because of high production costs and low profits, for several reasons: high raw-material prices, the absence of taxes on imported goods, and the deactivation of the consumer protection, national product, and customs laws. As a result, most consumers buy imported goods, which are characterized by modern specifications and low prices.
Production companies also suffer from uncertainty in cost, production volume, sales, the availability of raw materials, and the number of workers, because these vary with the seasons of the year.
This research adopts a fuzzy linear programming model with fuzzy figures …
This research deals with building a probabilistic linear programming model representing the production operation in the Middle Refinery Company (Dura, Semawa, Najaif), considering that the demand for each product (gasoline, kerosene, gas oil, fuel oil) is a random variable following a certain probability distribution. These distributions were tested using the statistical program EasyFit and were found to be the Cauchy, Erlang, Pareto, normal, and generalized extreme value distributions.
Due to the increased consumption of resources, especially energy, it became necessary to find alternatives of the same quality that are less expensive. The most important of these alternatives is waste, given that humans cannot stop consuming. Waste is therefore treated here as a cheap alternative economic resource, using the environmental index MIPS (material input per unit of product or service). This index rests on the idea that a product is not an end in itself but exists to meet a need for a product or service, accounting for resource inputs and outputs within five basic ecological categories: abiotic raw materials, biotic raw materials, water, air, and soil erosion, for a …
Intelligent home load prediction based on deep convolutional neural networks is a method of predicting the electricity load of a home using deep learning techniques. It uses convolutional neural networks to analyze data from various sources, such as weather, time of day, and other factors, to accurately predict a home's electricity load, with the aim of optimizing energy usage and reducing energy costs. The article proposes a deep learning-based approach for non-permanent residential electrical energy load forecasting that employs temporal convolutional networks (TCN) to model historical load series with time-series traits and to learn the highly dynamic patterns of variation among attribute par…
In this paper, the maximum likelihood estimates for the parameters of the two-parameter Weibull distribution are studied, along with White's estimators, the Bain and Antle estimators, and the Bayes estimator for the scale parameter. Simulation procedures are used to obtain the estimators and to compare them using MSE. The methods are also applied to data for 20 patients suffering from headache disease.
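The maximum likelihood estimation studied in the abstract can be sketched for the two-parameter Weibull: the shape is found by solving the one-dimensional score equation numerically (bisection here), after which the scale has a closed form. The synthetic demo data are illustrative, not the paper's 20 patient observations.

```python
import math
import random

def weibull_mle(x, lo=1e-3, hi=50.0, tol=1e-10):
    """Return (shape k, scale lam) for a two-parameter Weibull by MLE.

    The shape k solves  sum(x^k ln x)/sum(x^k) - 1/k - mean(ln x) = 0,
    found by bisection; the scale is then lam = (mean(x^k))^(1/k).
    """
    n = len(x)
    logs = [math.log(v) for v in x]
    mean_log = sum(logs) / n

    def score(k):
        xk = [v ** k for v in x]
        return sum(xi * li for xi, li in zip(xk, logs)) / sum(xk) - 1.0 / k - mean_log

    while hi - lo > tol:
        mid = (lo + hi) / 2
        if score(lo) * score(mid) <= 0:
            hi = mid
        else:
            lo = mid
    k = (lo + hi) / 2
    lam = (sum(v ** k for v in x) / n) ** (1.0 / k)
    return k, lam

# Demo on synthetic data with known parameters (shape 2, scale 3),
# sampled by inverting the Weibull CDF:
random.seed(1)
data = [3.0 * (-math.log(1 - random.random())) ** 0.5 for _ in range(500)]
k_hat, lam_hat = weibull_mle(data)
```

A simulation study like the paper's would repeat this over many such samples and compare the MSE of `weibull_mle` against the White and Bain and Antle estimators.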