This study focused on improving the quality of gasoline and enhancing its octane number by reducing n-paraffins using zeolite 5A. The study was carried out in batch and continuous modes, and the parameters affecting n-paraffin removal efficiency in each mode were examined. In the batch mode, temperature (30 and 40 °C) and mixing time up to 120 min were investigated for zeolite amounts ranging from 10 to 60 g. A maximum removal efficiency of 64% was obtained using 60 g of zeolite at 30 °C after a mixing time of 120 min. The effects of feed flow rate (0.3-0.8 l/hr) and bed height (10-20 cm) were also studied in the continuous mode. The equilibrium isotherm study was carried out using different amounts of zeolite (2-20 g) and the
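The removal efficiencies reported above follow directly from the initial and equilibrium n-paraffin concentrations. A minimal sketch (the concentration values below are illustrative, not measurements from the study):

```python
def removal_efficiency(c0, ce):
    """Percent removal: (C0 - Ce) / C0 * 100, where C0 is the initial
    and Ce the equilibrium n-paraffin concentration."""
    return (c0 - ce) / c0 * 100.0

# Illustrative: an initial concentration of 100 (arbitrary units)
# reduced to 36 corresponds to the 64% maximum removal reported above.
print(removal_efficiency(100.0, 36.0))  # → 64.0
```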
This research concerns the analysis and simulation of the temperature distribution in spot-welding joints produced by tungsten arc welding shielded with inert gas (TIG spot) for the aluminium-magnesium alloy type 5052-O.
The effects of welding current, welding time, and arc length on the quantity of heat input entering the weld zone and on the temperature distribution have been investigated. The finite element method (using the ANSYS 5.4 program) is used to present the temperature distribution in a circular weld pool and the weld pool penetration (depth of welding) through the top sheet, across the interface, and into the lower sheet, forming a weld spot.
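The study itself uses an ANSYS finite-element model; as a much simpler stand-in, a one-dimensional explicit finite-difference (FTCS) step illustrates how heat deposited in the sheets spreads the temperature field over time (all values below are illustrative, not from the study):

```python
def heat_step(T, alpha, dx, dt):
    """One explicit finite-difference step of 1D heat conduction
    (forward-time, centered-space). Stable when alpha*dt/dx**2 <= 0.5.
    End-point temperatures are held fixed (Dirichlet boundaries)."""
    r = alpha * dt / dx ** 2
    new = T[:]
    for i in range(1, len(T) - 1):
        new[i] = T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1])
    return new

# A hot spot (e.g. under the arc) diffuses into neighbouring nodes.
print(heat_step([0.0, 0.0, 100.0, 0.0, 0.0], 1.0, 1.0, 0.25))
# → [0.0, 25.0, 50.0, 25.0, 0.0]
```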
The kinetics of nickel removal from aqueous solutions using a bio-electrochemical reactor with a packed-bed rotating cylinder cathode was investigated. The effects of applied voltage, initial nickel concentration, rotation speed of the cathode, and pH on the reaction rate constant (k) were studied. The results showed that cathodic deposition occurred under mass-transfer control for all values of the applied voltage used in this research. Accordingly, the relationship between concentration and time can be represented by a first-order equation. The rate constant was found to depend on the applied voltage, initial nickel concentration, pH, and rotation speed. It increased as the applied voltage increased and decreased as t
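A first-order rate law C(t) = C0·e^(−kt) implies that ln C falls linearly with time, so k can be estimated as minus the least-squares slope of ln C versus t. A minimal sketch using synthetic data (not the study's measurements):

```python
import math

def first_order_k(times, concs):
    """Estimate the first-order rate constant k as the negative
    least-squares slope of ln(C) versus t."""
    n = len(times)
    ys = [math.log(c) for c in concs]
    xbar = sum(times) / n
    ybar = sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(times, ys))
             / sum((x - xbar) ** 2 for x in times))
    return -slope

# Synthetic check: data generated with k = 0.05 recovers k = 0.05.
t = [0, 10, 20, 30, 40]
c = [100.0 * math.exp(-0.05 * ti) for ti in t]
print(first_order_k(t, c))  # ≈ 0.05
```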
A group acceptance sampling plan for testing products was designed for the case where the lifetime of an item follows a log-logistic distribution. The minimum number of groups (k) required for a given group size and acceptance number is determined when various values of the consumer's risk and test termination time are specified. All results for this sampling plan and the probability of acceptance are presented in tables.
The basic objective of the research is to study the quality of the water flow service in the Karbala Sewage Directorate and how to improve it, after identifying deviations in the processes and the final product and then providing possible solutions to address the causes of the deviations and the associated quality gaps. A number of quality tools were used and applied to data from stations whose areas and activities relate to the drainage of rainwater. The research population comprises the rainwater lifting stations of the Holy Karbala Sewage Directorate, and the Western station was chosen by purposive (non-random) sampling after meeting a number of. It is one of the largest and m
Background: The possibility of converting the organic fraction of municipal solid waste to mature compost using the composting-bin method was studied. Nine distinct treatments were created by combining municipal solid waste (MSW) with animal waste (3:1, 2:1), poultry manure (3:1, 2:1), mixed waste (2:1:1), agricultural waste (dry leaves), biocont (Trichoderma harzianum), and humic acid. Temperature, pH, EC, organic matter (OM %), and the C/N ratio were monitored weekly, and macronutrients (N, P, K) were measured. Trace elements, including heavy metals (Cd and Pb), were tested in the first and final weeks of maturity. Results: Temperatures in the first days of composting reached the thermophilic phase in MSW compost
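The C/N ratio of a waste blend follows from the mass-weighted carbon and nitrogen contents of its components. A minimal sketch (the percentages below are illustrative, not measured values from the study):

```python
def mix_cn_ratio(parts):
    """C/N ratio of a blend. parts = [(mass, %C, %N), ...] per component."""
    total_c = sum(m * c / 100.0 for m, c, n in parts)
    total_n = sum(m * n / 100.0 for m, c, n in parts)
    return total_c / total_n

# Illustrative 3:1 MSW:animal-waste mix with assumed %C and %N values.
print(mix_cn_ratio([(3, 40, 1), (1, 40, 4)]))  # ≈ 22.9
```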
Diabetes is one of the increasingly common chronic diseases, affecting millions of people around the world. Diagnosis, prediction, proper treatment, and management of diabetes are essential. Machine learning-based prediction techniques for diabetes data analysis can help in the early detection and prediction of the disease and its consequences, such as hypo-/hyperglycemia. In this paper, we explored a diabetes dataset collected from the medical records of one thousand Iraqi patients. We applied three classifiers: the multilayer perceptron, KNN, and random forest. We conducted two experiments: the first used all 12 features of the dataset, and the random forest outperformed the others with 98.8% accuracy. The second experiment used only five att
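Of the three classifiers, KNN is simple enough to sketch from scratch; the toy feature vectors and labels below are illustrative, not records from the Iraqi dataset:

```python
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among its k nearest training points
    (squared Euclidean distance)."""
    order = sorted(range(len(train_X)),
                   key=lambda i: sum((a - b) ** 2
                                     for a, b in zip(train_X[i], x)))
    votes = Counter(train_y[i] for i in order[:k])
    return votes.most_common(1)[0][0]

# Toy two-feature data with two well-separated classes.
X = [(0, 0), (0, 1), (1, 0), (5, 5), (6, 5), (5, 6)]
y = ["non-diabetic"] * 3 + ["diabetic"] * 3
print(knn_predict(X, y, (5, 5), k=3))  # → diabetic
```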
Shadow detection and removal is an important task when dealing with color outdoor images. Shadows are generated by a local and relative absence of light. Shadows are, first of all, a local decrease in the amount of light that reaches a surface. Secondly, they are a local change in the amount of light reflected by a surface toward the observer. Most shadow detection and segmentation methods are based on image analysis. However, some factors will affect the detection result due to the complexity of the circumstances. In this paper, a segmentation-based test method is presented to detect shadows in an image, and a function concept is used to remove the shadow from the image.
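The "local decrease in light" view suggests a minimal intensity-threshold sketch of shadow-candidate detection (a deliberately simplified stand-in, not the paper's method; practical detectors also use color and context):

```python
def shadow_mask(gray, threshold):
    """Mark pixels darker than threshold as shadow candidates (1) vs not (0).
    gray is a 2D list of intensity values."""
    return [[1 if p < threshold else 0 for p in row] for row in gray]

# Illustrative 2x3 grayscale image: dark pixels flagged as shadow.
image = [[200, 190, 60],
         [210, 55, 50]]
print(shadow_mask(image, 100))  # → [[0, 0, 1], [0, 1, 1]]
```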
The penalized least squares method is a popular approach for dealing with high-dimensional data, where the number of explanatory variables is larger than the sample size. Penalized least squares offers high prediction accuracy and performs estimation and variable selection at once. It yields a sparse model, that is, a model with few variables, which can therefore be interpreted easily. However, penalized least squares is not robust, meaning it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain a robust penalized least squares method, and thereby a robust penalized estimator and
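For an orthonormal design, the lasso-type penalized least squares estimator has a closed form: each OLS coefficient is soft-thresholded, which is what produces the sparse model described above (a standard textbook special case, sketched here for illustration):

```python
def soft_threshold(b_ols, lam):
    """Lasso coefficient under an orthonormal design: shrink the OLS
    coefficient b_ols toward zero by lam, setting it to 0 when
    |b_ols| <= lam. Coefficients forced to zero are the variables
    the penalty selects out of the model."""
    if b_ols > lam:
        return b_ols - lam
    if b_ols < -lam:
        return b_ols + lam
    return 0.0

print([soft_threshold(b, 1.0) for b in [2.5, 0.4, -0.9, -3.0]])
# → [1.5, 0.0, 0.0, -2.0]
```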
Compressing speech reduces data storage requirements, which in turn reduces the time needed to transmit digitized speech over long-haul links such as the Internet. To obtain the best performance in speech compression, wavelet transforms require filters that combine a number of desirable properties, such as orthogonality and symmetry. The MCT basis functions are derived from the GHM basis functions using 2D linear convolution. The fast computation algorithms introduced here add desirable features to the current transform. We further assess the performance of the MCT in a speech compression application. This paper discusses the effect of using the DWT and the MCT (one- and two-dimensional) on speech compression. DWT and MCT performances in terms of comp
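A one-level Haar DWT with hard thresholding of the detail coefficients illustrates the transform-compression idea (the Haar filter and the toy signal are illustrative; the paper's DWT/MCT filters differ):

```python
def haar_dwt(x):
    """One-level Haar transform: pairwise averages (approximation)
    and pairwise half-differences (detail). len(x) must be even."""
    approx = [(a + b) / 2 for a, b in zip(x[0::2], x[1::2])]
    detail = [(a - b) / 2 for a, b in zip(x[0::2], x[1::2])]
    return approx, detail

def haar_idwt(approx, detail):
    """Exact inverse: interleave a + d and a - d."""
    out = []
    for a, d in zip(approx, detail):
        out += [a + d, a - d]
    return out

def compress(x, threshold):
    """Zero out small detail coefficients; return the reconstruction
    and the fraction of coefficients kept."""
    approx, detail = haar_dwt(x)
    kept = [d if abs(d) >= threshold else 0.0 for d in detail]
    ratio = (len(approx) + sum(1 for d in kept if d != 0.0)) / len(x)
    return haar_idwt(approx, kept), ratio

x = [1.0, 3.0, 2.0, 2.0, 5.0, 1.0, 0.0, 4.0]
print(compress(x, 1.5))  # keeps 6 of 8 coefficients (ratio 0.75)
```

Dropping small detail coefficients trades a little reconstruction error for fewer stored values, which is the same accuracy-versus-compression trade-off evaluated for the DWT and MCT above.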