The performance of a gas-solid spouted bed benefits from a uniform solids structure, quantified by a uniformity index (UI). Therefore, the focus of this work is to maximize UI across the bed with respect to the process variables, so UI is taken as the objective function of the optimization. Three process variables affect the objective function. These decision variables are: gas velocity, particle density, and particle diameter. Steady-state solids concentration measurements were carried out in a narrow 3-inch cylindrical Plexiglas spouted bed with a 60° conical base. Radial concentrations of particles (glass and steel beads) at various bed heights and under different flow patterns were measured using optical probes. A stochastic Genetic Algorithm (GA) was found to handle variation of the process variables of the non-linear bed better than a deterministic search, as the spouted bed behaved as a hybrid system. The global GA could confirm the data and select the best operating conditions. The optimization technique can guide the experimental work and reduce the risk and cost of operation, and the optimum results can keep the bed operating at high-performance, stable conditions. Maximum uniformity was found at high density, small bead size, and low gas velocity. Solids density was an effective variable on UI, while gas velocity and particle diameter were observed to be the decision variables to which UI was most sensitive. Uniformity of the solid particles would enhance the hydrodynamic parameters and the heat and mass transfer in the bed by improving the solids hold-up and void distributions. The optimization results were compared with experimental data obtained using optical probes and a Computed Tomography technique.
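A minimal sketch of the kind of real-coded GA search described above, over the three decision variables (gas velocity, particle density, particle diameter). The objective `uniformity_index` below is a hypothetical placeholder, not the paper's UI model, and the search bounds are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed bounds for [gas velocity (m/s), particle density (kg/m3), particle diameter (mm)]
LOW  = np.array([0.5, 2000.0, 1.0])
HIGH = np.array([2.0, 8000.0, 5.0])

def uniformity_index(x):
    """Hypothetical UI surrogate: favours high density, small diameter, low velocity."""
    v, rho, d = x
    return ((rho - LOW[1]) / (HIGH[1] - LOW[1])
            - 0.5 * (d - LOW[2]) / (HIGH[2] - LOW[2])
            - 0.5 * (v - LOW[0]) / (HIGH[0] - LOW[0]))

def ga_maximize(f, pop_size=40, generations=100, mut_rate=0.2):
    pop = rng.uniform(LOW, HIGH, size=(pop_size, 3))
    for _ in range(generations):
        fit = np.array([f(ind) for ind in pop])
        # Tournament selection between random pairs
        idx = rng.integers(pop_size, size=(pop_size, 2))
        parents = pop[np.where(fit[idx[:, 0]] > fit[idx[:, 1]], idx[:, 0], idx[:, 1])]
        # Arithmetic crossover between consecutive parents
        alpha = rng.random((pop_size, 1))
        children = alpha * parents + (1 - alpha) * np.roll(parents, 1, axis=0)
        # Gaussian mutation, clipped back to the bounds
        mask = rng.random(children.shape) < mut_rate
        children += mask * rng.normal(0.0, 0.05, children.shape) * (HIGH - LOW)
        pop = np.clip(children, LOW, HIGH)
    fit = np.array([f(ind) for ind in pop])
    return pop[fit.argmax()], fit.max()

best_x, best_ui = ga_maximize(uniformity_index)
print("best (velocity, density, diameter):", best_x, "UI:", best_ui)
```

In practice the surrogate would be replaced by the experimental correlation or bed model that maps the three decision variables to the measured UI.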
The basic objective of the research is to study the quality of the water flow service in the Karbala Sewage Directorate and how to improve it, by identifying deviations in the processes and the final product and then proposing possible solutions that address the causes of the deviations and the associated quality gaps. A number of quality tools were used and applied to the data of stations whose areas and activities relate to the drainage of rainwater. The research community comprises the rainwater lifting stations of the Sewage Directorate of Holy Karbala, and the Western station was chosen to apply a purposive (non-random) sampling method after meeting a number of. It is one of the largest and m
This research focused on the effect of fire flames at different burning temperatures (300, 400 and 500)°C on the compressive strength of reactive powder concrete (RPC). The steady-state duration of the burning test was (60) min. Locally available materials were used to mix an RPC with a compressive strength of around (100) MPa. The tested specimens were reinforced with (3.0) cm hooked-end steel fibers of (1100) MPa yield strength. Three steel fiber volume fractions were adopted in this study (0, 1.0 and 1.5)% and two cooling processes were included, gradual and sudden. It was concluded that increasing the burning temperature decreases the residual compressive strength of RPC specimens of (0%) steel fiber volume fraction by (12.16, 19.46 & 24.49) and (18.20, 27.77 & 3
An ultrasonic treatment was applied to vacuum gas oil at intervals of 5 to 30 minutes at 70°C. In this work, the improvement of important properties of Iraqi vacuum gas oil, such as carbon residue, was studied under several operating conditions that affect the treatment efficiency, such as sonication time (5, 10, 15, 20, 25, and 30) min and power amplitude (10–50%). After ultrasonic treatment, the carbon residue of the vacuum gas oil was evaluated using a Conradson carbon residue meter (ASTM D189). The experiment revealed that the oil's carbon residue had decreased by 16%. As a consequence of the experiment, it was discovered that ultrasonic treatment might reduce the carbon residue and density of the oil samples studied. It also notice
Sustainable crop production in coarse-textured soil is challenging due to high water permeability and low soil water holding capacity. In this paper, subsurface water retention technology (SWRT), in the form of impermeable polyethylene membranes placed at a depth of 35 cm below the ground surface and within the root zone, was evaluated and compared with a control treatment (without the membranes) in terms of yield and water use efficiency of eggplant grown inside a greenhouse. The study was conducted in Al-Fahamah Township, Baghdad, Iraq during the 2017 spring growing season. Results demonstrated that the yield and water use efficiency were 3.483 kg/m2 and 5.653 kg/m3, respectively, for the SWRT treatment p
Diabetes is one of the increasingly common chronic diseases, affecting millions of people around the world. Diabetes diagnosis, prediction, proper treatment, and management are essential. Machine learning-based prediction techniques for diabetes data analysis can help in the early detection and prediction of the disease and its consequences, such as hypo-/hyperglycemia. In this paper, we explored a diabetes dataset collected from the medical records of one thousand Iraqi patients. We applied three classifiers: the multilayer perceptron, KNN, and Random Forest. We conducted two experiments: the first experiment used all 12 features of the dataset, in which Random Forest outperformed the others with 98.8% accuracy. The second experiment used only five att
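A sketch of the three-classifier comparison described above, assuming a CSV export of the patient records; the file name (`diabetes_iraq.csv`), label column (`CLASS`), and hyperparameters are illustrative assumptions, not the paper's actual setup.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

df = pd.read_csv("diabetes_iraq.csv")      # hypothetical file name
X = df.drop(columns=["CLASS"])             # assumed label column
y = df["CLASS"]

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

models = {
    "MLP": make_pipeline(StandardScaler(), MLPClassifier(max_iter=1000, random_state=42)),
    "KNN": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=42),
}

# Train each classifier and report held-out accuracy
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, "accuracy:", accuracy_score(y_te, model.predict(X_te)))
```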
Shadow detection and removal is an important task when dealing with color outdoor images. Shadows are generated by a local and relative absence of light. Shadows are, first of all, a local decrease in the amount of light that reaches a surface; secondly, they are a local change in the amount of light reflected by a surface toward the observer. Most shadow detection and segmentation methods are based on image analysis; however, some factors affect the detection result due to the complexity of the circumstances. In this paper, a segmentation-based method is presented to detect shadows in an image, and a function concept is used to remove the shadow from the image.
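A minimal sketch of the general idea, not the paper's exact method: a common baseline flags dark, relatively saturated pixels in HSV space as shadow candidates and then rescales their brightness toward the lit region. The input file name and thresholds are assumptions.

```python
import cv2
import numpy as np

img = cv2.imread("outdoor.jpg")                      # hypothetical input image
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV).astype(np.float32)
h, s, v = cv2.split(hsv)

# Shadow candidates: darker than average and relatively saturated
shadow_mask = ((v < v.mean() - 0.5 * v.std()) & (s > s.mean())).astype(np.uint8)
shadow_mask = cv2.morphologyEx(shadow_mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

# Simple removal: scale shadow-region brightness up to the lit-region mean
lit_mean = v[shadow_mask == 0].mean()
shadow_mean = v[shadow_mask == 1].mean()
v_corrected = np.where(shadow_mask == 1,
                       np.clip(v * lit_mean / shadow_mean, 0, 255), v)

out = cv2.cvtColor(cv2.merge([h, s, v_corrected]).astype(np.uint8), cv2.COLOR_HSV2BGR)
cv2.imwrite("outdoor_shadow_removed.jpg", out)
```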
The penalized least squares method is a popular approach for dealing with high-dimensional data, where the number of explanatory variables is larger than the sample size. Penalized least squares gives high prediction accuracy and performs estimation and variable selection at once; it yields a sparse model, that is, a model with few variables, which can be interpreted easily. However, penalized least squares is not robust, meaning it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain a robust penalized least squares method and a robust penalized estimator and
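An illustrative sketch of the idea (not the paper's specific estimator): replace the squared-error loss in an L1-penalized regression with a robust Huber loss, here via scikit-learn's `SGDRegressor`. The simulated data, outlier scheme, and tuning values are assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso, SGDRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n, p = 80, 200                                   # high-dimensional: p > n
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:5] = 3.0                                   # sparse true coefficients
y = X @ beta + rng.normal(scale=0.5, size=n)
y[:8] += 25.0                                    # inject outlying observations

# Ordinary L1-penalized least squares (not robust)
lasso = make_pipeline(StandardScaler(), Lasso(alpha=0.1))

# Robust penalized alternative: Huber loss with an L1 penalty
robust = make_pipeline(StandardScaler(),
                       SGDRegressor(loss="huber", epsilon=1.35,
                                    penalty="l1", alpha=0.01,
                                    max_iter=5000, random_state=0))

lasso.fit(X, y)
robust.fit(X, y)
print("nonzero coefficients (Lasso):      ", np.sum(lasso[-1].coef_ != 0))
print("nonzero coefficients (Huber + L1): ", np.sum(robust[-1].coef_ != 0))
```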
Compressing speech reduces data storage requirements, which in turn reduces the time needed to transmit digitized speech over long-haul links such as the internet. To obtain the best performance in speech compression, wavelet transforms require filters that combine a number of desirable properties, such as orthogonality and symmetry. The MCT basis functions are derived from the GHM basis functions using 2D linear convolution. The fast computation algorithms introduced here add desirable features to the transform. We further assess the performance of the MCT in a speech compression application. This paper discusses the effect of using the DWT and the MCT (one- and two-dimensional) on speech compression. DWT and MCT performances in terms of comp
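A sketch of the DWT side only (the MCT/GHM multiwavelet step is specific to the paper and not shown): decompose a speech signal, keep only the largest coefficients, reconstruct, and report the retained-coefficient ratio and reconstruction SNR. The input file name, wavelet, level, and keep fraction are assumptions.

```python
import numpy as np
import pywt
from scipy.io import wavfile

fs, speech = wavfile.read("speech.wav")          # hypothetical mono speech file
speech = speech.astype(np.float64)

coeffs = pywt.wavedec(speech, "db4", level=5)    # multilevel 1-D DWT
flat = np.concatenate(coeffs)
keep = 0.10                                      # keep the top 10% of coefficients
thresh = np.quantile(np.abs(flat), 1.0 - keep)
coeffs_c = [pywt.threshold(c, thresh, mode="hard") for c in coeffs]

rec = pywt.waverec(coeffs_c, "db4")[: len(speech)]
retained = sum(np.count_nonzero(c) for c in coeffs_c) / len(flat)
snr = 10 * np.log10(np.sum(speech ** 2) / np.sum((speech - rec) ** 2))
print(f"retained coefficients: {retained:.1%}, reconstruction SNR: {snr:.1f} dB")
```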
As we live in the era of the fourth technological revolution, it has become necessary to use artificial intelligence to generate electric power from sustainable solar energy, especially in Iraq, given the crises it has gone through and the severe shortage of electric power it suffers from because of wars and calamities. The impact of that period is still evident in all aspects of the daily life of Iraqis because of the remnants of wars, siege, terrorism, the wrong policies of earlier and later governments, and regional interventions and their consequences, such as the destruction of electric power stations and the population increase, which must be matched by an increase in electric power stations,
Neural cryptography deals with the problem of key exchange between two neural networks using the mutual learning concept. The two networks exchange their outputs (in bits), and the key shared between the two communicating parties is eventually represented in the final learned weights, at which point the two networks are said to be synchronized. The security of neural synchronization is put at risk if an attacker is capable of synchronizing with either of the two parties during the training process.
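A minimal sketch of mutual learning between two tree parity machines, the standard toy model for neural key exchange; the parameters (K, N, L) and the Hebbian update rule shown here are the commonly used textbook choices, not necessarily those of this work.

```python
import numpy as np

K, N, L = 3, 100, 3                  # hidden units, inputs per unit, weight bound
rng = np.random.default_rng(1)

class TPM:
    def __init__(self):
        self.w = rng.integers(-L, L + 1, size=(K, N))

    def output(self, x):
        self.sigma = np.sign(np.sum(self.w * x, axis=1))
        self.sigma[self.sigma == 0] = -1
        return int(np.prod(self.sigma))

    def update(self, x, tau):
        # Hebbian rule: adjust only hidden units that agree with the common output
        for k in range(K):
            if self.sigma[k] == tau:
                self.w[k] = np.clip(self.w[k] + tau * x[k], -L, L)

A, B = TPM(), TPM()
steps = 0
while not np.array_equal(A.w, B.w) and steps < 100_000:
    x = rng.choice([-1, 1], size=(K, N))        # public random input
    tau_a, tau_b = A.output(x), B.output(x)
    if tau_a == tau_b:                          # learn only when outputs agree
        A.update(x, tau_a)
        B.update(x, tau_b)
    steps += 1

print("synchronized after", steps, "inputs; the shared key is derived from the weights")
```

Because only the output bits are exchanged publicly, an eavesdropper who cannot synchronize its own network during training learns nothing about the final weight configuration that forms the key.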