In the present study, the removal of zinc from synthetic wastewater using the emulsion liquid membrane extraction technique was investigated. A synthetic surfactant solution was used as the emulsifying agent, diphenylthiocarbazone (dithizone), dissolved in carbon tetrachloride as the organic solvent, was used as the extracting agent, and sulfuric acid was used as the stripping agent. The parameters that influence the extraction percentage of Zn2+ were studied: the ratio of organic solvent volume to aqueous feed volume (0.5-4), the ratio of surfactant solution volume to aqueous feed volume (0.2-1.6), the pH of the aqueous feed solution (5-10), mixing intensity (100-1000 rpm), extracting agent concentration (20-400 ppm), surfactant concentration (0.2-2 wt.%), contact time (3-30 min), and strip phase concentration (0.25-2 M). It was found that 87.4% of the Zn2+ could be removed from the aqueous feed solution at the optimum operating conditions. Further studies were carried out on the extraction percentages of other toxic metal ions (As3+, Hg2+, Pb2+, Cd2+) using the same optimum conditions obtained for zinc ions, except for the pH of the feed solutions. The pH values giving the best extraction percentages for arsenic, lead, and cadmium were 1, 10, and 10, respectively. Maximum extraction percentages of 98.5%, 95.5%, and 93.8% were obtained for arsenic, lead, and cadmium, respectively, while mercury was completely removed from the aqueous feed solution within the acidic pH range.
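The abstract reports removal as an extraction percentage; the standard definition used in emulsion liquid membrane studies is given below as an assumption, since the abstract itself does not state the formula:

```latex
% Extraction (removal) percentage of a metal ion from the aqueous feed,
% where C_0 and C_f are the initial and final Zn^{2+} concentrations in the feed phase.
\[
  E\,(\%) \;=\; \frac{C_{0} - C_{f}}{C_{0}} \times 100
\]
```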
This study was conducted at the Poultry Research Station of the Animal Resources Department / Office of Agricultural Research / Ministry of Agriculture during the period from 4 April to 16 May 2021. The study aimed to investigate the effect of using avocado oil, chia oil, and their mixture in broiler diets on final productive performance and meat cholesterol concentration, and to measure meat oxidation indicators after storing the meat for 60 days. Three hundred one-day-old Ross 308 chicks were fed diets containing avocado oil or chia oil at levels of 0, 0.2, 0.4, and 0.6%, respectively, or their mixture at 0.0, 0.1, 0.2, and 0.3% each of avocado and chia oil (50% avocado + 50% chia oil). The experiment included 10 treatments.
This study applies non-parametric methods to estimate the conditional survival function using the Beran estimator with both Nadaraya-Watson and Priestley-Chao weights, based on interval-censored and right-censored breast cancer data for two types of treatment, chemotherapy and radiation therapy, with age treated as a continuous covariate. MATLAB was used for the computations, and the mean squared error (MSE) was used to compare the two weighting schemes. The results showed that the Nadaraya-Watson weights were superior in estimating the conditional survival function for both chemotherapy and radiation therapy.
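For reference, the Beran estimator of the conditional survival function with Nadaraya-Watson weights can be written as follows; the kernel K and bandwidth h are analyst choices that the abstract does not report:

```latex
% Beran estimator of S(t | x) with Nadaraya-Watson kernel weights.
% (T_i, delta_i, X_i): observed time, censoring indicator, covariate (age) for subject i.
\[
  W_{i}(x) \;=\; \frac{K\!\left(\frac{x - X_{i}}{h}\right)}{\sum_{j=1}^{n} K\!\left(\frac{x - X_{j}}{h}\right)},
  \qquad
  \hat{S}(t \mid x) \;=\; \prod_{i:\,T_{i}\le t,\ \delta_{i}=1}
  \left( 1 \;-\; \frac{W_{i}(x)}{\sum_{j:\,T_{j}\ge T_{i}} W_{j}(x)} \right).
\]
```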
This study is an approach to dividing the land area of Kirkuk city (a city in northern Iraq, 236 kilometers north of Baghdad and 83 kilometers south of Erbil [Climatic Atlas of Iraq, 1941-1970]) into multiple zones of different traffic noise pollution levels using a satellite image and ArcMap 10.3. A land-zoning process such as the one achieved in this paper will help, and is of high interest for, the future of Kirkuk city, especially its urban areas.
The aim of this paper is to present a numerical method for solving linear systems of Fredholm integral equations based on the Haar wavelet approach. Many test problems, for which the exact solution is known, are considered. The results of the suggested method are compared with those of another method (the trapezoidal method). The algorithm and program were written in MATLAB version 7.
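As a minimal sketch of the Haar wavelet collocation idea, the example below solves a single linear Fredholm integral equation of the second kind (a simplification of the system considered in the paper); the kernel, right-hand side, and resolution are hypothetical choices, not the authors' test problems:

```python
import numpy as np

def haar(i, x):
    """Value of the i-th Haar function (1-indexed) on [0, 1) at the points x."""
    x = np.asarray(x, dtype=float)
    if i == 1:                            # scaling (constant) function
        return np.ones_like(x)
    j = int(np.floor(np.log2(i - 1)))     # resolution level
    k = i - 1 - 2**j                      # translation index
    m = 2**j
    h = np.zeros_like(x)
    h[(x >= k / m) & (x < (k + 0.5) / m)] = 1.0
    h[(x >= (k + 0.5) / m) & (x < (k + 1) / m)] = -1.0
    return h

# Hypothetical test equation u(x) - \int_0^1 x*t*u(t) dt = 2x/3, exact solution u(x) = x.
K = lambda x, t: x * t
f = lambda x: 2.0 * x / 3.0

N = 32                                          # number of Haar functions (a power of 2)
xc = (np.arange(1, N + 1) - 0.5) / N            # collocation points
tq = (np.arange(1, 4 * N + 1) - 0.5) / (4 * N)  # midpoint-rule quadrature points
wq = 1.0 / (4 * N)

H = np.column_stack([haar(i, xc) for i in range(1, N + 1)])       # H[l, i-1] = h_i(x_l)
G = np.array([[np.sum(K(xl, tq) * haar(i, tq)) * wq               # ~ \int_0^1 K(x_l,t) h_i(t) dt
               for i in range(1, N + 1)] for xl in xc])

c = np.linalg.solve(H - G, f(xc))               # collocation system (H - G) c = f
u_approx = H @ c                                # approximate solution at the collocation points
print("max abs error vs exact u(x) = x:", np.max(np.abs(u_approx - xc)))
```

The error shrinks as N grows, reflecting the piecewise-constant nature of the Haar approximation.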
This work addressed the assignment problem (AP) based on fuzzy costs, where the objective of this study is to minimize the cost. A triangular or trapezoidal fuzzy number was assigned to each fuzzy cost. In addition, the assignment models were applied to linguistic variables, which were first converted to quantitative fuzzy data using Yager's ranking method. The results show that the quantitative data have a considerable effect when used in fuzzy mathematical models.
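A minimal sketch of this workflow, assuming the common closed form of Yager's ranking index for a triangular fuzzy number (a, b, c), namely (a + 2b + c)/4, and a small hypothetical cost matrix; the defuzzified assignment problem is then solved with the Hungarian method:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def yager_rank_triangular(a, b, c):
    """Yager's ranking index for a triangular fuzzy number (a, b, c)."""
    return (a + 2.0 * b + c) / 4.0

# Hypothetical 3x3 fuzzy cost matrix: each cost is a triangular fuzzy number (a, b, c).
fuzzy_costs = [
    [(2, 4, 6), (3, 5, 8), (1, 2, 4)],
    [(4, 6, 9), (1, 3, 5), (2, 5, 7)],
    [(3, 4, 6), (2, 4, 7), (5, 7, 9)],
]

# Defuzzify each cell, then solve the crisp assignment problem (minimize total cost).
crisp = np.array([[yager_rank_triangular(*cell) for cell in row] for row in fuzzy_costs])
rows, cols = linear_sum_assignment(crisp)

for r, c in zip(rows, cols):
    print(f"worker {r} -> job {c}, crisp cost {crisp[r, c]:.2f}")
print("total crisp cost:", crisp[rows, cols].sum())
```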
Diabetes is one of the most rapidly increasing chronic diseases, affecting millions of people around the world. Diabetes diagnosis, prediction, proper treatment, and management are essential. Machine learning-based prediction techniques for diabetes data analysis can help in the early detection and prediction of the disease and of its consequences, such as hypo/hyperglycemia. In this paper, we explored a diabetes dataset collected from the medical records of one thousand Iraqi patients. We applied three classifiers: the multilayer perceptron, the k-nearest neighbours (KNN) classifier, and the random forest. We carried out two experiments: the first experiment used all 12 features of the dataset, and the random forest outperformed the others with 98.8% accuracy. The second experiment used only five attributes.
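The Iraqi patient dataset itself is not included here; the sketch below shows the general comparison pipeline with the three classifiers named in the abstract, using a placeholder CSV path and a target column name that are assumptions:

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

# Placeholder file and target column; the real dataset layout may differ.
df = pd.read_csv("diabetes_iraqi_patients.csv")
X = df.drop(columns=["CLASS"])          # clinical features (12 in the first experiment)
y = df["CLASS"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)

scaler = StandardScaler().fit(X_train)
X_train_s, X_test_s = scaler.transform(X_train), scaler.transform(X_test)

models = {
    "MLP": MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=42),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "RandomForest": RandomForestClassifier(n_estimators=200, random_state=42),
}
for name, model in models.items():
    model.fit(X_train_s, y_train)
    acc = accuracy_score(y_test, model.predict(X_test_s))
    print(f"{name}: accuracy = {acc:.3f}")
```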
Shadow detection and removal is an important task when dealing with color outdoor images. Shadows are generated by a local and relative absence of light. Shadows are, first of all, a local decrease in the amount of light that reaches a surface; secondly, they are a local change in the amount of light reflected by a surface toward the observer. Most shadow detection and segmentation methods are based on image analysis; however, some factors affect the detection result because of the complexity of the circumstances. In this paper, a segmentation-based method is presented to detect shadows in an image, and a function concept is used to remove the shadow from the image.
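The abstract does not spell out the segmentation rule, so purely as a generic illustration (not the authors' method), the sketch below flags low-luminance pixels as candidate shadow regions and rescales their intensity; the threshold factor, gain, and placeholder image are assumed values:

```python
import numpy as np

def detect_and_lighten_shadows(rgb, thresh_factor=0.6, gain=1.6):
    """Generic illustration: mask pixels whose luminance is well below the image mean
    and brighten them.  `rgb` is an HxWx3 uint8 array; the threshold factor and gain
    are arbitrary assumed values, not parameters from the paper."""
    img = rgb.astype(np.float32)
    # Approximate luminance from the RGB channels.
    lum = 0.299 * img[..., 0] + 0.587 * img[..., 1] + 0.114 * img[..., 2]
    shadow_mask = lum < thresh_factor * lum.mean()                 # candidate shadow pixels
    out = img.copy()
    out[shadow_mask] = np.clip(out[shadow_mask] * gain, 0, 255)    # simple removal step
    return shadow_mask, out.astype(np.uint8)

# Usage on a random placeholder image (replace with a real outdoor photograph).
demo = (np.random.rand(120, 160, 3) * 255).astype(np.uint8)
mask, corrected = detect_and_lighten_shadows(demo)
print("shadow pixels flagged:", int(mask.sum()))
```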
The penalized least squares method is a popular method for dealing with high-dimensional data, where the number of explanatory variables is larger than the sample size. The properties of the penalized least squares method are high prediction accuracy and the ability to perform estimation and variable selection at the same time. The penalized least squares method gives a sparse model, that is, a model with few variables that can be interpreted easily. However, penalized least squares is not robust, meaning it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain a robust penalized least squares method and a robust penalized estimator.
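In symbols, the penalized least squares estimator described above (with, for example, an L1 penalty as in the lasso) and its robust counterpart, obtained by replacing the squared loss with a robust loss ρ such as Huber's, can be written as:

```latex
% Penalized least squares (lasso-type penalty) and its robust counterpart with loss rho.
\[
  \hat{\beta}_{\mathrm{PLS}}
  = \arg\min_{\beta}\; \sum_{i=1}^{n}\bigl(y_{i}-x_{i}^{\top}\beta\bigr)^{2}
    + \lambda \sum_{j=1}^{p}\lvert \beta_{j}\rvert,
  \qquad
  \hat{\beta}_{\mathrm{RPLS}}
  = \arg\min_{\beta}\; \sum_{i=1}^{n}\rho\!\bigl(y_{i}-x_{i}^{\top}\beta\bigr)
    + \lambda \sum_{j=1}^{p}\lvert \beta_{j}\rvert.
\]
```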
Compressing speech reduces data storage requirements, which in turn reduces the time needed to transmit digitized speech over long-haul links such as the Internet. To obtain the best performance in speech compression, wavelet transforms require filters that combine a number of desirable properties, such as orthogonality and symmetry. The MCT basis functions are derived from the GHM basis functions using 2D linear convolution. The fast computation algorithms introduced here add desirable features to the current transform. We further assess the performance of the MCT in a speech compression application. This paper discusses the effect of using the DWT and the MCT (one- and two-dimensional) on speech compression and evaluates their performance in terms of compression.
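The MCT construction from GHM bases is specific to this work and not available in standard libraries, so the sketch below illustrates only the DWT side of the comparison: decompose a speech frame, keep the largest coefficients, and reconstruct. The wavelet, decomposition level, retention fraction, and synthetic test frame are assumed values:

```python
import numpy as np
import pywt

def dwt_compress(signal, wavelet="db4", level=4, keep=0.10):
    """Keep only the largest `keep` fraction of DWT coefficients and reconstruct."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    flat = np.concatenate(coeffs)
    thresh = np.quantile(np.abs(flat), 1.0 - keep)                 # magnitude threshold
    kept = [np.where(np.abs(c) >= thresh, c, 0.0) for c in coeffs]
    recon = pywt.waverec(kept, wavelet)[: len(signal)]
    nonzero = sum(int(np.count_nonzero(c)) for c in kept)
    cr = flat.size / max(nonzero, 1)                               # crude compression ratio
    snr = 10 * np.log10(np.sum(signal**2) / np.sum((signal - recon)**2))
    return recon, cr, snr

# Placeholder "speech" frame: a short synthetic signal instead of a real recording.
t = np.linspace(0, 1, 8000, endpoint=False)
frame = np.sin(2 * np.pi * 180 * t) + 0.3 * np.sin(2 * np.pi * 900 * t)
_, cr, snr = dwt_compress(frame)
print(f"compression ratio ~ {cr:.1f}, SNR ~ {snr:.1f} dB")
```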