Indirect electrochemical oxidation of phenol and its derivatives was investigated using a MnO2 rotating cylinder electrode. The Taguchi experimental design method was employed to find the best conditions for removing phenol, its derivatives, and the other organics generated during the process. Two main parameters were investigated: current density (C.D.) and electrolysis time. Removal efficiency was taken as the response for the removal of phenol and other organics. An L16 orthogonal array, the signal-to-noise (S/N) ratio, and analysis of variance were used to test the effect of the designated process factors and their levels on the removal performance. The results showed that current density has the stronger influence on organics removal, while electrolysis time has the weaker impact. Multiple regression was used to obtain an equation describing the process; the predicted equation has a correlation coefficient (R2) of 98.77%. The best conditions for high removal efficiency were identified: removal efficiency above 95% can be obtained at a C.D. of 96-100 mA/cm2 and an electrolysis time of 3.2 to 5 h. The behavior of the chemical oxygen demand (COD) mineralization indicates a zero-order reaction, with the reaction rate controlled by the active chlorine reaction rather than by mass transfer of phenol towards the anode.
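As an illustration of the Taguchi "larger-the-better" signal-to-noise ratio used to rank factor levels in such a design, the following minimal Python sketch computes S/N = -10*log10((1/n)*sum(1/y_i^2)) for a few hypothetical removal-efficiency responses; the run names and values are placeholders, not the experimental data of the study.

```python
import numpy as np

# Hypothetical removal-efficiency responses (%) for a few runs of an L16 design;
# the real experimental values are reported in the paper, not here.
runs = {
    "run_1": [82.4, 83.1],   # replicate measurements per run (assumed)
    "run_2": [91.0, 90.2],
    "run_3": [96.5, 95.8],
}

def sn_larger_is_better(y):
    """Taguchi signal-to-noise ratio for a 'larger-the-better' response:
    S/N = -10 * log10( (1/n) * sum(1 / y_i^2) )."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

for name, y in runs.items():
    print(f"{name}: S/N = {sn_larger_is_better(y):.2f} dB")
```

Runs with a higher S/N ratio correspond to factor-level combinations giving higher and more consistent removal efficiency.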
In cognitive radio networks there are two important probabilities. The first, the probability of detection, matters to primary users because it indicates their level of protection from secondary users; the second, the probability of false alarm, matters to secondary users because it determines their use of unoccupied channels. Cooperative sensing can improve both probabilities. A new approach for determining optimal values of these probabilities is proposed for multiple secondary users: an optimal threshold is found for each user's detection curve, and the thresholds are then jointly optimized. To get the aggregated throughput over transmission …
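The abstract does not fix a particular fusion rule; as a minimal sketch of how cooperation changes the two probabilities, the following assumes a simple OR-rule fusion of independent secondary users with hypothetical per-user values.

```python
import numpy as np

# Hypothetical per-user detection / false-alarm probabilities for illustration;
# the paper derives these from each user's own detection curve and threshold.
p_d = np.array([0.80, 0.85, 0.90])   # per-secondary-user probability of detection
p_fa = np.array([0.05, 0.08, 0.10])  # per-secondary-user probability of false alarm

def or_rule_fusion(p):
    """Cooperative (OR-rule) fusion: the fusion centre declares the channel
    occupied if at least one secondary user reports a detection."""
    return 1.0 - np.prod(1.0 - np.asarray(p))

print("Cooperative P_d :", or_rule_fusion(p_d))   # higher than any single user
print("Cooperative P_fa:", or_rule_fusion(p_fa))  # also rises, hence the threshold trade-off
```

The trade-off visible here (both probabilities grow under the OR rule) is what motivates choosing per-user thresholds jointly rather than independently.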
Industrial effluents loaded with heavy metals pose hazards to humans and other forms of life. Conventional approaches such as electroplating, ion exchange, and membrane processes are used for the removal of copper, cadmium, and lead, but they are often cost-prohibitive and show low efficiency at low metal-ion concentrations. Biosorption is an alternative that has proven more efficient and economical for removing these metal ions. Biosorbents used include fungi, yeasts, oil palm shells, coir pith carbon, peanut husks, and olive pulp. Recently, low-cost natural products have also been researched as biosorbents. This paper investigates the potential use of Iraqi date pits and Al-Khriet (i.e. substances l…
In this research, an Artificial Neural Network (ANN) technique was applied to study the filtration process in water treatment. Eight models were developed and tested using data from a pilot filtration plant operating under different process design criteria: influent turbidity, bed depth, grain size, filtration rate, and running time (length of the filtration run), with effluent turbidity and head losses recorded. The ANN models were constructed to predict different performance criteria of the filtration process: effluent turbidity, head losses, and running time. The results indicate that it is quite possible to use artificial neural networks to predict effluent turbidity, head losses, and running time in the filtration process, wi…
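A minimal sketch of such an ANN predictor, assuming a small scikit-learn multi-layer perceptron and synthetic stand-in data with the same five inputs; the architecture, hyper-parameters, and data-generating formula below are illustrative, not the models developed in the study.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in data: columns mimic the reported inputs
# (influent turbidity, bed depth, grain size, filtration rate, running time);
# the real pilot-plant measurements are not reproduced here.
rng = np.random.default_rng(0)
X = rng.uniform([5, 0.5, 0.4, 5, 0], [50, 1.5, 1.2, 15, 24], size=(200, 5))
y = 0.1 * X[:, 0] - 2.0 * X[:, 1] + 3.0 * X[:, 2] + 0.05 * X[:, 4] + rng.normal(0, 0.2, 200)

# One hidden layer; this configuration is an assumption, not the paper's.
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0))
model.fit(X, y)
print("Predicted effluent turbidity for first 3 samples:", model.predict(X[:3]))
```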
In this paper, a new method is investigated that uses evolutionary algorithms (EAs) to cryptanalyze one of the nonlinear stream cipher cryptosystems based on the Linear Feedback Shift Register (LFSR) unit, via a ciphertext-only attack. A Genetic Algorithm (GA) and Ant Colony Optimization (ACO) are used to attack the nonlinear cryptosystem known as the "shrinking generator", with different ciphertext lengths and different lengths of the combined LFSRs. GA and ACO showed good performance in finding the initial values of the combined LFSRs. This work can be considered a warning for stream cipher designers to avoid the weak points, which may be f…
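For context, a minimal sketch of the shrinking generator keystream construction (the target of the attack, not the GA/ACO attack itself); the LFSR lengths, taps, and initial states below are arbitrary examples, not those attacked in the paper.

```python
# Minimal sketch of the shrinking generator, assuming two Fibonacci LFSRs:
# a data register A and a selection register S with example tap positions.
def lfsr(state, taps):
    """Fibonacci LFSR: output the last bit, feed back the XOR of the tapped bits."""
    state = list(state)
    while True:
        out = state[-1]
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]
        yield out

def shrinking_generator(a_state, a_taps, s_state, s_taps, n_bits):
    """Keep A's output bit only when the selection LFSR S outputs 1."""
    A, S = lfsr(a_state, a_taps), lfsr(s_state, s_taps)
    keystream = []
    while len(keystream) < n_bits:
        a, s = next(A), next(S)
        if s == 1:
            keystream.append(a)
    return keystream

print(shrinking_generator([1, 0, 1, 1, 0], [2, 4], [1, 1, 0, 1], [0, 3], 16))
```

An EA-based ciphertext-only attack would search over candidate initial states and score them by how well the resulting keystream statistics match the observed ciphertext.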
The area of character recognition has received considerable attention from researchers all over the world during the last three decades. This research explores the best sets of feature extraction techniques and studies the accuracy of well-known classifiers for Arabic numerals using statistical approaches in two methods, with a comparative study between them. The first method, a linear discriminant function, yields accuracy as high as 90% of the original grouped cases correctly classified. In the second method, a new algorithm is proposed; the results show the efficiency of the proposed algorithm, which achieves recognition accuracies of 92.9% and 91.4%, providing higher efficiency than the first method.
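A minimal sketch of the first method's idea, a linear discriminant classifier, using scikit-learn's built-in 8x8 digits as stand-in data rather than the Arabic numeral set studied in the paper.

```python
from sklearn.datasets import load_digits
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

# Stand-in data: scikit-learn's 8x8 digit images, not the paper's Arabic numerals.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Linear discriminant function, analogous to the paper's first method.
lda = LinearDiscriminantAnalysis()
lda.fit(X_train, y_train)
print("Correctly classified:", f"{100 * lda.score(X_test, y_test):.1f}%")
```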
Sound forecasts are essential elements of planning, especially for dealing with seasonality, sudden changes in demand levels, strikes, large fluctuations in the economy, and competitive price-cutting manoeuvres. Forecasting can help decision makers manage these problems by identifying which technologies are appropriate for their needs. The proposed forecasting model extracts the trend and cyclical components individually by developing the Hodrick–Prescott filter technique. Fitted models of these two real components are then estimated to predict the future behaviour of the electricity peak load. Accordingly, the optimal model fitting the periodic component is estimated using spectrum analysis and a Fourier model.
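A minimal sketch of the trend/cycle extraction step, assuming the standard Hodrick–Prescott filter from statsmodels applied to a synthetic monthly peak-load series, followed by a simple periodogram to locate the dominant cycle; the developed filter variant, the Fourier fit, and the real electricity data are in the paper, not here.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.filters.hp_filter import hpfilter

# Synthetic monthly peak-load series (trend + 12-month cycle + noise) as stand-in data.
rng = np.random.default_rng(1)
t = np.arange(120)
load = 500 + 2.0 * t + 40 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 5, t.size)
series = pd.Series(load)

# Standard HP decomposition into cyclical and trend components
# (lambda = 14400 is a common choice for monthly data; an assumption here).
cycle, trend = hpfilter(series, lamb=14400)

# Locate the dominant period of the cyclical component with a simple periodogram.
spectrum = np.abs(np.fft.rfft(cycle - cycle.mean()))**2
freqs = np.fft.rfftfreq(cycle.size, d=1.0)
peak = freqs[1:][np.argmax(spectrum[1:])]
print("Dominant cycle length (months):", round(1.0 / peak, 1))
```

The trend component would then be extrapolated separately, and the cyclical component fitted with a periodic (Fourier-type) model, before recombining the two forecasts.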
In this paper, we used four classification methods to classify objects and compared these methods: K-Nearest Neighbours (KNN), Stochastic Gradient Descent learning (SGD), Logistic Regression (LR), and Multi-Layer Perceptron (MLP). We used the MCOCO dataset for object classification and detection; the dataset images were randomly divided into training and testing sets at a ratio of 7:3, respectively. The randomly selected training and testing images were converted from colour to gray level, enhanced using the histogram equalization method, and resized to 20 x 20. Principal component analysis (PCA) was used for feature extraction, and finally the four classification methods were applied…
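A minimal sketch of the comparison, assuming scikit-learn implementations of the four classifiers with PCA feature extraction and a 7:3 split; scikit-learn's digits dataset stands in for the MCOCO images (already gray and fixed-size), and the number of PCA components is an assumption.

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression, SGDClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Stand-in data: the paper's preprocessing (gray level, histogram equalization,
# 20x20 resize) is assumed to have produced comparable fixed-size feature vectors.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)  # 7:3 split

classifiers = {
    "KNN": KNeighborsClassifier(),
    "SGD": SGDClassifier(random_state=0),
    "LR":  LogisticRegression(max_iter=2000),
    "MLP": MLPClassifier(max_iter=2000, random_state=0),
}

for name, clf in classifiers.items():
    # PCA feature extraction before each classifier; 30 components is an assumption.
    pipe = make_pipeline(StandardScaler(), PCA(n_components=30), clf)
    pipe.fit(X_train, y_train)
    print(f"{name}: accuracy = {pipe.score(X_test, y_test):.3f}")
```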
Aspect-Oriented Software Development (AOSD) is a technology that helps achieve better Separation of Concerns (SOC) by providing mechanisms to identify all relevant points in a program at which aspectual adaptations need to take place. This paper introduces a banking application that uses AOSD with a security concern for information hiding.
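Aspects in such an application would normally be written in an AOP language such as AspectJ; as a language-neutral sketch, the cross-cutting security concern is illustrated below with a Python decorator playing the role of "around" advice woven onto hypothetical banking operations. All function and account names here are illustrative, not taken from the paper.

```python
import functools

def security_aspect(func):
    """Stand-in for an aspect: the authorization check is a cross-cutting concern
    applied around every join point (banking operation) without touching the
    core banking code."""
    @functools.wraps(func)
    def advice(account, *args, **kwargs):          # plays the role of 'around' advice
        if not account.get("authenticated", False):
            raise PermissionError("access denied: caller not authenticated")
        return func(account, *args, **kwargs)
    return advice

@security_aspect
def withdraw(account, amount):
    account["balance"] -= amount
    return account["balance"]

@security_aspect
def deposit(account, amount):
    account["balance"] += amount
    return account["balance"]

acct = {"balance": 100.0, "authenticated": True}
print(deposit(acct, 50))   # 150.0
print(withdraw(acct, 30))  # 120.0
```

The point of the sketch is that the security check lives in one place (the aspect) rather than being scattered through every banking operation, which is the separation of concerns AOSD aims for.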