The need to create an optimal water quality management process has motivated researchers to develop prediction models. One of the most widely used forecasting models is the seasonal autoregressive integrated moving average (SARIMA) model. In the present study, a SARIMA model was developed in R to fit time-series data of monthly fluoride content collected from six stations on the Tigris River for the period 2004 to 2014. The most adequate SARIMA model, having the lowest Akaike information criterion (AIC) and mean squared error (MSE), was found to be SARIMA (2,0,0)(0,1,1). The model parameters were identified and diagnosed to derive the forecasting equations at each selected location. The correlation coefficients between the actual and predicted fluoride concentrations at the six locations, the Al-Karakh, East Tigris, Al-Wathbah, Al-Karamah, Al-Rashid and Al-Wahda WTP intakes, were 0.93, 0.82, 0.86, 0.90, 0.83 and 0.89, respectively. Model verification indicated that the model's forecasts reasonably estimated the actual monthly fluoride content at the selected locations.
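As a rough illustration of the model structure named above (a Python sketch, not the authors' R code), a SARIMA(2,0,0)(0,1,1) fit combines seasonal differencing at lag 12 with a non-seasonal AR(2) part. The sketch below applies the seasonal differencing and fits the AR(2) coefficients by ordinary least squares, omitting the seasonal MA(1) term for brevity; the synthetic series and the simplified AIC formula are assumptions for illustration only.

```python
import numpy as np

def fit_ar2_on_seasonal_diff(y, s=12):
    """Seasonally difference y at lag s (the D=1 part of SARIMA),
    then fit an AR(2) model by ordinary least squares.
    Returns the AR coefficients, residual MSE, and a least-squares AIC."""
    d = y[s:] - y[:-s]                      # seasonal differencing, lag s
    X = np.column_stack([d[1:-1], d[:-2]])  # lag-1 and lag-2 predictors
    target = d[2:]
    coef, *_ = np.linalg.lstsq(X, target, rcond=None)
    resid = target - X @ coef
    mse = float(np.mean(resid ** 2))
    n, k = len(target), 2
    aic = n * np.log(mse) + 2 * k           # common least-squares form of AIC
    return coef, mse, aic

# synthetic monthly series with a 12-month seasonal cycle (illustrative only)
rng = np.random.default_rng(0)
t = np.arange(120)
y = 0.3 + 0.05 * np.sin(2 * np.pi * t / 12) + 0.01 * rng.standard_normal(120)
coef, mse, aic = fit_ar2_on_seasonal_diff(y)
```

In practice, candidate (p,d,q)(P,D,Q) orders are compared on AIC and MSE exactly as the abstract describes, and the order with the lowest values is retained.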
The study aims at building a mathematical model for aggregate production planning at the Baghdad Soft Drinks Company. It is based on a set of aggregate planning strategies (control of working hours, and a storage-level control strategy) for exploiting the available resources and productive capacities optimally and minimizing production costs, using Matlab. The most important finding of the research is the value of exploiting the production capacity available in the months when demand is less than the available capacity, storing the surplus for the subsequent months when demand exceeds the available capacity, and thereby minimizing the use of overtime…
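The storage-level strategy described above can be sketched as follows (a minimal Python illustration, not the company's Matlab model; all demand, capacity, and cost figures are hypothetical): run at regular capacity, carry surplus from low-demand months as inventory, and resort to overtime only when inventory is exhausted.

```python
def plan_production(demand, capacity, overtime_cost, holding_cost):
    """Simple storage-level strategy: produce at full regular capacity
    every month, carry surplus as inventory, and meet any shortfall
    with overtime. Returns total overtime units and the total
    holding + overtime cost."""
    inventory = 0
    total_overtime = 0
    cost = 0.0
    for d in demand:
        available = capacity + inventory
        if d <= available:
            inventory = available - d        # store the surplus
        else:
            overtime = d - available         # shortfall covered by overtime
            total_overtime += overtime
            inventory = 0
            cost += overtime * overtime_cost
        cost += inventory * holding_cost     # end-of-month holding cost
    return total_overtime, cost

# hypothetical monthly demand against a fixed regular capacity of 100 units
ot, c = plan_production([80, 90, 120, 130], capacity=100,
                        overtime_cost=5.0, holding_cost=1.0)
# → (20, 160.0): early surpluses absorb most of the later peak demand
```

A full aggregate-planning model would instead pose this as an optimization (e.g. a linear program over production, inventory, and overtime variables); the greedy loop above only illustrates the store-then-draw-down idea.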
A new algorithm is proposed to compress speech signals using the wavelet transform and linear predictive coding (LPC). Compression is based on selecting a small number of approximation coefficients produced by wavelet decomposition (Haar and db4) at a suitably chosen level, while the detail coefficients are discarded; the approximation coefficients are then windowed by a rectangular window and fed to the linear predictor. The Levinson-Durbin algorithm is used to compute the LP coefficients, reflection coefficients, and prediction error. The compressed files contain the LP coefficients and the previous sample, and are very small compared to the original signals. The compression ratio is calculated from the size of the…
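The Levinson-Durbin recursion named above can be sketched as follows (a generic textbook implementation, not the authors' code): it solves the linear-prediction normal equations from the autocorrelation sequence, yielding exactly the three quantities the abstract stores — LP coefficients, reflection coefficients, and the final prediction error.

```python
def levinson_durbin(r, order):
    """Levinson-Durbin recursion.
    r: autocorrelation values r[0..order].
    Returns (a, k, err): LP coefficients a (x[n] ≈ sum a[m] x[n-1-m]),
    reflection coefficients k, and the final prediction error energy."""
    a = [0.0] * order
    k = [0.0] * order
    err = r[0]
    for i in range(order):
        # partial correlation of the next lag, given the current predictor
        acc = r[i + 1]
        for j in range(i):
            acc -= a[j] * r[i - j]
        ki = acc / err
        k[i] = ki
        # update predictor coefficients using the previous-order solution
        a_prev = a[:]
        a[i] = ki
        for j in range(i):
            a[j] = a_prev[j] - ki * a_prev[i - 1 - j]
        err *= 1.0 - ki * ki                 # prediction error shrinks each stage
    return a, k, err

# ideal AR(1) autocorrelation with coefficient 0.5: r[m] = 0.5**m
a, k, err = levinson_durbin([1.0, 0.5, 0.25], order=2)
# a[0] ≈ 0.5, a[1] ≈ 0.0 — the recursion recovers the AR(1) structure
```

The recursion runs in O(order²) time rather than the O(order³) of a general linear solver, which is why it is standard in LPC speech coders.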
Many academics have concentrated on applying machine learning to retrieve information from databases and enable researchers to perform better. A difficult issue in prediction modeling is the selection of practical strategies that yield satisfactory forecast accuracy. Traditional software testing techniques have been extended to testing machine learning systems; however, they are insufficient for the latter because of the diversity of problems that machine learning systems create. Hence, the proposed methodologies were used to predict flight prices. A variety of artificial intelligence algorithms are used to attain the required accuracy, such as Bayesian modeling techniques, Stochastic Gradient Descent (SGD), Adaptive Boosting (ADA), Deci…
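One of the techniques named above, stochastic gradient descent for a linear price model, can be sketched as follows (synthetic toy data, not the study's dataset or pipeline; the single "number of stops" feature and all hyperparameters are assumptions for illustration):

```python
import random

def sgd_linear_fit(X, y, lr=0.01, epochs=2000, seed=0):
    """Plain stochastic gradient descent for least-squares linear
    regression: one (sample, target) pair per parameter update."""
    rng = random.Random(seed)
    n_features = len(X[0])
    w = [0.0] * n_features
    b = 0.0
    idx = list(range(len(X)))
    for _ in range(epochs):
        rng.shuffle(idx)                     # visit samples in random order
        for i in idx:
            pred = b + sum(wj * xj for wj, xj in zip(w, X[i]))
            err = pred - y[i]
            b -= lr * err                    # gradient step on the bias
            w = [wj - lr * err * xj for wj, xj in zip(w, X[i])]
    return w, b

# toy "flight price" data: price = 100 + 50 * stops (illustrative only)
X = [[0.0], [1.0], [2.0], [3.0]]
y = [100.0, 150.0, 200.0, 250.0]
w, b = sgd_linear_fit(X, y)
# w[0] ≈ 50.0, b ≈ 100.0 on this exactly linear toy data
```

A real price predictor would use many encoded features (airline, route, date) and would typically compare several of the listed algorithms on held-out forecast accuracy.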
Three-dimensional (3D) image and medical image processing, which are considered big-data analysis tasks, have attracted significant attention during the last few years. Efficient 3D object recognition techniques could therefore benefit such image and medical image processing. However, to date, most of the proposed methods for 3D object recognition face major challenges of high computational complexity, because complexity and execution time grow as the dimensions of the object increase, which is precisely the case in 3D object recognition. Therefore, finding an efficient method that obtains high recognition accuracy with low computational complexity is essential…
This research is a pragmatic study of political blame in the British and Iraqi Parliaments. It aims to unfold the similarities and/or differences in the pragmatic and pragma-rhetorical strategies used by British and Iraqi politicians when they exchange blame in both offensive and defensive situations. A statistical analysis is conducted to quantitatively support the findings of the pragmatic analysis. The analyses yielded several results, among them that blame is a process composed of two stages, each distinct in its pragmatic components and pragma-rhetorical strategies. At the blame stage, British and Iraqi MPs tend to utilize impoliteness as their main strategy. However, British and Iraqi MPs perform differently at the…
A simple, accurate and rapid method for the separation and determination of the most commonly used insecticides in Iraq [thiamethoxam (Thi), imidacloprid (Imi), indoxacarb (Ind), and abamectin (Aba)] is presented. The separation was performed by gradient reversed-phase high performance liquid chromatography on a C18 stationary-phase column. The method was developed and validated. The mobile phase was a mixture of acetonitrile and water using gradient flow, at a flow rate of 1.0 mL min⁻¹. The optimum separation temperature was 25 ºC. Detection was performed at multiple wavelengths. The analysis time was up to 10.5 minutes, with retention times of 3.221, 3.854, 6.385, and 9.452 min for the studied insecticides. The linearity was in the range of 0.…