Simulation experiments are a problem-solving tool in many fields: the process of designing a model of a real system in order to follow it and identify its behavior through models and formulas implemented as a program repeated over a number of iterations. The aim of this study is to build a model for behavior that exhibits heteroskedasticity by studying the APGARCH and NAGARCH models with Gaussian and non-Gaussian distributions for different sample sizes (500, 1000, 1500, 2000) through the stages of time series analysis (identification, estimation, diagnostic checking, and prediction). The data were generated using the parameter estimates resulting f…
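The excerpt does not give the generating equations or the estimated parameter values, but as an illustration of how such heteroskedastic series can be simulated, the sketch below generates an APGARCH(1,1) process with Gaussian or Student-t innovations; all parameter values (omega, alpha, gamma, beta, delta, nu) are hypothetical placeholders, not the estimates used in the study.

```python
import numpy as np

def simulate_apgarch(n, omega=0.05, alpha=0.1, gamma=0.3, beta=0.85, delta=1.5,
                     dist="gaussian", nu=5, burn=500, seed=0):
    """Simulate an APGARCH(1,1) series:
    sigma_t^delta = omega + alpha*(|e_{t-1}| - gamma*e_{t-1})^delta + beta*sigma_{t-1}^delta.
    Parameter values here are illustrative placeholders only."""
    rng = np.random.default_rng(seed)
    total = n + burn
    # Gaussian or Student-t innovations (t scaled to unit variance)
    if dist == "gaussian":
        z = rng.standard_normal(total)
    else:
        z = rng.standard_t(nu, total) / np.sqrt(nu / (nu - 2))
    sigma_d = np.empty(total)          # holds sigma_t^delta
    eps = np.empty(total)
    sigma_d[0] = omega / (1 - beta)    # crude starting value
    eps[0] = sigma_d[0] ** (1 / delta) * z[0]
    for t in range(1, total):
        sigma_d[t] = (omega
                      + alpha * (abs(eps[t - 1]) - gamma * eps[t - 1]) ** delta
                      + beta * sigma_d[t - 1])
        eps[t] = sigma_d[t] ** (1 / delta) * z[t]
    return eps[burn:]                  # drop burn-in, keep n observations

# Example: one replication for each sample size considered in the study
for n in (500, 1000, 1500, 2000):
    series = simulate_apgarch(n)
    print(n, round(series.var(), 4))
```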
The Artificial Neural Network methodology is an important and relatively new approach for building models for analysis, data evaluation, forecasting, and control without depending on a prior model or a classical statistical method to describe the behavior of the statistical phenomenon. The methodology works by simulating the data to reach a robust optimal model that represents the phenomenon and can be reused at any time and under any conditions. We used the Box-Jenkins (ARMAX) approach for comparison. This paper relies on the received power to build a robust model for forecasting, analysis, and control of the sod power; the received power comes from…
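As a hedged illustration of the classical side of this comparison, the sketch below fits a Box-Jenkins ARMAX model with statsmodels on synthetic data; the variable names (received_power, input_signal) and the (2, 0, 1) order are assumptions for illustration, not the specification used in the paper.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic stand-ins for the paper's data: a received-power series driven by
# one exogenous input plus correlated noise (names and order are illustrative).
rng = np.random.default_rng(1)
n = 300
exog = pd.Series(rng.normal(size=n), name="input_signal")
noise = pd.Series(rng.normal(scale=0.5, size=n)).rolling(2, min_periods=1).mean()
received_power = 2.0 + 0.8 * exog + noise

# ARMAX = ARIMA(p, 0, q) with exogenous regressors in statsmodels
model = ARIMA(received_power, exog=exog, order=(2, 0, 1))
result = model.fit()
print(result.summary())

# Short-horizon forecast, given assumed future values of the exogenous input
future_exog = pd.Series([0.1, -0.2, 0.05], name="input_signal")
print(result.forecast(steps=3, exog=future_exog))
```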
Dust is a common cause of health risks and a contributor to climate change, one of the most threatening problems facing humans. In the recent decade, climate change in Iraq, typified by increased droughts and desertification, has generated numerous environmental issues. This study forecasts dust in five central Iraqi districts using a supervised machine learning framework built on five regression algorithms. It was assessed using a dataset from the Iraqi Meteorological Organization and Seismology (IMOS). Simulation results show that the gradient boosting regressor (GBR) achieves a mean square error of 8.345 and a total accuracy ratio of 91.65%. Moreover, the results show that the decision tree (DT), where the mean square error is 8.965, c…
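The exact predictors are not listed in the excerpt, but a minimal scikit-learn sketch of the kind of pipeline described (gradient boosting regression evaluated by mean squared error) might look like the following; the CSV file and column names are hypothetical stand-ins for the IMOS data.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Hypothetical stand-in for the IMOS dataset: meteorological predictors
# and a dust-concentration target (file and column names are assumptions).
df = pd.read_csv("imos_dust.csv")
X = df[["temperature", "humidity", "wind_speed", "pressure"]]
y = df["dust_concentration"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

gbr = GradientBoostingRegressor(random_state=42)
gbr.fit(X_train, y_train)

pred = gbr.predict(X_test)
print("MSE:", mean_squared_error(y_test, pred))
```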
Diabetes is an increasingly common chronic disease, affecting millions of people around the world. Its diagnosis, prediction, proper treatment, and management are essential. Machine learning-based prediction techniques for diabetes data analysis can help in early detection and prediction of the disease and its consequences such as hypo/hyperglycemia. In this paper, we explored a diabetes dataset collected from the medical records of one thousand Iraqi patients. We applied three classifiers: the multilayer perceptron, KNN, and Random Forest. We conducted two experiments: the first used all 12 features of the dataset, where Random Forest outperformed the others with 98.8% accuracy. The second experiment used only five att…
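A hedged sketch of the two-experiment comparison described above, using scikit-learn; the file name, target column, and five-feature subset are assumptions, since the excerpt does not list the actual attributes.

```python
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier

# Hypothetical stand-in for the 1000-patient Iraqi diabetes dataset
# (file and column names are placeholders, not the paper's schema).
df = pd.read_csv("iraqi_diabetes.csv")
y = df["class"]
X_all = df.drop(columns=["class"])                    # experiment 1: all 12 features
X_sub = df[["hba1c", "bmi", "age", "chol", "urea"]]   # experiment 2: assumed 5-feature subset

classifiers = {
    "MLP": MLPClassifier(max_iter=1000, random_state=0),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "RandomForest": RandomForestClassifier(random_state=0),
}

for name, clf in classifiers.items():
    for label, X in (("all features", X_all), ("5 features", X_sub)):
        acc = cross_val_score(clf, X, y, cv=5).mean()
        print(f"{name} ({label}): {acc:.3f}")
```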
Eye detection is used in many applications such as pattern recognition, biometrics, and surveillance systems. In this paper, a new method is presented to detect and extract the overall shape of one eye from an image based on two principles: Helmholtz and Gestalt. According to the Helmholtz principle of perception, an observed geometric shape is perceptually "meaningful" if its expected number of occurrences in an image with a random distribution is very small. Complementing this, the Gestalt principle states that humans perceive things either by grouping similar elements or by recognizing patterns. In general, according to the Gestalt principle, humans see things through genera…
A three-stage learning algorithm for the deep multilayer perceptron (DMLP) with effective weight initialisation based on a sparse auto-encoder is proposed in this paper, aiming to overcome the difficulties of training deep neural networks with limited training data in a high-dimensional feature space. At the first stage, unsupervised learning with a sparse auto-encoder is used to obtain the initial weights of the feature extraction layers of the DMLP. At the second stage, error back-propagation is used to train the DMLP while the weights obtained at the first stage are kept fixed for its feature extraction layers. At the third stage, all the weights of the DMLP obtained at the second stage are refined by error back-propagation. Network structures an…
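The paper's network sizes and sparsity mechanism are not given in the excerpt; the sketch below only illustrates the three-stage scheme in Keras under assumed layer widths and random placeholder data, using an L1 activity penalty as a simple stand-in for a true sparse auto-encoder.

```python
import numpy as np
from tensorflow.keras import layers, regularizers, Model

input_dim, hidden_dim, n_classes = 64, 32, 3       # assumed sizes
x_unlabeled = np.random.rand(200, input_dim)        # placeholder data
x_train = np.random.rand(100, input_dim)
y_train = np.random.randint(0, n_classes, 100)

# Stage 1: unsupervised sparse auto-encoder (L1 activity penalty as a
# simple sparsity constraint) to initialise the feature-extraction layer.
inp = layers.Input(shape=(input_dim,))
encoded = layers.Dense(hidden_dim, activation="relu",
                       activity_regularizer=regularizers.l1(1e-4))(inp)
decoded = layers.Dense(input_dim, activation="sigmoid")(encoded)
autoencoder = Model(inp, decoded)
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(x_unlabeled, x_unlabeled, epochs=2, verbose=0)

# Build the DMLP reusing the trained encoder as its feature extractor.
encoder = Model(inp, encoded)
clf_out = layers.Dense(n_classes, activation="softmax")(encoder.output)
dmlp = Model(inp, clf_out)

# Stage 2: back-propagation with the pretrained feature layer frozen.
encoder.trainable = False
dmlp.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
dmlp.fit(x_train, y_train, epochs=2, verbose=0)

# Stage 3: unfreeze everything and refine all weights end to end.
encoder.trainable = True
dmlp.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
dmlp.fit(x_train, y_train, epochs=2, verbose=0)
```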
Images are an important form of digital information used in many Internet of Things (IoT) applications such as transport, healthcare, agriculture, military, vehicles, and wildlife. Images also have distinctive characteristics such as large size, strong correlation, and high redundancy; therefore, encrypting an image with single-key Advanced Encryption Standard (AES) over IoT communication technologies leaves it vulnerable to many threats, since pixels with the same values are encrypted to identical ciphertext values when the same key is used. The contribution of this work is to increase the security of transferred images. This paper proposes a multiple-key AES algorithm (MECCAES) to improve the security of the tran…
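The excerpt does not spell out how MECCAES derives or schedules its keys, so the sketch below is not that scheme; it only illustrates the general multiple-key idea (encrypting different image blocks with different AES keys so that identical plaintext blocks no longer map to identical ciphertext) using PyCryptodome.

```python
import os
import numpy as np
from Crypto.Cipher import AES

BLOCK = 16  # AES block size in bytes

def multi_key_encrypt(image_bytes, keys):
    """Encrypt successive 16-byte chunks with rotating AES keys.
    A sketch of the multiple-key idea only, not the MECCAES scheme."""
    ciphers = [AES.new(k, AES.MODE_ECB) for k in keys]
    pad = (-len(image_bytes)) % BLOCK          # zero-pad to a block multiple
    data = image_bytes + bytes(pad)
    out = bytearray()
    for i in range(0, len(data), BLOCK):
        cipher = ciphers[(i // BLOCK) % len(ciphers)]   # rotate keys per block
        out += cipher.encrypt(data[i:i + BLOCK])
    return bytes(out)

# Example with a random grayscale "image" and four random 128-bit keys.
image = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
keys = [os.urandom(16) for _ in range(4)]
ciphertext = multi_key_encrypt(image.tobytes(), keys)
print(len(ciphertext), "bytes encrypted")
```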
In aspect-based sentiment analysis (ABSA), implicit aspect extraction is a fine-grained task that aims to extract aspects hidden in the in-context meaning of online reviews. Previous methods have shown that handcrafted rules incorporated into a neural network architecture are promising for this task. In this work, we reduce the need for crafted rules, which must be laboriously articulated for each new training domain or text corpus, by proposing a new architecture based on multi-label neural learning. The key idea is to capture the semantic regularities of explicit and implicit aspects using word embedding vectors and to feed them as a front layer into a Bidirectional Long Short-Term Memory (Bi-LSTM). First, we…
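The excerpt does not give the embedding type, layer sizes, or label set, so the sketch below is only a generic Keras illustration of the architecture family described: an embedding front layer feeding a Bi-LSTM with a sigmoid multi-label head; all sizes and the random data are placeholders.

```python
import numpy as np
from tensorflow.keras import layers, Model

vocab_size, embed_dim, max_len, n_aspects = 5000, 100, 50, 8   # assumed sizes

# Placeholder review data: integer-encoded tokens and multi-hot aspect labels.
x = np.random.randint(1, vocab_size, (256, max_len))
y = np.random.randint(0, 2, (256, n_aspects)).astype("float32")

inp = layers.Input(shape=(max_len,))
emb = layers.Embedding(vocab_size, embed_dim)(inp)        # word-embedding front layer
h = layers.Bidirectional(layers.LSTM(64))(emb)            # Bi-LSTM encoder
out = layers.Dense(n_aspects, activation="sigmoid")(h)    # multi-label head

model = Model(inp, out)
model.compile(optimizer="adam", loss="binary_crossentropy")
model.fit(x, y, epochs=2, batch_size=32, verbose=0)

# Predicted aspect probabilities for one review; threshold at 0.5 per label.
print((model.predict(x[:1]) > 0.5).astype(int))
```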
Decision-making in Operations Research is a central theme of many studies in real-life applications. A drawback of some of these studies is that they are restricted and have not addressed the nature of the values in terms of imprecise data (ID). This paper therefore makes two contributions: first, decreasing the total costs by classifying subsets of costs; second, improving the optimal solution via the Hungarian assignment approach. The newly proposed method is called the fuzzy sub-Triangular form (FS-TF) under ID. The results obtained are excellent compared with previous methods, including the robust ranking technique, arithmetic operations, the magnitude ranking method, and the centroid ranking method. This…
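The FS-TF ranking itself is the paper's contribution and is not reproduced here; the sketch below only illustrates the general pattern it builds on: defuzzifying triangular fuzzy costs (here by a simple centroid, the mean of the three values) and then solving the assignment with the Hungarian method via SciPy. The 3x3 cost matrix is hypothetical.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical 3x3 assignment problem with triangular fuzzy costs (a, b, c).
fuzzy_costs = np.array([
    [(2, 4, 6), (5, 7, 9), (1, 3, 5)],
    [(3, 5, 7), (2, 3, 4), (6, 8, 10)],
    [(4, 6, 8), (1, 2, 3), (3, 5, 7)],
], dtype=float)

# Defuzzify each triangular number by its centroid (a + b + c) / 3.
# (A simple stand-in for the paper's FS-TF ranking, used only for illustration.)
crisp = fuzzy_costs.mean(axis=2)

# Hungarian method on the crisp cost matrix.
rows, cols = linear_sum_assignment(crisp)
print("assignment:", list(zip(rows, cols)))
print("total crisp cost:", crisp[rows, cols].sum())
```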
Seepage through earth dams is one of the most common causes of earth dam collapse, owing to internal particle movement and seepage transfer. In earthen dams, the core plays a vital role in decreasing seepage through the dam body and lowering the phreatic line. In this research, an alternative to the clay soil used in the dam core is proposed by conducting multiple experiments testing the permeability of silty and sandy soils with different additive materials. The selected sandy soil model was then used to represent the dam experimentally, employing a permeability device to measure the amount of water that seeps through the dam body and to trace the seepage line. A numerical model was adopted using Geo-Studio software i…