Distributed Denial of Service (DDoS) attacks on Web-based services have grown in both number and sophistication with the rise of advanced wireless technology and modern computing paradigms. Detecting these attacks within the sea of communication packets is therefore critical. Early DDoS attacks were directed mainly at the network and transport layers, but in recent years attackers have shifted their strategies toward the application layer. Application-layer attacks can be more harmful and stealthier because the attack traffic and the normal traffic flows are difficult to tell apart. Distributed attacks are hard to counter because they can exhaust real computing resources as well as network bandwidth, and they can also be launched from Internet-connected smart devices that have been infected and recruited into botnets. Deep Learning (DL) techniques, including the Convolutional Neural Network (CNN) and variants of the Recurrent Neural Network (RNN) such as Long Short-Term Memory (LSTM), Bidirectional LSTM, Stacked LSTM, and the Gated Recurrent Unit (GRU), have been used to detect DDoS attacks. The DL approaches are evaluated on the Portmap.csv file from the most recent DDoS dataset, CICDDoS2019. The data is cleaned before being supplied to the DL approaches, and the pre-processed dataset is used to train and test them. In this paper, we show how the DL approach works with multiple models and how they compare to each other.
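The cleaning-and-splitting step described above can be sketched as follows. This is a minimal illustration, not the paper's pipeline: the feature values, the handling of non-finite entries, and the 80/20 split are all assumptions; the real CICDDoS2019 Portmap.csv has many more columns.

```python
import numpy as np

def preprocess(flows, train_frac=0.8, seed=0):
    """Clean a flow-feature matrix as the abstract describes:
    replace non-finite values, min-max scale each feature to [0, 1],
    then shuffle and split into train and test sets."""
    X = np.asarray(flows, dtype=float)
    X[~np.isfinite(X)] = 0.0                    # drop inf/NaN artifacts
    lo, hi = X.min(axis=0), X.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)      # avoid divide-by-zero
    X = (X - lo) / span
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    cut = int(train_frac * len(X))
    return X[idx[:cut]], X[idx[cut:]]

# toy flow records (illustrative values only)
train, test = preprocess([[1.0, np.inf], [2.0, 5.0], [3.0, 7.0], [4.0, 9.0]])
```

The scaled, split arrays would then be reshaped into sequences and fed to the CNN/LSTM/GRU models.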
Infrastructure, especially wastewater projects, plays an important role in the life of residential communities. Due to increasing population growth, there is also a significant increase in residential and commercial facilities. This research aims to develop two models for predicting the cost and time of wastewater projects according to the independent variables affecting them. These variables were determined through a questionnaire distributed to 20 projects under construction in Al-Kut City/Wasit Governorate/Iraq. The researcher used artificial neural network technology to develop the models. The results showed that the coefficients of correlation R between actual and predicted values were 99.4% and 99%, MAPE was
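The two evaluation metrics the abstract reports, the correlation coefficient R and MAPE, can be computed as below. The cost values are invented for illustration; they are not the study's data.

```python
import math

def correlation_r(actual, predicted):
    """Pearson correlation coefficient R between actual and predicted values."""
    n = len(actual)
    ma, mp = sum(actual) / n, sum(predicted) / n
    cov = sum((a - ma) * (p - mp) for a, p in zip(actual, predicted))
    sa = math.sqrt(sum((a - ma) ** 2 for a in actual))
    sp = math.sqrt(sum((p - mp) ** 2 for p in predicted))
    return cov / (sa * sp)

def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

actual    = [100.0, 150.0, 200.0]   # hypothetical project costs
predicted = [ 98.0, 155.0, 195.0]   # hypothetical model outputs
r = correlation_r(actual, predicted)
err = mape(actual, predicted)
```

A correlation of 99.4% corresponds to R ≈ 0.994 on this scale.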
The subject of the Internet of Things is very important, especially at present, which is why it has attracted the attention of researchers and scientists due to its importance in human life. Through it, a person can do several things easily, accurately, and in an organized manner. The research addressed important topics, the most important of which are the concept of the Internet of Things, the history of its emergence and development, the reasons for the interest in it and its importance, and its most prominent advantages and characteristics. The research sheds light on the structure of the Internet of Things and its most important structural components. The research dealt with the most important search engines in the Internet
The study aims to demonstrate the importance of instructional methods in teaching the Arabic language as a second language, that is, teaching Arabic to non-native speakers. The study is in line with the tremendous development in the field of knowledge, especially in technology and communication, and the emergence of many electronic media in education in general and in language teaching in particular. It employs the image in teaching vocabulary and presents the experience of the Arabic Language Institute for Non-Native Speakers at King Abdul-Aziz University. The study follows the descriptive approach to solve the problem represented by the lack of interest in educational methods when teaching Arabic as a second language. Accordingly
Forecasting is one of the important topics in the analysis of time series, and its importance in the economic field has emerged as a means of achieving economic growth. Accurate forecasting of time series is therefore one of the most important challenges in seeking to make the best decision. The aim of the research is to suggest employing hybrid models to predict daily crude oil prices. The hybrid model integrates a linear component, represented by Box-Jenkins models, with a non-linear component, represented by one of the methods of artificial intelligence: the artificial neural network (ANN) or the support vector regression (SVR) algorithm. It was shown that the proposed hybrid models in the prediction
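The hybrid idea, a linear forecast plus a nonlinear correction fitted to its residuals, can be sketched in miniature. Here an AR(1) least-squares fit stands in for the full Box-Jenkins model and a simple residual average stands in for the ANN/SVR component; both substitutions are assumptions for illustration, and the price series is invented.

```python
import numpy as np

def hybrid_forecast(series, window=3):
    """One-step hybrid forecast: linear AR(1) component plus a
    nonlinear-style correction built from recent residuals."""
    y = np.asarray(series, dtype=float)
    # linear part: fit y[t] ~ a * y[t-1] + b by least squares
    a, b = np.polyfit(y[:-1], y[1:], 1)
    linear_next = a * y[-1] + b
    # residual part: what the linear model failed to capture
    resid = y[1:] - (a * y[:-1] + b)
    correction = resid[-window:].mean()
    return linear_next + correction

# hypothetical daily crude-oil prices
f = hybrid_forecast([50.0, 52.0, 51.0, 53.0, 54.0, 55.0])
```

In the paper's setting the correction term is produced by a trained ANN or SVR rather than a residual mean.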
This paper presents the Taguchi approach for optimization of the hardness of a shape memory alloy (Cu-Al-Ni). The influence of powder metallurgy parameters on hardness has been investigated, using the Taguchi technique and ANOVA for analysis. Nine experimental runs based on Taguchi's L9 orthogonal array (OA) were performed for two parameters (pressure and sintering temperature) at three levels each: (300, 500, and 700) MPa and (700, 800, and 900) °C, respectively. The main effects and the signal-to-noise (S/N) ratio were studied, and analysis of variance (ANOVA) was used to investigate the micro-hardness characteristics of the shape memory alloy. After application, the results of the study showed the hei
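For a response like hardness that should be maximized, the Taguchi S/N ratio uses the "larger is better" form, S/N = -10·log10(mean(1/y²)). A minimal computation for one L9 run is sketched below; the hardness replicates are invented for illustration, not taken from the study.

```python
import math

def sn_larger_is_better(values):
    """Taguchi signal-to-noise ratio for a 'larger is better'
    response: S/N = -10 * log10(mean(1 / y^2))."""
    return -10.0 * math.log10(sum(1.0 / v ** 2 for v in values) / len(values))

# hypothetical micro-hardness replicates for one orthogonal-array run
sn = sn_larger_is_better([250.0, 255.0, 248.0])
```

Averaging the S/N ratio over the runs sharing each factor level is what identifies the pressure and sintering-temperature settings that maximize hardness.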
Shadow detection and removal is an important task when dealing with color outdoor images. Shadows are generated by a local and relative absence of light. Shadows are, first of all, a local decrease in the amount of light that reaches a surface. Secondly, they are a local change in the amount of light reflected by a surface toward the observer. Most shadow detection and segmentation methods are based on image analysis. However, some factors will affect the detection result due to the complexity of the circumstances. In this paper, a segmentation-based method is presented to detect shadows in an image, and a function concept is used to remove the shadow from the image.
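The core observation, that a shadow is a local drop in received light, can be illustrated with a toy intensity-threshold segmentation. The threshold factor and the tiny grayscale image are assumptions for illustration; the paper's actual segmentation test and removal function are richer.

```python
def shadow_mask(gray, factor=0.5):
    """Flag pixels whose intensity falls well below the image mean,
    treating them as shadow candidates (1 = shadow, 0 = lit)."""
    flat = [p for row in gray for p in row]
    thresh = factor * (sum(flat) / len(flat))
    return [[1 if p < thresh else 0 for p in row] for row in gray]

# toy grayscale image: bright scene with a dark region on the right
img = [[200, 200, 40],
       [200,  50, 45],
       [190, 200, 35]]
mask = shadow_mask(img)
```

A removal step would then brighten the masked pixels toward the statistics of the lit region.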
The penalized least squares method is a popular method for dealing with high-dimensional data, where the number of explanatory variables is larger than the sample size. The properties of the penalized least squares method are high prediction accuracy and the ability to perform estimation and variable selection at once. The penalized least squares method gives a sparse model, that is, a model with few variables, so that it can be interpreted easily. The penalized least squares method is not robust, meaning it is very sensitive to the presence of outlying observations. To deal with this problem, we can use a robust loss function to obtain the robust penalized least squares method, and get a robust penalized estimator and
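The robust-loss idea can be sketched in one dimension: replacing the squared error with a Huber loss caps the influence of any single outlier, while a penalty term keeps the coefficient shrunk. The ridge-style penalty, the data, and the gradient-descent solver are all illustrative assumptions; the paper's high-dimensional, sparsity-inducing setting is richer.

```python
def huber_grad(r, delta):
    """Gradient of the Huber loss: quadratic near zero, linear in the
    tails, so one gross outlier cannot dominate the fit."""
    return r if abs(r) <= delta else delta * (1 if r > 0 else -1)

def robust_penalized_fit(x, y, lam=0.1, delta=1.0, steps=2000, lr=0.01):
    """1-D robust penalized estimator: Huber loss (robust part) plus
    a penalty lam * beta^2 (penalized part), solved by gradient descent."""
    beta, n = 0.0, len(x)
    for _ in range(steps):
        g = sum(-xi * huber_grad(yi - beta * xi, delta) for xi, yi in zip(x, y)) / n
        g += 2 * lam * beta                      # penalty gradient
        beta -= lr * g
    return beta

# data on the line y = 2x with one gross outlier at the end
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.0, 6.0, 8.0, 100.0]
beta = robust_penalized_fit(x, y)
```

Ordinary penalized least squares on the same data would be dragged far above the true slope of 2 by the outlier; the Huber loss keeps the estimate close to it.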
