Software-defined networking (SDN) presents novel security and privacy risks, including distributed denial-of-service (DDoS) attacks. In response to these threats, machine learning (ML) and deep learning (DL) have emerged as effective approaches for quickly identifying and mitigating anomalies. To this end, this research employs various classification methods, including support vector machines (SVMs), K-nearest neighbors (KNNs), decision trees (DTs), multilayer perceptrons (MLPs), and convolutional neural networks (CNNs), and compares their performance. The CNN exhibits the highest training accuracy at 97.808%, yet the lowest prediction accuracy at 90.08%. In contrast, the SVM demonstrates the highest prediction accuracy, at 95.5%. As such, an SVM-based DDoS detection model shows superior performance. This comparative analysis offers valuable insight into the development of efficient and accurate techniques for detecting DDoS attacks in SDN environments with less complexity and time.
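The train/predict accuracy comparison described above can be illustrated with a minimal sketch. This toy 1-nearest-neighbour classifier on synthetic "flow feature" vectors shows the evaluation methodology only; the feature values and labels are hypothetical, and the paper's SDN dataset and its SVM/CNN models are not reproduced here.

```python
# Toy sketch of the accuracy-comparison methodology (not the paper's models).

def knn_predict(train_X, train_y, x):
    """Classify x by the label of its nearest training point (k = 1)."""
    dists = [sum((a - b) ** 2 for a, b in zip(row, x)) for row in train_X]
    return train_y[dists.index(min(dists))]

def accuracy(train_X, train_y, test_X, test_y):
    """Fraction of test points whose predicted label matches the true label."""
    hits = sum(knn_predict(train_X, train_y, x) == y
               for x, y in zip(test_X, test_y))
    return hits / len(test_y)

# Hypothetical features: (packet rate, mean packet size); 1 = DDoS, 0 = benign.
train_X = [(900, 60), (950, 64), (100, 500), (120, 480)]
train_y = [1, 1, 0, 0]
test_X  = [(880, 62), (110, 490)]
test_y  = [1, 0]

print(accuracy(train_X, train_y, test_X, test_y))  # 1.0 on this toy set
```

In the paper, the same train-set versus held-out-set accuracy split is what separates the CNN (high training accuracy, lower prediction accuracy) from the SVM.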
Artificial neural networks (ANNs) are an important and relatively new methodology for building models for analysis, data evaluation, forecasting, and control without depending on an older model or a classical statistical method that describes the behavior of the statistical phenomenon. The methodology works by fitting the data to reach a robust, near-optimal model that represents the statistical phenomenon, and the model can then be used at any time and in any state. The Box-Jenkins (ARMAX) approach was used for comparison. This paper relies on the received power to build a robust model for forecasting, analysis, and control of that power; the received power comes from
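For reference, the Box-Jenkins ARMAX model against which the ANN is compared has the general form below; the specific orders $(p, q, b)$ used in the paper are not stated, so this is the generic formulation with an exogenous input $x_t$ (the received power as regressor is an assumption):

```latex
% ARMAX(p, q, b): autoregressive, moving-average, and exogenous-input terms
y_t = \sum_{i=1}^{p} \varphi_i\, y_{t-i}
    + \sum_{j=1}^{q} \theta_j\, \varepsilon_{t-j}
    + \sum_{k=1}^{b} \beta_k\, x_{t-k}
    + \varepsilon_t
```

Here $y_t$ is the modeled series, $x_t$ the exogenous input, and $\varepsilon_t$ white noise; the ANN replaces this fixed linear structure with a learned nonlinear mapping.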
Image compression is one of the data compression types applied to digital images in order to reduce their high cost of storage and/or transmission. Image compression algorithms may take advantage of visual sensitivity and the statistical properties of image data to deliver superior results in comparison with generic data compression schemes used for other digital data. In the first approach, the input image is divided into blocks, each of which is 16 x 16, 32 x 32, or 64 x 64 pixels. The blocks are converted first into a string, then encoded using a lossless entropy-coding algorithm known as arithmetic coding. Pixel values that occur more frequently are coded in fewer bits than pixel values that occur less frequently
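The first stage described above can be sketched as follows: split a grayscale image into fixed-size blocks, flatten each block row-major into a symbol sequence, and count symbol frequencies, which is the probability model an arithmetic coder would consume. The 4 x 4 image and 2 x 2 block size here are illustrative only; the paper uses 16 x 16, 32 x 32, or 64 x 64 blocks, and the arithmetic coder itself is omitted.

```python
# Block splitting and frequency counting (the arithmetic coder's input model).
from collections import Counter

def blocks(image, n):
    """Yield n x n blocks of a 2-D list of pixels, flattened row-major."""
    rows, cols = len(image), len(image[0])
    for r in range(0, rows, n):
        for c in range(0, cols, n):
            yield [image[r + i][c + j] for i in range(n) for j in range(n)]

image = [[10, 10, 20, 20],
         [10, 10, 20, 20],
         [30, 30, 10, 10],
         [30, 30, 10, 10]]

stream = [px for blk in blocks(image, 2) for px in blk]
freq = Counter(stream)  # frequent values get shorter codes downstream
print(freq.most_common(1))  # value 10 occurs most often here
```

An arithmetic coder assigns each symbol a sub-interval proportional to its frequency, which is how the more frequent pixel values end up costing fewer bits.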
This study addresses the urgent need for novel methods in Training of Trainers (ToT) programs, which can be made more effective and efficient by incorporating AI tools. By exploring scenarios in which AI could dramatically advance trainer preparation, knowledge-sharing, and skill-building across sectors, the research aims to assess this possibility. The study uses a mixed-methods approach: it surveys 500 trainers and conducts in-depth interviews with a further 50 ToT program directors across diverse industries to evaluate the impact of AI-enhanced ToT programs. The results show that the use of AI has a substantial positive effect on trainer performance and program outcomes. AI-enhanced ToT programs, for
Business process re-engineering is one of the main administrative and technological approaches for keeping pace with scientific progress and the continuing changes in the business environment. Organizations pursue it to cope with rapid developments and renewed market competition by radically changing their operations and activities, which contributes effectively to reducing the cost of a product or service while taking quality improvement into account in managing change, sustaining increased value, and shortening time-to-market to meet customer needs and desires, in order to achieve a
This paper discusses the process of compounding two distributions using a new compounding procedure that connects a number of continuous lifetime distributions, where the number of these distributions is a random variable following one of the discrete distributions. Based on this procedure, the zero-truncated Poisson distribution has been compounded with the Weibull distribution to produce a new three-parameter lifetime distribution. An advantage is that the failure-rate function covers many cases (increasing, decreasing, unimodal, bathtub). The paper studies the resulting distribution's properties, such as the expectation, variance, cumulative distribution function, reliability function, and failure rate function
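The compounding idea above can be simulated with a short sketch: draw the number of components N from a zero-truncated Poisson, then take the lifetime as the minimum of N Weibull variates. Taking the minimum (a series-system construction) is an assumption for illustration; the paper's exact compounding construction and parameterization are not specified here.

```python
# Illustrative simulation of a zero-truncated-Poisson / Weibull compound lifetime.
import math
import random

def zt_poisson(lam, rng):
    """Zero-truncated Poisson draw via rejection of zeros."""
    while True:
        # sequential-search inversion for a plain Poisson draw
        u, k, p, s = rng.random(), 0, math.exp(-lam), math.exp(-lam)
        while u > s:
            k += 1
            p *= lam / k
            s += p
        if k > 0:
            return k

def compound_lifetime(lam, shape, scale, rng):
    """Lifetime of a series system with a random (zero-truncated) component count."""
    n = zt_poisson(lam, rng)
    return min(rng.weibullvariate(scale, shape) for _ in range(n))

rng = random.Random(0)
sample = [compound_lifetime(lam=2.0, shape=1.5, scale=1.0, rng=rng)
          for _ in range(5000)]
print(sum(sample) / len(sample))  # empirical mean lifetime
```

Because the minimum of more components fails sooner, the mixing over N is what produces the flexible failure-rate shapes (increasing, decreasing, unimodal, bathtub) noted in the abstract.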
In this research, kernel estimators (nonparametric density estimators) were used to estimate the two-response logistic regression, comparing the Nadaraya-Watson method with the local scoring algorithm. The optimal smoothing parameter λ was estimated by cross-validation and generalized cross-validation; the optimal bandwidth λ has a clear effect on the estimation process and plays a key role in smoothing the curve so that it approaches the true curve. The goal of using the kernel estimator is to modify the observations so that we can obtain estimators with characteristics close to the properties of the true parameters, based on medical data for patients with chronic
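A minimal sketch of the Nadaraya-Watson estimator mentioned above, with a Gaussian kernel and a fixed bandwidth. The bandwidth tuning by (generalized) cross-validation from the paper is not reproduced, and the data below are synthetic, not the medical dataset.

```python
# Nadaraya-Watson: a kernel-weighted local average of the responses.
import math

def nadaraya_watson(x0, xs, ys, bandwidth):
    """m(x0) = sum_i K((x0 - x_i)/h) y_i / sum_i K((x0 - x_i)/h), Gaussian K."""
    weights = [math.exp(-0.5 * ((x0 - xi) / bandwidth) ** 2) for xi in xs]
    return sum(w * y for w, y in zip(weights, ys)) / sum(weights)

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.0, 1.0, 0.0, 1.0, 0.0]
print(nadaraya_watson(2.0, xs, ys, bandwidth=0.5))
```

A smaller bandwidth hugs the observations (low bias, high variance); a larger one smooths them out, which is why the cross-validated choice of λ matters so much in the paper.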
This article explores the importance of estimating a semiparametric regression function. We suggest a new estimator alongside the other combined estimators and then compare them using simulation. Through the simulation results, we find that the suggested estimator is best for the first and second models, whereas for the third model the Burman and Chaudhuri (B&C) estimator is best.
The exponential distribution is one of the most common distributions in studies and scientific research, with wide application in the fields of reliability, engineering, and survival analysis; researchers have therefore carried out extensive studies of its characteristics.
In this research, the survival function of the truncated exponential distribution is estimated by the maximum likelihood method, the Bayes first and second methods, the least squares method, and the jackknife method (which depends in the first place on the maximum likelihood method, then on the Bayes first method), and the estimators are then compared using simulation. To accomplish this task, different sample sizes have been adopted by the researcher
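One of the estimators above can be sketched directly: the maximum likelihood estimate of the rate for an exponential distribution left-truncated at a known point a, and the implied survival function S(t) = exp(-λ(t - a)) for t ≥ a. The truncation point and data here are illustrative; the Bayes, least-squares, and jackknife estimators from the paper are not reproduced.

```python
# MLE for a left-truncated exponential and its survival function (sketch).
import math

def mle_rate(data, a):
    """lambda_hat = n / sum(x_i - a) for observations x_i >= a."""
    return len(data) / sum(x - a for x in data)

def survival(t, lam, a):
    """S(t) = exp(-lambda * (t - a)) for t >= a, else 1."""
    return math.exp(-lam * (t - a)) if t >= a else 1.0

a = 1.0                       # known truncation point (illustrative)
data = [1.5, 2.0, 2.5, 3.0]   # observed lifetimes, all >= a
lam = mle_rate(data, a)       # 4 / (0.5 + 1.0 + 1.5 + 2.0) = 0.8
print(lam, survival(2.0, lam, a))
```

The simulation comparison in the paper then repeats this kind of estimation over many generated samples of different sizes and scores each method against the true survival curve.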
The development of Qur'anic studies has occupied minds with multiple understandings and the most precious of ideas; alternatives have been proposed and visions have varied, especially as we live in an era of exploding knowledge, complex outcomes, and competing ideas. This calls for devising intellectual approaches to achieve civilizational witness in the words of the Almighty, "that you may be witnesses over the people" (Surat Al-Baqarah/143), through presence and participation in achievement and positive interaction with humanity and civilization, in light of the meaning of "that you may know one another" (Surat Al-Hujurat/13).
Therefore, we must move beyond the one-dimensional view of the Qur'an to the complex mindset t