Software-defined networks (SDN) have a centralized control architecture that makes them a tempting target for cyber attackers. One of the major threats is the distributed denial of service (DDoS) attack, which aims to exhaust network resources and make services unavailable to legitimate users. DDoS attack detection based on machine learning algorithms is one of the most widely used techniques in SDN security. In this paper, four machine learning techniques (Random Forest, K-nearest neighbors, Naive Bayes, and Logistic Regression) have been tested to detect DDoS attacks, and a mitigation technique has been applied to eliminate the attack's effect on the SDN. RF and KNN were selected because of their high accuracy results. Three types of network topology have been generated to observe the effectiveness of the proposed algorithms on different network architectures. The results reveal that RF performs better than KNN in the single topology, while both show comparable performance in the other topologies.
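To make the comparison concrete, the sketch below shows one way the four classifiers could be trained and scored on labelled SDN flow records with scikit-learn; the CSV file name, feature layout, and hyperparameters are assumptions for illustration, not the paper's actual setup.

```python
# Minimal sketch (not the paper's code): comparing RF, KNN, NB, and LR on
# labelled SDN flow records. File name and label column are assumptions.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression

# Hypothetical flow-statistics dataset: one row per flow, label 1 = DDoS, 0 = benign
df = pd.read_csv("sdn_flows.csv")
X, y = df.drop(columns=["label"]), df["label"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=42)

models = {
    "RF": RandomForestClassifier(n_estimators=100, random_state=42),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "NB": GaussianNB(),
    "LR": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, accuracy_score(y_te, model.predict(X_te)))
```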
This paper proposes an improved solution for EEG-based brain language signal classification using machine learning and optimization algorithms. The aim is to improve brain signal classification for language processing tasks in terms of both accuracy and processing speed. Feature extraction is performed using a modified Discrete Wavelet Transform (DWT), which improves the ability to capture signal characteristics by decomposing EEG signals into significant frequency components. A Gray Wolf Optimization (GWO) algorithm is then applied to select the optimal features, yielding more accurate results by retaining the most impactful and relevant features.
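As an illustration of DWT-based feature extraction (not the authors' modified DWT), the sketch below decomposes a single EEG channel into sub-bands with PyWavelets and derives simple per-band statistics; the wavelet family, decomposition level, and chosen statistics are assumptions.

```python
# Minimal sketch, not the authors' pipeline: DWT decomposition of one EEG
# channel into sub-bands and simple per-band statistics used as features.
import numpy as np
import pywt

def dwt_features(signal, wavelet="db4", level=4):
    """Decompose one EEG channel and return energy and std per sub-band."""
    coeffs = pywt.wavedec(signal, wavelet=wavelet, level=level)
    feats = []
    for band in coeffs:  # [approximation, detail_level, ..., detail_1]
        feats.extend([np.sum(band ** 2), np.std(band)])
    return np.array(feats)

# Example: a synthetic EEG-like segment of 512 samples
x = np.random.randn(512)
print(dwt_features(x).shape)  # (2 * (level + 1),) = (10,)
```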
With the development of cloud computing in recent years, data center networks have become a major topic in both industry and academia. Nevertheless, traditional methods based on manual configuration and hardware devices are burdensome, expensive, and cannot fully exploit the capabilities of the physical network infrastructure. Thus, Software-Defined Networking (SDN) has been promoted as one of the most promising solutions for future Internet performance. SDN is notable for two features: the separation of the control plane from the data plane, and the provision of programmable capabilities for network development instead of hardware solutions. The current paper introduces an SDN-based optimized Resch
The Internet of Things (IoT) has significantly transformed modern systems through extensive connectivity, but it has also introduced considerable cybersecurity risks. Traditional rule-based methods are becoming increasingly insufficient in the face of evolving cyber threats. This study proposes an enhanced methodology utilizing a hybrid machine-learning framework for IoT cyber-attack detection. The framework integrates a Grey Wolf Optimizer (GWO) for optimal feature selection, a customized synthetic minority oversampling technique (SMOTE) for data balancing, and a systematic approach to hyperparameter tuning of ensemble algorithms: Random Forest (RF), XGBoost, and CatBoost. Evaluations on the RT-IoT2022 dataset demonstrat
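A minimal sketch of the balancing and tuning stages is shown below, assuming a preprocessed copy of RT-IoT2022 in a CSV file; the GWO feature-selection step is represented only by a placeholder mask, and the parameter grid is illustrative rather than the study's actual search space.

```python
# Sketch under stated assumptions (not the paper's exact pipeline):
# SMOTE balancing followed by a tuned ensemble classifier.
import pandas as pd
from imblearn.over_sampling import SMOTE
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

df = pd.read_csv("rt_iot2022.csv")                   # hypothetical preprocessed copy of RT-IoT2022
selected = [c for c in df.columns if c != "label"]   # stand-in for the GWO-selected feature subset
X, y = df[selected], df["label"]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_tr, y_tr)  # balance minority attack classes

grid = GridSearchCV(
    RandomForestClassifier(random_state=0),
    {"n_estimators": [200, 400], "max_depth": [None, 20]},    # illustrative grid
    cv=3, scoring="f1_macro",
)
grid.fit(X_bal, y_bal)
print(grid.best_params_, grid.score(X_te, y_te))
```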
Machine learning (ML) is a key component within the broader field of artificial intelligence (AI) that employs statistical methods to empower computers with the ability to learn and make decisions autonomously, without the need for explicit programming. It is founded on the concept that computers can acquire knowledge from data, identify patterns, and draw conclusions with minimal human intervention. The main categories of ML include supervised learning, unsupervised learning, semi-supervised learning, and reinforcement learning. Supervised learning involves training models using labelled datasets and comprises two primary forms: classification and regression. Regression is used for continuous output, while classification is employed for discrete, categorical output.
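A brief illustration (not from the article) of the classification/regression distinction on synthetic data:

```python
# Same supervised-learning idea expressed two ways: a discrete target
# (classification) versus a continuous target (regression).
from sklearn.datasets import make_classification, make_regression
from sklearn.linear_model import LogisticRegression, LinearRegression

Xc, yc = make_classification(n_samples=200, n_features=5, random_state=0)
print("classification accuracy:", LogisticRegression(max_iter=500).fit(Xc, yc).score(Xc, yc))

Xr, yr = make_regression(n_samples=200, n_features=5, noise=5.0, random_state=0)
print("regression R^2:", LinearRegression().fit(Xr, yr).score(Xr, yr))
```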
Recently, the increasing demand to transfer data through the Internet has pushed the Internet infrastructure to the limits of these networks' capacity. This high demand hinders the rapid response to emergencies and disasters that is needed to control or reduce their devastating effects. As one of the main cornerstones for addressing the data traffic forwarding issue, Internet networks need to impose the highest priority on the special networks' Security, Health, and Emergency (SHE) data traffic. These networks work in closed and private domains to serve a group of users for specific tasks. Our novel proposed network flow priority management based on ML and SDN provides the control needed to give the required flow priority to SHE data
Today, artificial intelligence has become one of the most important sciences for creating intelligent computer programs that simulate the human mind. The goal of artificial intelligence in the medical field is to assist doctors and health care workers in diagnosing diseases and delivering clinical treatment, reducing the rate of medical error, and saving lives. The main and widely used technologies are expert systems, machine learning, and big data. This article provides a brief overview of these three techniques to make it easier for readers to understand them and their importance.
With the proliferation of both Internet access and data traffic, recent breaches have brought into sharp focus the need for Network Intrusion Detection Systems (NIDS) to protect networks from increasingly complex cyberattacks. To differentiate between normal network processes and possible attacks, Intrusion Detection Systems (IDS) often employ pattern recognition and data mining techniques. Network and host system intrusions, attacks, and policy violations can be automatically detected and classified by an IDS. Using Python Scikit-Learn, the results of this study show that Machine Learning (ML) techniques such as Decision Tree (DT), Naïve Bayes (NB), and K-Nearest Neighbor (KNN) can enhance the effectiveness of an Intrusion Detection System.
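The kind of Scikit-Learn comparison described could look like the sketch below; the dataset file and label column are assumptions, and cross-validated accuracy stands in for whatever metrics the study actually reports.

```python
# Minimal sketch: comparing DT, NB, and KNN on a labelled NIDS dataset
# with 5-fold cross-validation. File name and label column are assumptions.
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

df = pd.read_csv("nids_flows.csv")                 # hypothetical labelled traffic dataset
X, y = df.drop(columns=["label"]), df["label"]

for name, clf in [("DT", DecisionTreeClassifier(random_state=0)),
                  ("NB", GaussianNB()),
                  ("KNN", KNeighborsClassifier(n_neighbors=5))]:
    pipe = make_pipeline(StandardScaler(), clf)    # scaling mainly matters for KNN
    scores = cross_val_score(pipe, X, y, cv=5, scoring="accuracy")
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```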
The investigation of machine learning techniques for addressing missing well-log data has garnered considerable interest recently, especially as the oil and gas sector pursues novel approaches to improve data interpretation and reservoir characterization. However, for wells that have been in operation for several years, conventional measurement techniques frequently encounter challenges related to availability, including the lack of well-log data, cost considerations, and precision issues. This study's objective is to enhance reservoir characterization by automating well-log creation using machine-learning techniques. Among the methods are multi-resolution graph-based clustering and the similarity threshold method. By using cutti