The Internet of Things (IoT) has significantly transformed modern systems through extensive connectivity, but it has also introduced considerable cybersecurity risks. Traditional rule-based methods are increasingly insufficient against evolving cyber threats. This study proposes an enhanced methodology utilizing a hybrid machine-learning framework for IoT cyber-attack detection. The framework integrates a Grey Wolf Optimizer (GWO) for optimal feature selection, a customized synthetic minority oversampling technique (SMOTE) for data balancing, and systematic hyperparameter tuning of ensemble algorithms: Random Forest (RF), XGBoost, and CatBoost. Evaluations on the RT-IoT2022 dataset demonstrate that GWO reduces the feature set from 32 to 21, enhancing computational efficiency and interpretability without compromising accuracy, while the customized SMOTE addresses class imbalance and improves minority-class detection. The optimized RF and XGBoost models, assessed using accuracy, precision, recall, and F1-score, achieved 100% accuracy with strong generalization. These results highlight the effectiveness of optimization-based feature selection and data balancing in improving IoT security, and the framework is extensible to deep-learning and other ensemble-based approaches.
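A minimal sketch of the balance-then-train stage described above, assuming scikit-learn and imbalanced-learn; the GWO output is replaced by a hypothetical boolean mask (`selected`), the data are synthetic stand-ins for RT-IoT2022, and the hyperparameters are illustrative rather than the study's tuned values.

```python
# Sketch: hypothetical GWO-style feature mask, SMOTE balancing, RF training.
import numpy as np
from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Placeholder data standing in for RT-IoT2022 (32 features, as in the paper).
X, y = make_classification(n_samples=2000, n_features=32,
                           weights=[0.9, 0.1], random_state=42)

# Hypothetical GWO output: a boolean mask keeping 21 of the 32 features.
rng = np.random.default_rng(42)
selected = np.zeros(32, dtype=bool)
selected[rng.choice(32, size=21, replace=False)] = True
X = X[:, selected]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=42)

# Oversample only the training split so the test set stays untouched.
X_bal, y_bal = SMOTE(random_state=42).fit_resample(X_tr, y_tr)

clf = RandomForestClassifier(n_estimators=300, random_state=42)
clf.fit(X_bal, y_bal)
print(classification_report(y_te, clf.predict(X_te)))
```

Applying SMOTE after the train/test split, as here, is the standard precaution: oversampling before splitting would leak synthetic copies of test-set neighbors into training and inflate the reported scores.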
Distributed Denial of Service (DDoS) attacks on web-based services have grown in both number and sophistication with the rise of advanced wireless technology and modern computing paradigms. Detecting these attacks within the sea of communication packets is therefore essential. Early DDoS attacks were directed mainly at the network and transport layers; over the past few years, attackers have shifted their strategies toward the application layer. Application-layer attacks can be more harmful and stealthier because attack traffic and normal traffic flows are hard to tell apart. Distributed attacks are difficult to counter because they can exhaust real computing resources as well as network bandwidth. DDoS attacks …
In this paper, a new variable-selection method is presented to select essential variables from large datasets. The new model is a modified version of the Elastic Net model, and the modified variable-selection procedure is summarized in an algorithm. It is applied to the Leukemia dataset, which has 3051 variables (genes) and 72 samples. In practice, a dataset of this size is difficult to work with directly. The modified model is compared with several standard variable-selection methods and achieves perfect classification, giving the best performance. All the calculations for this paper were carried out in …
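As a rough illustration of elastic-net-based variable selection on wide data (not the authors' modified model, whose changes the abstract does not detail), one can fit an elastic-net-penalized logistic regression and keep the genes with nonzero coefficients; the data below are a synthetic placeholder with the Leukemia dataset's dimensions, and the penalty settings are illustrative.

```python
# Baseline elastic-net variable selection on a 72 x 3051 matrix:
# fit a penalized logistic regression, keep the nonzero-coefficient genes.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in with the Leukemia dataset's shape (72 samples, 3051 genes).
X, y = make_classification(n_samples=72, n_features=3051,
                           n_informative=20, random_state=0)

model = LogisticRegression(penalty="elasticnet", solver="saga",
                           l1_ratio=0.5, C=0.5, max_iter=10000)
model.fit(X, y)

selected = np.flatnonzero(model.coef_[0])  # indices of retained genes
print(f"{selected.size} of {X.shape[1]} genes retained")
```

The L1 part of the elastic-net penalty drives most coefficients exactly to zero, which is what makes the fitted model double as a variable selector on data where genes far outnumber samples.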
Text clustering consists of grouping objects of similar categories. The initial centroids influence the behavior of the system, which can become trapped in local optima. A second issue is the impact of a very large number of features on the determination of optimal initial centroids; this dimensionality problem can be mitigated by feature selection. Therefore, Wind Driven Optimization (WDO) was employed for feature selection to remove unimportant words from the text. In addition, the study integrated WDO as a novel clustering optimization technique to effectively determine the most suitable initial centroids, as sketched below. The results showed that the new meta-heuristic, WDO, employed as …
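A minimal sketch of the second idea, seeding k-means with optimizer-chosen initial centroids; the WDO search itself is replaced by a hypothetical `wdo_centroids` placeholder, since the abstract does not give its update rules, and the toy documents are illustrative.

```python
# Seeding k-means with externally optimized initial centroids.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

docs = ["wind storms and pressure", "storms over the sea",
        "clustering text documents", "documents grouped by topic"]
X = TfidfVectorizer().fit_transform(docs).toarray()

def wdo_centroids(X, k, seed=0):
    """Hypothetical stand-in for the WDO search: here, k random documents."""
    rng = np.random.default_rng(seed)
    return X[rng.choice(X.shape[0], size=k, replace=False)]

init = wdo_centroids(X, k=2)
km = KMeans(n_clusters=2, init=init, n_init=1).fit(X)
print(km.labels_)
```

Passing an explicit array as `init` (with `n_init=1`) is how scikit-learn's k-means accepts centroids chosen by an outside optimizer instead of its own random or k-means++ seeding.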
Heart disease is a significant and impactful health condition that ranks as the leading cause of death in many countries. Clinical datasets are available to aid physicians in diagnosing cardiovascular diseases. However, with the rise of big data and large medical datasets, it has become increasingly challenging for medical practitioners to predict heart disease accurately, because abundant unrelated and redundant features add computational complexity and hurt accuracy. This study therefore aims to identify the most discriminative features within high-dimensional datasets, minimizing complexity and improving accuracy through an Extra Tree feature-selection-based technique. The study assesses the efficacy …
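A minimal sketch of Extra-Trees-based feature selection, assuming the common impurity-importance approach available in scikit-learn; the dataset shape and the "mean importance" threshold are placeholders, not the study's settings.

```python
# Extra-Trees feature selection: rank features by impurity importance,
# keep those above the mean importance, then work with the reduced set.
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.feature_selection import SelectFromModel

X, y = make_classification(n_samples=500, n_features=60,
                           n_informative=12, random_state=0)

forest = ExtraTreesClassifier(n_estimators=200, random_state=0).fit(X, y)
selector = SelectFromModel(forest, threshold="mean", prefit=True)
X_reduced = selector.transform(X)
print(X.shape, "->", X_reduced.shape)
```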
The goal of this work is to check for the presence of a photon number splitting (PNS) attack in a quantum cryptography system based on the BB84 protocol, and to obtain the maximum possible secure key length. This was achieved by randomly interleaving decoy states with mean photon numbers of 5.38, 1.588, and 0.48 among signal states with mean photon numbers of 2.69, 0.794, and 0.24. The average length of a secure key obtained from the system, after discarding the cases with eavesdropping, was 125 with 20% decoy states and 82 with 50% decoy states, for a signal-state mean photon number of 0.794 and a decoy-state mean photon number of 1.588.
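For context, the standard decoy-state analysis (not spelled out in the abstract) detects PNS by comparing overall gains across mean photon numbers: for a phase-randomized weak coherent source of mean photon number μ, the measured gain is a fixed Poissonian mixture of the photon-number yields Yₙ, so an attack that alters the Yₙ cannot fit the signal and decoy gains simultaneously.

```latex
% Gain of a phase-randomized weak coherent source with mean photon number \mu;
% comparing Q for the signal setting (e.g. \mu = 0.794) and the decoy setting
% (e.g. \nu = 1.588) constrains the yields Y_n and exposes PNS tampering.
Q_\mu \;=\; \sum_{n=0}^{\infty} Y_n \,\frac{\mu^{n} e^{-\mu}}{n!}
```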
The majority of DDoS attacks use the TCP protocol, and the TCP SYN flooding attack is the most common among them. The SYN Cookie mechanism is used to defend against the TCP SYN flooding attack. It is an effective defense, but it has the disadvantages of high computational cost and of not differentiating spoofed packets from legitimate ones. Filtering spoofed packets can therefore effectively enhance the SYN Cookie's operation. Hop Count Filtering (HCF) is another mechanism used at the server side to filter spoofed packets, but its drawback is that it is not by itself a complete and final defense against the TCP SYN flooding attack. An enhanced mechanism integrating the SYN Cookie with Hop Count Filtering (HCF) …
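A minimal sketch of the HCF side of this combination, assuming the usual trick of inferring hop count from a packet's TTL by rounding up to the nearest common initial TTL (64, 128, 255); the per-source baseline table and the tolerance value are hypothetical.

```python
# Hop Count Filtering sketch: infer hops from the received TTL and compare
# with a stored per-source baseline; a large mismatch suggests spoofing.
COMMON_INITIAL_TTLS = (64, 128, 255)

def inferred_hops(ttl: int) -> int:
    """Distance from the smallest common initial TTL >= the received TTL."""
    initial = min(t for t in COMMON_INITIAL_TTLS if t >= ttl)
    return initial - ttl

# Hypothetical baseline learned from prior legitimate traffic.
hop_table = {"203.0.113.7": 12}

def is_spoofed(src_ip: str, ttl: int, tolerance: int = 2) -> bool:
    expected = hop_table.get(src_ip)
    if expected is None:
        return False  # no baseline yet; defer to other defenses
    return abs(inferred_hops(ttl) - expected) > tolerance

print(is_spoofed("203.0.113.7", ttl=52))   # 12 hops from 64 -> legitimate
print(is_spoofed("203.0.113.7", ttl=120))  # 8 hops from 128 -> flagged
```

The appeal of pairing this with SYN Cookies is that the TTL check is cheap and drops many spoofed SYNs before the cookie computation ever runs.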
A botnet is a malicious activity that tries to disrupt service traffic in a server or network and can cause great harm. In recent years, botnets have become one of the threats that constantly evolve. An intrusion detection system (IDS) is one type of solution used to detect network anomalies, and it plays an increasing role in computer security and information systems. It monitors events in a computer to decide whether an intrusion has occurred, and it is used to support strategic security decisions. The current paper …
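As a generic illustration of the anomaly-based IDS idea (the paper's own detector is not described in the visible text), an unsupervised model such as an Isolation Forest can flag network-flow records that deviate from learned normal behavior; the flow features below are hypothetical.

```python
# Generic anomaly-based IDS sketch: fit on (mostly) normal flow features,
# then flag records the model scores as outliers (-1).
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
# Hypothetical flow features: [duration_s, bytes_sent, packet_count]
normal = rng.normal(loc=[1.0, 500.0, 10.0],
                    scale=[0.2, 50.0, 2.0], size=(500, 3))
botnet = np.array([[0.01, 40000.0, 900.0]])  # burst typical of flooding

ids = IsolationForest(contamination=0.01, random_state=1).fit(normal)
print(ids.predict(normal[:3]))  # mostly 1 (normal)
print(ids.predict(botnet))      # -1 (anomalous)
```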
Cloud computing is a mass platform serving high-volume data from many devices and technologies. Cloud tenants demand fast access to their data without any disruptions. Therefore, cloud providers struggle to ensure that every individual piece of data is secured and always accessible. Hence, an appropriate replication strategy capable of selecting essential data is required in cloud replication environments. This paper proposes a Crucial File Selection Strategy (CFSS) to address poor response time in a cloud replication environment. The CloudSim cloud simulator is used to conduct the necessary experiments, and results are presented as evidence of the improvement in replication performance. The obtained …