In recent years, the world has witnessed rapid growth in attacks on the Internet, degrading network performance. The growth has been in both the quantity and the versatility of attacks. Coping with this requires new detection techniques, especially ones that use Artificial Intelligence techniques such as machine learning based intrusion detection and prevention systems. Many machine learning models are used for intrusion detection, each with its own pros and cons, and this is where this paper falls: a performance analysis of different machine learning models for Intrusion Detection Systems based on supervised machine learning algorithms. Using the Python Scikit-Learn library, KNN, Support Vector Machine, Naïve Bayes, Decision Tree, Random Forest, Stochastic Gradient Descent, Gradient Boosting, and AdaBoost classifiers were designed. A performance analysis using confusion matrix metrics was carried out and the classifiers were compared. As a case study, Information Gain, Pearson, and F-test feature selection techniques were applied, and the results were compared to models that use all the features. One notable outcome is that the Random Forest classifier achieves the best performance, with an accuracy of 99.96% and an error margin of 0.038%, surpassing the other classifiers. With an 80% reduction in features and parameters extracted from the packet header rather than the payload, a large performance advantage is achieved, especially in online environments.
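As a hedged illustration of the workflow this abstract describes, the sketch below compares a Random Forest trained on all features against one trained on an F-test-reduced feature set in Scikit-Learn. The dataset, hyperparameters, and preprocessing are not given in the abstract, so synthetic data and default settings stand in; none of the numbers here reproduce the paper's results.

```python
# Minimal sketch of the abstract's workflow with Scikit-Learn.
# Synthetic data stands in for the (unspecified) intrusion dataset;
# all values below are illustrative, not the paper's results.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20,
                           n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0)

# Baseline: Random Forest on all 20 features.
rf_all = RandomForestClassifier(n_estimators=100, random_state=0)
rf_all.fit(X_tr, y_tr)
acc_all = accuracy_score(y_te, rf_all.predict(X_te))

# Case study: keep only the top 4 features by ANOVA F-test
# (an 80% reduction, mirroring the abstract's setup).
selector = SelectKBest(f_classif, k=4).fit(X_tr, y_tr)
rf_sel = RandomForestClassifier(n_estimators=100, random_state=0)
rf_sel.fit(selector.transform(X_tr), y_tr)
y_pred = rf_sel.predict(selector.transform(X_te))
acc_sel = accuracy_score(y_te, y_pred)

# Confusion matrix metrics, as used for the comparisons in the paper.
cm = confusion_matrix(y_te, y_pred)
print(acc_all, acc_sel)
print(cm)
```

The same comparison loop can be repeated over the other classifiers named above (swapping in `KNeighborsClassifier`, `SVC`, `GaussianNB`, and so on) to reproduce the kind of side-by-side analysis the paper performs.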
The research aims to identify the theoretical foundations for measuring and analyzing quality costs and continuous improvement, and to measure and analyze quality costs for the Directorate of Electricity Supply / Middle Euphrates with a view to continuous improvement of electrical energy distribution. The problem is represented by the high costs of failure and the waste of electrical energy resulting from overloads on the network and missing (lost) energy. Thus, measuring and analyzing quality costs for the distribution of electrical energy and identifying continuous improvement lead to a reduction in losses and an increase in sales. The research reached many conclusions, the most important of which is the high percentage o
The study was conducted to show the effect of using dried rumen powder as a source of animal protein in the diets of common carp (Cyprinus carpio L.) on its performance. The experiment ran for 70 days in the fish laboratory, College of Agricultural Engineering Sciences, University of Baghdad. Seventy fingerlings with an average starting weight of 30±3 g and a live mass rate of 202±2 g were randomly distributed among five treatments, with two replicates per treatment and seven fish per replicate. Five diets of almost identical protein content but different percentages of dried rumen powder were formulated: 25% was added to treatment T2, 50% to treatment T3, 75% to treatment T4, and 100% to treatment T5
This study aims to investigate academic leaders' perceptions of the degree of availability of the dimensions of organizational immunity systems at the University of Tabuk, and to reveal any statistically significant differences between the average responses of the study sample about the degree of availability of these dimensions at their university attributable to the variables of gender, leadership position, and college specialization. To achieve the objectives of the study, a descriptive survey method was used. The study population consisted of 200 male and female academic leaders at the University of Tabuk. A questionnaire was used as the data collection tool, whose validity and reliability
The biosorption of lead (II) and chromium (III) onto dead anaerobic biomass (DAB) in single and binary systems has been studied using a fixed bed adsorber. A general rate multi-component model (GRM) has been utilized to predict the fixed bed breakthrough curves for single- and dual-component systems. This model considers both external and internal mass transfer resistances as well as axial dispersion, with a non-linear multi-component isotherm (Langmuir model). The effects of important parameters, such as flow rate, initial concentration, and bed height, on the behavior of the breakthrough curves have been studied. The equilibrium isotherm model parameters, such as the maximum uptake capacities for lead (II) and chromium (III), were found to be 35.12 and
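The Langmuir isotherm named above has a standard closed form, q = q_max · b · C / (1 + b · C). A small sketch of that relation follows; the parameter values used in the example are illustrative only, not the constants fitted in the study.

```python
def langmuir_uptake(q_max, b, c):
    """Langmuir isotherm: equilibrium uptake q = q_max * b * C / (1 + b * C).

    q_max -- maximum uptake capacity (mg/g)
    b     -- Langmuir affinity constant (L/mg)
    c     -- equilibrium liquid-phase concentration (mg/L)
    """
    return q_max * b * c / (1.0 + b * c)

# Illustrative values only (not the study's fitted parameters):
# with b * C = 1 the uptake is exactly half of q_max.
print(langmuir_uptake(35.0, 0.1, 10.0))  # 17.5
```

In a multi-component (competitive) Langmuir form, each metal's denominator would additionally include the b·C terms of the other components, which is how the GRM couples the lead (II) and chromium (III) uptakes in the binary system.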
Abstract
The research seeks to shed light on green accounting information systems and analyze them, to identify sustainability reporting and how to improve it, and to study the importance of the Iraqi oil sector, analyze it, and work on applying green accounting information systems in order to improve the quality of sustainability reporting. A branch of the General Corporation for the Distribution of Oil and Gas Products was selected to apply the practical side and test the research hypothesis, explaining the company's role in improving environmental conditions
This study deals with the elimination of methyl orange (MO) from an aqueous solution utilizing the 3D electro-Fenton process in a batch reactor with a porous graphite anode and a copper foam cathode, in the presence of granular activated carbon (GAC) as a third electrode. Response surface methodology (RSM) combined with a Box-Behnken Design (BBD) was employed to study the effects of operational conditions, such as current density (3–8 mA/cm2), electrolysis time (10–20 min), and the amount of GAC (1–3 g), on the removal efficiency, as well as their interactions. The model was valid since the R2 value was high (>0.98), and the current density had the greatest influence on the response. The best removal efficiency (MO Re%)
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance; unfortunately, many applications have small or inadequate data to train DL frameworks. Manual labeling is usually needed to provide labeled data, which typically involves human annotators with extensive background knowledge. This annotation process is costly, time-consuming, and error-prone. Every DL framework is usually fed a significant amount of labeled data to automatically learn representations; ultimately, more data would generally yield a better DL model, although performance is also application dependent. This issue is the main barrier for
Delays occur commonly in construction projects, and assessing the impact of a delay is sometimes a contentious issue. Several delay analysis methods are available, but no single method can be used universally in all situations. Selecting the proper analysis method depends on a variety of factors, including the information available, the time of analysis, the capabilities of the methodology, and the time, funds, and effort allocated to the analysis. This paper presents a computerized schedule analysis program that uses the daily windows analysis method, as it is recognized as one of the most credible methods and is one of the few techniques more likely to be accepted by courts than any other method. A simple case study has been implemented
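The core idea of windows analysis is to apportion total delay to the window in which each slip occurred, by comparing the forecast project finish at the end of each window with the forecast at the start of it. A minimal, hypothetical sketch of that bookkeeping for daily windows follows; the function name and inputs are assumptions for illustration, not taken from the paper's program.

```python
# Hypothetical sketch of daily "windows" delay apportionment:
# the incremental slip in each daily window is the change in the
# forecast project finish recorded at the end of that day.
def daily_windows_delay(planned_finish, daily_forecasts):
    """Return the incremental delay attributed to each daily window.

    planned_finish  -- baseline project finish (day number)
    daily_forecasts -- forecast finish day recorded at the end of each day
    """
    delays = []
    previous = planned_finish
    for forecast in daily_forecasts:
        delays.append(forecast - previous)  # slip added in this window
        previous = forecast
    return delays

# Example: baseline finish on day 100; schedule updates after days 1-4
# show slips of 2 days in window 2 and 3 days in window 4.
print(daily_windows_delay(100, [100, 102, 102, 105]))  # [0, 2, 0, 3]
```

Because each window's slip is computed against a freshly updated schedule, responsibility for each increment can be assigned to whatever event occurred in that window, which is why the method is considered credible in disputes.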