With the proliferation of Internet access and data traffic, recent breaches have brought into sharp focus the need for Network Intrusion Detection Systems (NIDS) to protect networks from increasingly complex cyberattacks. To differentiate between normal network activity and possible attacks, Intrusion Detection Systems (IDS) often employ pattern recognition and data mining techniques. An IDS can automatically detect and classify network and host system intrusions, attacks, and policy violations. Using Python Scikit-Learn, the results of this study show that Machine Learning (ML) techniques such as Decision Tree (DT), Naïve Bayes (NB), and K-Nearest Neighbor (KNN) can enhance the effectiveness of an IDS. Success is measured by a variety of metrics, including accuracy, precision, recall, F1-score, and execution time. Applying feature selection approaches such as Analysis of Variance (ANOVA), Mutual Information (MI), and Chi-Square (Chi-2) reduced execution time, increased detection efficiency and accuracy, and boosted overall performance. All classifiers achieve their best performance, 99.99% accuracy with the shortest computation time of 0.0089 seconds, when using ANOVA with 10% of the features.
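A minimal Scikit-Learn sketch of this setup: ANOVA feature selection keeping the top 10% of features, feeding each of the three classifiers. The dataset itself is not named in the abstract, so a synthetic placeholder stands in for the intrusion-detection data.

```python
# Sketch of the ANOVA (top 10%) + DT/NB/KNN pipeline described above.
# make_classification is a placeholder for the (unnamed) NIDS dataset.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectPercentile, f_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import classification_report

X, y = make_classification(n_samples=5000, n_features=40, n_informative=8,
                           random_state=0)  # stand-in for NIDS records
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("DT", DecisionTreeClassifier()),
                  ("NB", GaussianNB()),
                  ("KNN", KNeighborsClassifier())]:
    # Keep only the 10% of features with the highest ANOVA F-score.
    pipe = make_pipeline(SelectPercentile(f_classif, percentile=10), clf)
    pipe.fit(X_tr, y_tr)
    print(name, classification_report(y_te, pipe.predict(X_te)))
```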
Ionospheric research has shown that calculation of the total electron content (TEC) is an important factor in global navigation satellite systems. In this study, TEC was calculated over Baghdad city, Iraq, using a combination of two numerical methods, the composite Simpson and composite Trapezoidal rules. TEC was computed as the line integral of the electron density derived from the International Reference Ionosphere (IRI2012) and NeQuick2 models from 70 to 2000 km above the Earth's surface. The hour of the day, the day number of the year, and the smoothed sunspot number R12 were chosen as inputs for the calculation techniques to account for the latitudinal, diurnal, and seasonal variation of TEC. The results of the latitudinal variation of TEC ...
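A sketch of the composite Simpson rule applied to a vertical electron-density profile. The density function below is an illustrative Chapman-layer stand-in, not the IRI2012/NeQuick2 output used in the study.

```python
import numpy as np

def composite_simpson(f, a, b, n):
    """Composite Simpson's rule on [a, b] with n (even) subintervals."""
    if n % 2:
        raise ValueError("n must be even")
    x = np.linspace(a, b, n + 1)
    y = f(x)
    h = (b - a) / n
    return h / 3 * (y[0] + y[-1] + 4 * y[1:-1:2].sum() + 2 * y[2:-1:2].sum())

# Illustrative Chapman-layer electron density (electrons/m^3); NOT the
# IRI2012/NeQuick2 profile from the paper.
def Ne(h_km, NmF2=1e12, hmF2=300.0, H=60.0):
    z = (h_km - hmF2) / H
    return NmF2 * np.exp(0.5 * (1 - z - np.exp(-z)))

# TEC = line integral of Ne over height, 70-2000 km; km -> m, then to TECU.
tec = composite_simpson(Ne, 70.0, 2000.0, 1000) * 1e3   # electrons/m^2
print(f"TEC ~ {tec / 1e16:.2f} TECU")
```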
Time series models often suffer from outliers that accompany the data collection process for many reasons, and their presence can have a significant impact on the estimation of the parameters of the studied model. Obtaining highly efficient estimators is one of the most important stages of statistical analysis, and it is therefore important to choose appropriate methods that yield good estimators. The aim of this research is to compare the ordinary estimators and the robust estimators for estimating the parameters of ...
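The abstract does not name the model or the estimators compared; as a hedged illustration of the ordinary-versus-robust contrast, the sketch below fits an AR(1) coefficient by ordinary least squares and by a robust Huber M-estimator after the series is contaminated with additive outliers.

```python
# Hedged illustration (model and estimators are assumptions, not the paper's):
# OLS vs. a robust Huber M-estimate of an AR(1) coefficient under outliers.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
phi_true = 0.7
y = np.zeros(500)
for t in range(1, 500):                      # simulate a clean AR(1) series
    y[t] = phi_true * y[t - 1] + rng.normal()
y[rng.choice(500, 10)] += 15.0               # inject additive outliers

X = sm.add_constant(y[:-1])                  # lagged series as regressor
ols = sm.OLS(y[1:], X).fit()                 # ordinary estimator
rlm = sm.RLM(y[1:], X, M=sm.robust.norms.HuberT()).fit()  # robust estimator
print(f"true phi={phi_true}, OLS={ols.params[1]:.3f}, Huber={rlm.params[1]:.3f}")
```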
The research presents reliability, defined as the probability that a part of the system accomplishes its function within a specified time and under the same circumstances. On the theoretical side, the reliability, the reliability function, and the cumulative failure function are studied under the one-parameter Rayleigh distribution. This research aims to uncover factors missed in reliability evaluation that cause constant interruptions of the machines, in addition to problems with the data. The problem addressed is that, although there are many methods for estimating the reliability function, most of them are not well suited to data such ...
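For reference, under the one-parameter Rayleigh distribution with scale parameter σ, the standard density, reliability, and cumulative failure functions are:

```latex
f(t) = \frac{t}{\sigma^2} \exp\!\left(-\frac{t^2}{2\sigma^2}\right), \qquad
R(t) = \Pr(T > t) = \exp\!\left(-\frac{t^2}{2\sigma^2}\right), \qquad
F(t) = 1 - R(t), \quad t \ge 0.
```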
With the revolutionary expansion of the Internet, the worldwide growth of information increases the application of communication technology, and the rapid growth of data volume raises the requirement for secure, robust, and reliable techniques built on effective algorithms. Many algorithms and techniques are available for data security. This paper presents a cryptosystem that combines several substitution cipher algorithms with a circular queue data structure. The two substitution techniques, the Homophonic Substitution Cipher and the Polyalphabetic Substitution Cipher, are merged in a single circular queue with four different keys for each of them, producing eight different outputs for ...
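The paper's exact key schedule and homophonic tables are not given in the abstract; as a hedged sketch of one ingredient, the code below shows a polyalphabetic substitution cipher whose per-letter shift key is drawn from a circular queue.

```python
# Sketch of the circular-queue idea only: a polyalphabetic substitution
# cipher that cycles through a queue of keys, one key per letter.
from collections import deque

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def encrypt(plaintext: str, keys: list[str]) -> str:
    queue = deque(keys)                # circular queue of single-letter keys
    out = []
    for ch in plaintext.upper():
        if ch not in ALPHABET:
            out.append(ch)             # pass non-letters through unchanged
            continue
        key = queue[0]
        queue.rotate(-1)               # advance the circular queue
        shift = ALPHABET.index(key)
        out.append(ALPHABET[(ALPHABET.index(ch) + shift) % 26])
    return "".join(out)

print(encrypt("ATTACK AT DAWN", ["K", "E", "Y", "S"]))
```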
In this paper, a new hybrid algorithm for a linear programming model of aggregate production planning problems is proposed. The new algorithm hybridizes simulated annealing (SA) and particle swarm optimization (PSO). The PSO component is employed to strike a good balance between exploration and exploitation in SA, so that the method is both effective and efficient (in speed and solution quality) for solving the linear programming model. The results show that the proposed approach reaches its solutions within a reasonable computational time compared with the PSO and SA algorithms alone.
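The abstract does not specify how SA and PSO are coupled; one common pattern, sketched below on a toy objective, applies standard PSO velocity updates with an SA-style acceptance test deciding whether a particle keeps a worse position.

```python
# Hedged sketch of one common SA/PSO coupling (not the paper's exact scheme).
import numpy as np

rng = np.random.default_rng(0)
f = lambda x: np.sum(x**2, axis=-1)          # toy objective to minimize

n, dim, T = 30, 5, 1.0
x = rng.uniform(-5, 5, (n, dim)); v = np.zeros((n, dim))
pbest, pval = x.copy(), f(x)
gbest = pbest[pval.argmin()]

for it in range(200):
    r1, r2 = rng.random((2, n, dim))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    cand = x + v
    # SA acceptance: keep improvements; accept worse moves with prob e^(-d/T).
    worse = f(cand) - f(x)
    accept = (worse <= 0) | (rng.random(n) < np.exp(-np.clip(worse, 0, None) / T))
    x[accept] = cand[accept]
    better = f(x) < pval
    pbest[better], pval[better] = x[better], f(x)[better]
    gbest = pbest[pval.argmin()]
    T *= 0.98                                 # cool the temperature
print("best value:", pval.min())
```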
Algorithms using the second-order B-splines B_{i,2}(x) and the third-order B-splines B_{i,3}(x) are derived to solve 1st-, 2nd-, and 3rd-order linear Fredholm integro-differential equations (FIDEs). These new procedures retain all the useful properties of B-spline functions and can be used with comparatively greater computational ease and efficiency. The results of these algorithms are compared with the cubic spline function, and two numerical examples are given to illustrate the results of this method.
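These basis functions follow the standard Cox–de Boor recursion on a knot sequence {x_i} (order-k convention, matching the B_{i,2}, B_{i,3} indexing above, is assumed here):

```latex
B_{i,1}(x) = \begin{cases} 1, & x_i \le x < x_{i+1} \\ 0, & \text{otherwise}, \end{cases}
\qquad
B_{i,k}(x) = \frac{x - x_i}{x_{i+k-1} - x_i}\,B_{i,k-1}(x)
           + \frac{x_{i+k} - x}{x_{i+k} - x_{i+1}}\,B_{i+1,k-1}(x).
```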
Brachytherapy is primarily used for treating certain kinds of cancerous tumors. Using radionuclides to treat tumors has been studied for a very long time, but the introduction of mathematical (radiobiological) models has made treatment planning easier. Mathematical models help compute the survival probabilities of irradiated tissues and cancer cells. With the expanding use of high-dose-rate (HDR) and low-dose-rate (LDR) brachytherapy for the treatment of cancer, fractionated dose treatment plans are required to irradiate the tumor. In this paper, the authors discuss the dose calculation algorithms used in brachytherapy treatment planning. Precise and less time-consuming calculations ...
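The workhorse radiobiological model behind such survival calculations is the standard linear-quadratic (LQ) model (the abstract mentions radiobiological models without naming one): the surviving fraction after a total dose D delivered in n fractions of d = D/n is

```latex
S = \exp\!\left[-n\,(\alpha d + \beta d^2)\right]
  = \exp\!\left[-\alpha D\left(1 + \frac{d}{\alpha/\beta}\right)\right],
```

where α and β are tissue-specific radiosensitivity parameters and the bracketed factor D(1 + d/(α/β)) is the biologically effective dose (BED) used to compare fractionation schemes.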
In this paper, we propose two new predictor-corrector methods for solving Kepler's equation in the hyperbolic case using a quadrature formula, which plays an important and significant role in the evaluation of the integrals. The two procedures are developed so that, in two or three iterations, they solve the hyperbolic orbit equation in a very efficient manner, to an accuracy that proves to be always better than 10^-15. The solution is examined over ranges of the eccentricity and the hyperbolic mean anomaly with varying grid sizes, using first guesses for the hyperbolic eccentric anomaly expressed in terms of the eccentricity and the hyperbolic mean anomaly.
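The quadrature-based predictor-corrector itself is not reproduced in the abstract; for orientation, a plain Newton iteration on the hyperbolic Kepler equation M = e sinh(H) − H, the baseline such methods improve on, looks like this:

```python
# Baseline for comparison (not the paper's quadrature-based method): Newton
# iteration on the hyperbolic Kepler equation  M = e*sinh(H) - H.
import math

def solve_hyperbolic_kepler(M: float, e: float, tol: float = 1e-15) -> float:
    H = math.log(2 * abs(M) / e + 1.8)         # a common starting guess
    H = math.copysign(H, M)
    for _ in range(50):
        f = e * math.sinh(H) - H - M           # residual
        fp = e * math.cosh(H) - 1              # derivative
        dH = f / fp
        H -= dH
        if abs(dH) < tol:
            break
    return H

H = solve_hyperbolic_kepler(M=5.0, e=2.5)
print(H, 2.5 * math.sinh(H) - H)               # should reproduce M = 5.0
```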
Enhanced-quality image fusion was proposed using new algorithms for auto-focus image fusion. The first algorithm is based on the standard deviation to combine two images. The second algorithm uses the contrast at edge points and a correlation method as the criterion for the quality of the resulting image. This algorithm considers three blocks of different sizes in a homogeneous region and moves each by 10 pixels within the same region, examining the statistical properties of the block and automatically deciding the next step. The resulting combined image has better contrast ...
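A hedged sketch of the first algorithm's idea (implementation details are not given in the abstract): fuse two differently focused images block by block, keeping the block with the higher standard deviation, i.e. the sharper one.

```python
import numpy as np

def fuse_by_std(img_a: np.ndarray, img_b: np.ndarray, block: int = 16) -> np.ndarray:
    """Block-wise fusion: per block, keep the input with the larger std."""
    fused = img_a.copy()
    h, w = img_a.shape
    for i in range(0, h, block):
        for j in range(0, w, block):
            a = img_a[i:i + block, j:j + block]
            b = img_b[i:i + block, j:j + block]
            if b.std() > a.std():      # higher std ~ more in-focus detail
                fused[i:i + block, j:j + block] = b
    return fused

# Usage: both inputs are grayscale arrays of the same shape.
a = np.random.rand(128, 128); b = np.random.rand(128, 128)
print(fuse_by_std(a, b).shape)
```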
The research deals with a concept that is modern in both its applications and the studies devoted to it, as urban densification is one of the most recent sustainable development strategies for cities.
Studies looking at the relationship between densification and livability show mixed results. This study sheds light on how the built environment of dense urban areas affects the residents' perceived quality of life. How to increase acceptance of dense living is an important question to investigate.
Adopting the concept of urban densification in city planning policies, so that cities become more sustainable and livable, is of great importance: it achieves efficient use of urban land and limits urban sprawl, as well as reducing the ...