In this paper, a new variable selection method is presented for selecting essential variables from large datasets. The new model is a modified version of the Elastic Net, and the modified Elastic Net variable selection procedure is summarized as an algorithm. It is applied to the Leukemia dataset, which contains 3051 variables (genes) and 72 samples; working with a dataset of this size is difficult in practice. The modified model is compared with several standard variable selection methods and shows the best performance, achieving perfect classification. All calculations for this paper were carried out in the R program using existing packages.
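The abstract notes the computations were done in R with existing packages; as a rough Python illustration of cross-validated Elastic Net variable selection on a p >> n matrix (the paper's specific modification and the actual Leukemia data are not reproduced, and the placeholder arrays below are assumptions), a sketch might look like this:

```python
# Hedged sketch: standard Elastic Net variable selection with scikit-learn.
# The paper's modified Elastic Net and its Leukemia pipeline are not shown;
# X (72 x 3051 gene expressions) and y (class labels) are placeholders.
import numpy as np
from sklearn.linear_model import ElasticNetCV

rng = np.random.default_rng(0)
X = rng.normal(size=(72, 3051))                 # placeholder gene-expression matrix
y = rng.integers(0, 2, size=72).astype(float)   # placeholder binary class labels

# Cross-validated Elastic Net, regressing the class label directly as a
# simple screening step; l1_ratio balances the lasso and ridge penalties.
model = ElasticNetCV(l1_ratio=[0.2, 0.5, 0.8], cv=5, max_iter=10000)
model.fit(X, y)

selected = np.flatnonzero(model.coef_)          # genes with non-zero coefficients
print(f"{selected.size} variables selected out of {X.shape[1]}")
```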
The Internet of Things (IoT) has significantly transformed modern systems through extensive connectivity but has also concurrently introduced considerable cybersecurity risks. Traditional rule-based methods are becoming increasingly insufficient in the face of evolving cyber threats. This study proposes an enhanced methodology utilizing a hybrid machine-learning framework for IoT cyber-attack detection. The framework integrates a Grey Wolf Optimizer (GWO) for optimal feature selection, a customized synthetic minority oversampling technique (SMOTE) for data balancing, and a systematic approach to hyperparameter tuning of ensemble algorithms: Random Forest (RF), XGBoost, and CatBoost. Evaluations on the RT-IoT2022 dataset demonstrate …
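A minimal sketch of the balancing and ensemble-tuning stages described above, assuming a preprocessed feature matrix and attack labels; the GWO feature-selection step, the XGBoost/CatBoost models, and the RT-IoT2022 loading code are omitted, and the placeholder data are assumptions:

```python
# Hedged sketch: SMOTE balancing plus a simple grid search over a Random Forest,
# as a stand-in for the paper's systematic hyperparameter tuning of ensembles.
import numpy as np
from imblearn.over_sampling import SMOTE
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))                      # placeholder features
y = np.r_[np.zeros(900), np.ones(100)].astype(int)   # imbalanced placeholder labels

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# Balance the minority class on the training split only.
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_train, y_train)

grid = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 10]},
    cv=3,
)
grid.fit(X_bal, y_bal)
print("held-out accuracy:", grid.score(X_test, y_test))
```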
In this paper, eight methods for generating the initial value are compared, along with the impact of these methods on estimating the parameter of an autoregressive model. Three of the estimation methods most commonly used by researchers are applied: the maximum likelihood (ML) method, the Burg method, and the least squares method. The comparison is carried out by simulating a first-order autoregressive model through a number of simulation experiments with different sample sizes.
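As a simple illustration of the simulation setup described above, the sketch below generates a first-order autoregressive series from a zero initial value and estimates its parameter by conditional least squares; the paper's eight initial-value schemes and its maximum likelihood and Burg estimators are not reproduced:

```python
# Hedged sketch: simulate an AR(1) series and estimate phi by conditional
# least squares (regressing y_t on y_{t-1}).
import numpy as np

rng = np.random.default_rng(0)
phi_true, n = 0.6, 200

# Simulate y_t = phi * y_{t-1} + e_t, starting from a zero initial value
# (one of several possible initial-value choices).
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi_true * y[t - 1] + rng.normal()

phi_hat = (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])
print(f"true phi = {phi_true}, estimated phi = {phi_hat:.3f}")
```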
Interest has grown among researchers in recent years in determining the optimum sample size needed to obtain sufficiently accurate, high-precision parameter estimates when a large number of diagnostic tests are evaluated simultaneously. In this research, two methods were used to determine the optimum sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model is then estimated at the sample size given by each method, using an artificial neural network (ANN), as it gives high-precision estimates commensurate with the data …
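One plausible way to turn Bennett's inequality into a sample-size rule (the paper's exact formulation may differ) is to find the smallest n for which the two-sided Bennett bound on the sample mean's deviation falls below a target error probability; the bound on the observations and their variance are assumed known here:

```python
# Hedged sketch: sample size from Bennett's inequality so that the sample mean
# is within eps of the true mean with probability at least 1 - delta.
import math

def bennett_sample_size(eps: float, delta: float, a: float, sigma2: float) -> int:
    """Smallest n with 2*exp(-(n*sigma2/a**2) * h(a*eps/sigma2)) <= delta,
    where h(u) = (1 + u)*log(1 + u) - u and |X_i - mu| <= a almost surely."""
    u = a * eps / sigma2
    h = (1 + u) * math.log(1 + u) - u
    n = a ** 2 * math.log(2 / delta) / (sigma2 * h)
    return math.ceil(n)

# Example: observations bounded by 1, variance 0.25, precision 0.1, confidence 95%.
print(bennett_sample_size(eps=0.1, delta=0.05, a=1.0, sigma2=0.25))
```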
Rate of penetration plays a vital role in the field development process because drilling is expensive: its cost includes the equipment and materials used during rock penetration and the effort of the crew needed to complete the well without major problems. It is important to finish the well as soon as possible to reduce expenditure, so knowing the rate of penetration in the area to be drilled helps in estimating the cost and therefore in optimizing drilling outlays. In this research, an intelligent model was built using artificial intelligence to achieve this goal. The model was built using an adaptive neuro-fuzzy inference system (ANFIS) to predict the rate of penetration in …
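ANFIS itself is not part of the standard Python scientific stack, so the sketch below shows only the core of a first-order Sugeno fuzzy model with fixed Gaussian membership functions and least-squares-fitted consequents (the least-squares half of ANFIS hybrid learning); the drilling inputs and target below are simulated placeholders, not the paper's data:

```python
# Hedged sketch: first-order Sugeno fuzzy model with fixed premise parameters;
# only the linear consequent parameters are fitted, by ordinary least squares.
import numpy as np
from itertools import product

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 2))                        # placeholder drilling inputs
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(0, 0.05, 200)    # placeholder ROP target

centers = [0.25, 0.75]                                      # two Gaussian MFs per input
sigma = 0.2
rules = list(product(centers, repeat=X.shape[1]))           # 4 rules on a 2x2 grid

def firing_strengths(X):
    """Normalized rule firing strengths from Gaussian memberships."""
    w = np.ones((X.shape[0], len(rules)))
    for r, cs in enumerate(rules):
        for j, c in enumerate(cs):
            w[:, r] *= np.exp(-((X[:, j] - c) ** 2) / (2 * sigma ** 2))
    return w / w.sum(axis=1, keepdims=True)

# Least-squares fit of the consequent parameters (p_r, q_r, bias_r per rule).
wbar = firing_strengths(X)
design = np.hstack([wbar[:, [r]] * np.hstack([X, np.ones((len(X), 1))])
                    for r in range(len(rules))])
theta, *_ = np.linalg.lstsq(design, y, rcond=None)

y_hat = design @ theta
print("RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))
```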
In some cases, researchers need to know the causal effect of a treatment on the sample in order to decide whether to continue giving the treatment or to stop it because it is of no use. The local weighted least squares method was used to estimate the parameters of the fuzzy regression discontinuity model, and the local polynomial method was used to estimate the bandwidth. Data were generated with sample sizes (75, 100, 125, 150) and 1000 replications. An experiment was also conducted at the Innovation Institute for remedial lessons in 2021, with data collected from the 72 students participating in the institute. Those who received the treatment had an increase in their score after …
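A minimal sketch of a fuzzy regression discontinuity estimate using kernel-weighted local linear (weighted least squares) fits on either side of the cutoff; the bandwidth is fixed here rather than chosen by the paper's local polynomial method, and the data are simulated placeholders:

```python
# Hedged sketch: fuzzy RD effect = jump in the outcome at the cutoff divided by
# the jump in treatment take-up, each from a local linear WLS fit.
import numpy as np

rng = np.random.default_rng(0)
n, cutoff, h = 500, 0.0, 0.5
x = rng.uniform(-1, 1, n)                                   # running variable
d = (rng.uniform(size=n) < np.where(x >= cutoff, 0.8, 0.2)).astype(float)  # fuzzy take-up
y = 1.0 + 0.5 * x + 2.0 * d + rng.normal(0, 0.3, n)         # outcome, true effect = 2

def local_linear_jump(v):
    """Difference in kernel-weighted local linear intercepts at the cutoff."""
    jumps = []
    for side in (x >= cutoff, x < cutoff):
        xs, vs = x[side], v[side]
        w = np.clip(1 - np.abs(xs - cutoff) / h, 0, None)   # triangular kernel weights
        A = np.column_stack([np.ones_like(xs), xs - cutoff]) * np.sqrt(w)[:, None]
        beta, *_ = np.linalg.lstsq(A, vs * np.sqrt(w), rcond=None)
        jumps.append(beta[0])
    return jumps[0] - jumps[1]

effect = local_linear_jump(y) / local_linear_jump(d)
print("estimated treatment effect:", round(effect, 3))
```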
An experimental study was conducted to measure surface finish roughness produced by the magnetic abrasive finishing (MAF) technique on a brass plate, a material that is difficult to polish by conventional machining processes, where the cost is high and the surface is much more susceptible to damage than for other materials. Four operating parameters were studied: the gap between the workpiece and the electromagnetic inductor, the current that generates the flux, the rotational spindle speed, and the abrasive powder size, with a constant linear feed movement between the machine head and the workpiece. An adaptive neuro-fuzzy inference system (ANFIS) was implemented for evaluation of a series …
The Dagum regression model, introduced to address limitations in traditional econometric models, provides enhanced flexibility for analyzing data characterized by heavy tails and asymmetry, which are common in income and wealth distributions. This paper develops and applies the Dagum model, demonstrating its advantages over other distributions such as the Log-Normal and Gamma distributions. The model's parameters are estimated using Maximum Likelihood Estimation (MLE) and the Method of Moments (MoM). A simulation study evaluates both methods' performance across various sample sizes, showing that MoM tends to offer more robust and precise estimates, particularly in small samples. These findings provide valuable insights into the analysis …
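As a sketch of the maximum likelihood step for the three-parameter Dagum distribution (the paper's method-of-moments estimator and regression setting are not shown; the simulated sample and starting values are assumptions):

```python
# Hedged sketch: MLE for the Dagum distribution with density
# f(x; a, b, p) = (a*p/x) * (x/b)**(a*p) / (1 + (x/b)**a)**(p + 1),
# fitted to data drawn by inverting the CDF F(x) = (1 + (x/b)**(-a))**(-p).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
a_true, b_true, p_true = 3.0, 2.0, 1.5
u = rng.uniform(size=500)
x = b_true * (u ** (-1 / p_true) - 1) ** (-1 / a_true)      # inverse-CDF sampling

def neg_loglik(log_params, x):
    a, b, p = np.exp(log_params)                             # keep parameters positive
    z = x / b
    ll = (np.log(a) + np.log(p) - np.log(x)
          + a * p * np.log(z) - (p + 1) * np.log1p(z ** a))
    return -ll.sum()

res = minimize(neg_loglik, x0=np.log([1.0, 1.0, 1.0]), args=(x,), method="Nelder-Mead")
print("MLE (a, b, p):", np.round(np.exp(res.x), 3))
```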