In this paper, a new variable selection method is presented for selecting essential variables from large datasets. The new model is a modified version of the Elastic Net model. The modified Elastic Net variable selection model is summarized in an algorithm and applied to the Leukemia dataset, which has 3051 variables (genes) and 72 samples; datasets of this size are difficult to work with directly. The modified model is compared with several standard variable selection methods and achieves the best performance, giving perfect classification. All calculations for this paper were carried out in R using existing packages.
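The paper's computations are in R; as a rough illustration of the underlying idea, the sketch below fits an Elastic Net with scikit-learn in Python on synthetic data. The dimensions, penalty settings, and data here are assumptions for illustration only, not the Leukemia dataset or the paper's modified model.

```python
import numpy as np
from sklearn.linear_model import ElasticNet

# Synthetic stand-in for a gene-expression matrix (samples x genes);
# far fewer "genes" than the real 3051, purely for speed.
rng = np.random.default_rng(0)
n_samples, n_genes = 72, 500
X = rng.standard_normal((n_samples, n_genes))
# Only the first 10 variables truly influence the response.
beta = np.zeros(n_genes)
beta[:10] = 2.0
y = X @ beta + 0.1 * rng.standard_normal(n_samples)

# Elastic Net mixes an L1 penalty (sparsity, i.e. variable selection)
# with an L2 penalty (grouping of correlated variables) via l1_ratio.
model = ElasticNet(alpha=0.1, l1_ratio=0.5, max_iter=10000).fit(X, y)
selected = np.flatnonzero(model.coef_)   # variables with nonzero coefficients
print(len(selected), "variables selected out of", n_genes)
```

Variables whose coefficients are shrunk exactly to zero are discarded, which is what makes the Elastic Net usable as a selection method on wide datasets.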
Profit is a goal sought by all banks because it brings them income, guarantees their survival and continuity, and allows them to meet commitments without financial crisis. Hence the idea of this research: to build scientific tools and means that can help bank management in particular, as well as investors, lenders, and others, to predict financial failure and detect it early. The research produced a number of conclusions, the most important of which is that, under the Altman model, all Islamic banks in the sample are safe from financial failure, while according to the Springate model all sampled Islamic banks face financial failure except the Islamic Bank of Noor Iraq for Investment and Finance (BINI).
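Both scores mentioned are computed from standard balance-sheet ratios. The sketch below uses the commonly published coefficients of the original Altman (1968) and Springate models; the input ratios are made-up illustrative values, not data from the banks in the study.

```python
def altman_z(wc_ta, re_ta, ebit_ta, mve_tl, sales_ta):
    """Original Altman (1968) Z-score: Z > 2.99 is 'safe', Z < 1.81 is 'distress'.
    Inputs are working capital, retained earnings, EBIT, market value of
    equity, and sales, each scaled by total assets (or total liabilities)."""
    return 1.2*wc_ta + 1.4*re_ta + 3.3*ebit_ta + 0.6*mve_tl + 1.0*sales_ta

def springate_s(wc_ta, ebit_ta, ebt_cl, sales_ta):
    """Springate S-score: S < 0.862 signals potential financial failure."""
    return 1.03*wc_ta + 3.07*ebit_ta + 0.66*ebt_cl + 0.4*sales_ta

# Hypothetical ratios for an illustrative bank (assumed values).
z = altman_z(0.25, 0.30, 0.12, 1.10, 0.90)
s = springate_s(0.25, 0.12, 0.40, 0.90)
print(round(z, 3), round(s, 3))   # -> 2.676 1.25
```

A bank is then classified by comparing each score against its model's cut-off, which is how the safe/failure conclusions above are reached.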
This research studies a linear regression model in which the autocorrelated random errors follow a normal distribution. Linear regression analysis models the relationship between variables, and through this relationship the value of one variable can be predicted from the values of the others. Four estimation methods were compared (the least squares method, the unweighted average method, Theil's method, and the Laplace method) using the mean square error (MSE) criterion and simulation, with four sample sizes (15, 30, 60, 100). The results showed that the least squares method is best. The four methods were then applied to data on buckwheat production and cultivated area in the provinces of Iraq for the years 2010, 2011, and 2012.
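A comparison of this kind can be sketched as a small Monte Carlo experiment. The code below simulates AR(1)-autocorrelated errors and compares the slope MSE of least squares against Theil's slope estimator at the study's four sample sizes; the unweighted-average and Laplace methods are omitted here, and all parameter values (rho, slope, replication count) are assumptions for illustration.

```python
import numpy as np
from scipy.stats import theilslopes

rng = np.random.default_rng(1)

def simulate_mse(n, reps=200, rho=0.6, true_slope=2.0):
    """Slope MSE of least squares vs Theil's estimator when the
    regression errors follow an AR(1) process with coefficient rho."""
    x = np.arange(n, dtype=float)
    ols_err, theil_err = [], []
    for _ in range(reps):
        e = np.zeros(n)
        for t in range(1, n):               # autocorrelated error series
            e[t] = rho * e[t-1] + rng.standard_normal()
        y = true_slope * x + e
        b_ols = np.polyfit(x, y, 1)[0]      # least squares slope
        b_theil = theilslopes(y, x)[0]      # Theil's median-of-slopes
        ols_err.append((b_ols - true_slope) ** 2)
        theil_err.append((b_theil - true_slope) ** 2)
    return float(np.mean(ols_err)), float(np.mean(theil_err))

for n in (15, 30, 60, 100):                 # the sample sizes used in the study
    print(n, simulate_mse(n))
```

Averaging squared estimation errors over many replications is exactly the MSE criterion the study uses to rank the four methods.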
In the present work, the nuclear shell model with Hartree–Fock (HF) calculations has been used to investigate the nuclear structure of the 24Mg nucleus. In particular, elastic and inelastic electron scattering form factors and transition probabilities have been calculated for low-lying positive- and negative-parity states. The sd and sdpf shell model spaces have been used to calculate the one-body density matrix elements (OBDM) for positive- and negative-parity states, respectively. Skyrme–Hartree–Fock (SHF) with different parameterizations has been tested as a single-particle potential in the shell model calculation for reproducing the experimental data, along with a harmonic oscillator (HO) and a Woods-Saxon potential.
With the proliferation of both Internet access and data traffic, recent breaches have brought into sharp focus the need for Network Intrusion Detection Systems (NIDS) to protect networks from increasingly complex cyberattacks. To differentiate between normal network processes and possible attacks, Intrusion Detection Systems (IDS) often employ pattern recognition and data mining techniques. Network and host intrusions, attacks, and policy violations can be automatically detected and classified by an IDS. Using Python Scikit-Learn, the results of this study show that Machine Learning (ML) techniques such as Decision Tree (DT), Naïve Bayes (NB), and K-Nearest Neighbor (KNN) can enhance the effectiveness of an Intrusion Detection System.
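The study's own code and dataset are not reproduced here; a minimal scikit-learn sketch of the three named classifiers, trained on synthetic data standing in for network-flow features (label 1 = attack, 0 = normal), might look like:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

# Synthetic stand-in for labeled network traffic records.
X, y = make_classification(n_samples=1000, n_features=20,
                           n_informative=8, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=42)

models = {
    "DT":  DecisionTreeClassifier(random_state=42),
    "NB":  GaussianNB(),
    "KNN": KNeighborsClassifier(n_neighbors=5),
}
scores = {}
for name, clf in models.items():
    clf.fit(X_tr, y_tr)                     # train on known traffic
    scores[name] = accuracy_score(y_te, clf.predict(X_te))
print(scores)
```

In a real NIDS pipeline the same fit/predict loop would run on engineered flow features (packet counts, durations, flags) rather than on random synthetic columns.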
A substantial portion of today's multimedia data exists in the form of unstructured text, and that unstructured nature poses a significant challenge in meeting users' information requirements. Text classification (TC) has been extensively employed in text mining to facilitate multimedia data processing, but accurately categorizing texts becomes difficult as the presence of non-informative features within the corpus grows. Several reviews on TC, encompassing various feature selection (FS) approaches for eliminating non-informative features, have been published previously; however, these reviews do not adequately cover recently explored approaches to TC problem-solving utilizing FS, such as optimization techniques.
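To make the filtering idea concrete, the sketch below applies a classical chi-squared feature selection filter to a toy corpus with scikit-learn; this is a generic FS baseline, not one of the optimization techniques the review surveys, and the corpus and `k` value are invented for illustration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.feature_selection import SelectKBest, chi2

# Toy corpus; real TC corpora contain thousands of documents.
docs = ["the cache misses slow the server",
        "the goalkeeper saved the penalty",
        "server latency and cache tuning",
        "the striker scored a late penalty"]
labels = [0, 1, 0, 1]   # 0 = computing, 1 = sport

X = TfidfVectorizer().fit_transform(docs)   # every term becomes a feature
# The chi-squared filter keeps the k terms most associated with the
# labels, discarding non-informative features before classification.
selector = SelectKBest(chi2, k=4).fit(X, labels)
print(X.shape[1], "->", selector.transform(X).shape[1])
```

Shrinking the feature space this way is precisely what reduces the burden of non-informative terms on the downstream classifier.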
Quality of e-service is one of the critical factors that decide the success or failure of organizations. It can increase competitive advantage as well as strengthen relationships with customers. Achieving high e-service quality and user satisfaction is challenging, since both depend fundamentally on user perception and expectation, which can be difficult to gauge. To date, there is no agreement as to what service quality is or how it should be measured: whether it is a function of statistical measures of quality, including physical defects, of managerial judgment, or of customer perception of the services. This paper takes an in-depth look at the quality of the e-services offered.
The aim of this study was to propose and evaluate an eco-epidemiological model with an Allee effect and nonlinear harvesting in predators. It was assumed that there is an SI-type disease in the prey, and that only a portion of the prey is attacked by the predator, since the remainder flees to a safe area. It was also assumed that the predator consumes the prey according to a modified Holling type-II functional response. All possible equilibrium points were determined, and their local and global stability was investigated. The possibility of local bifurcation was also studied. Numerical simulation was used to further evaluate the global dynamics and the effects of varying parameters on the asymptotic behavior of the system.
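Numerical simulation of such a model amounts to integrating a system of ODEs. The sketch below is an illustrative generic prey–predator system with a Holling type-II response, an Allee factor, and a nonlinear harvesting term in the predator equation; every equation and parameter value here is an assumption for demonstration, not the paper's full SI-disease model.

```python
from scipy.integrate import solve_ivp

# Assumed parameters (illustrative only).
r, K = 1.0, 10.0      # prey growth rate and carrying capacity
a, h = 0.8, 0.5       # attack rate and handling time (Holling type II)
c, d = 0.5, 0.2       # conversion efficiency, predator death rate
A = 0.4               # Allee scale in the predator growth term
q = 0.05              # nonlinear harvesting coefficient

def rhs(t, u):
    x, y = u
    holling = a * x / (1 + a * h * x)            # type-II functional response
    dx = r * x * (1 - x / K) - holling * y       # logistic prey minus predation
    dy = (c * holling * y * (y / (y + A))        # growth damped by Allee factor
          - d * y
          - q * y**2 / (1 + y))                  # nonlinear harvesting
    return [dx, dy]

sol = solve_ivp(rhs, (0, 200), [5.0, 2.0], rtol=1e-6, atol=1e-9)
print("final prey, predator:", sol.y[0, -1], sol.y[1, -1])
```

Varying a parameter such as `q` or `A` over a grid and recording where the long-run state changes qualitatively is how the bifurcation and asymptotic-behavior analysis is explored numerically.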
Denoising a natural image corrupted by Gaussian noise is a classical problem in signal and image processing. Much work has been done in the field of wavelet thresholding, but most of it has focused on statistical modeling of wavelet coefficients and the optimal choice of thresholds. This paper describes a new method for suppressing noise in images by fusing the stationary wavelet denoising technique with an adaptive Wiener filter. The Wiener filter is applied to the approximation coefficients only, while the thresholding technique is applied to the detail coefficients of the transform; the final denoised image is then obtained by combining the two results.
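A 1-D analogue of this fusion scheme can be sketched with PyWavelets and SciPy; the paper works on images (where `swt2`/`iswt2` would replace `swt`/`iswt`), and the wavelet, threshold, and noise level below are assumed values, not the paper's settings.

```python
import numpy as np
import pywt
from scipy.signal import wiener

rng = np.random.default_rng(0)
n = 256                                   # SWT needs length divisible by 2**level
clean = np.sin(np.linspace(0, 8 * np.pi, n))
noisy = clean + 0.3 * rng.standard_normal(n)

# One-level stationary (undecimated) wavelet transform.
(cA, cD), = pywt.swt(noisy, "db4", level=1)

cA = wiener(cA, mysize=5)                 # adaptive Wiener filter on approximation
thr = 0.3 * np.sqrt(2 * np.log(n))        # universal-style threshold, sigma assumed known
cD = pywt.threshold(cD, thr, mode="soft") # soft-threshold the detail coefficients

# Combine the two filtered branches back into one signal.
denoised = pywt.iswt([(cA, cD)], "db4")
print("noisy MSE:", float(np.mean((noisy - clean) ** 2)),
      "denoised MSE:", float(np.mean((denoised - clean) ** 2)))
```

The split treatment reflects the method's premise: noise dominates the detail band (handled by thresholding), while the approximation band keeps most of the signal and benefits from gentler adaptive filtering.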