Most medical datasets suffer from missing data, due to the expense of some tests or human error while recording them. This issue degrades the performance of machine learning models because the values of some features will be missing. Therefore, specialized methods are needed for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used for generating and imputing the missing values in the Pima Indian diabetes disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, which are support vector machine (SVM), K-nearest neighbour (KNN), and Naïve Bayes …
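The core idea of imputation-as-optimization can be sketched as follows. This is a minimal illustration only: it uses plain random search as a stand-in for the salp swarm update rules, and a distance-to-column-mean cost as a stand-in for the classifier-based fitness the paper would use; the function name and parameters are hypothetical.

```python
# Sketch of missing-value imputation as a search problem. Random search
# stands in for the salp swarm algorithm (SSA); the real ISSA update rules
# and classifier-based fitness are not reproduced here.
import random

def impute_column(values, trials=200, seed=0):
    """Fill None entries in `values` by searching over candidate fill values."""
    rng = random.Random(seed)
    observed = [v for v in values if v is not None]
    lo, hi = min(observed), max(observed)
    mean = sum(observed) / len(observed)
    best_fill, best_cost = mean, float("inf")
    for _ in range(trials):
        candidate = rng.uniform(lo, hi)      # one "salp" position in the search space
        cost = abs(candidate - mean)         # proxy for classification-accuracy fitness
        if cost < best_cost:
            best_fill, best_cost = candidate, cost
    return [best_fill if v is None else v for v in values]
```

In the paper's setting, the cost function would instead train SVM, KNN, or Naïve Bayes on the imputed data and score the resulting accuracy.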
Malicious software (malware) performs a malicious function that compromises a computer system's security. Many methods have been developed to improve the security of computer system resources, among them firewalls, encryption, and Intrusion Detection Systems (IDS). An IDS can detect newly encountered, unrecognized attack attempts and raise an early alarm to inform the system about the suspicious intrusion attempt. This paper proposes a hybrid IDS for detecting intrusions, especially malware, considering both network-packet and host features. The hybrid IDS is designed using Data Mining (DM) classification methods for their ability to detect new, previously unseen intrusions accurately and automatically. It uses both anomaly and misuse detection …
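The hybrid of misuse and anomaly detection can be illustrated with a toy decision rule; the signature names, score, and threshold below are invented for illustration and are not from the paper.

```python
# Toy hybrid IDS rule: misuse detection matches known attack signatures,
# anomaly detection flags events whose deviation score from learned
# normal behaviour exceeds a threshold.
KNOWN_SIGNATURES = {"malware_beacon", "port_scan"}   # illustrative signatures

def classify_event(signature, anomaly_score, threshold=0.8):
    if signature in KNOWN_SIGNATURES:
        return "misuse"      # matches a known attack pattern exactly
    if anomaly_score > threshold:
        return "anomaly"     # unseen pattern deviating from normal profile
    return "normal"
```

The complementary strengths are visible here: misuse detection catches known attacks with no false alarms, while the anomaly branch covers previously unseen intrusions.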
Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data to the graph concept frame (CFG). As is well known, DBSCAN groups data points of the same kind into clusters, while points outside the behaviour of any cluster are considered noise or anomalies. The DBSCAN algorithm can thus detect abnormal points that lie farther than a set threshold from any cluster (extreme values). However, not all anomalies are cases that are unusual or far from a specific group; there is a type of data point that does not occur repeatedly but is still considered abnormal with respect to the known group. The analysis showed that DBSCAN using the …
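The baseline behaviour the abstract describes, points outside any dense cluster labelled as noise, can be shown with a minimal one-dimensional DBSCAN; this sketch omits the paper's CFG graph transformation entirely, and the `eps`/`min_pts` values are illustrative.

```python
# Minimal 1-D DBSCAN sketch: points reachable within `eps` from a core point
# (one having at least `min_pts` neighbours) join a cluster; the rest are
# labelled -1, i.e. noise / candidate anomalies.
def dbscan_1d(points, eps=1.0, min_pts=2):
    labels = {}        # index -> cluster id, or -1 for noise
    cluster = 0
    def neighbours(i):
        return [j for j, q in enumerate(points) if abs(points[i] - q) <= eps]
    for i in range(len(points)):
        if i in labels:
            continue
        nbrs = neighbours(i)
        if len(nbrs) < min_pts:
            labels[i] = -1                 # provisionally noise
            continue
        labels[i] = cluster
        seeds = list(nbrs)
        while seeds:
            j = seeds.pop()
            if labels.get(j) == -1:
                labels[j] = cluster        # noise reclassified as border point
                continue
            if j in labels:
                continue
            labels[j] = cluster
            jn = neighbours(j)
            if len(jn) >= min_pts:
                seeds.extend(jn)           # expand cluster from new core point
        cluster += 1
    return [labels[i] for i in range(len(points))]
```

Here the isolated point far from the dense run is the one DBSCAN reports as noise, which is exactly the "far from a specific group" kind of anomaly the abstract says is insufficient on its own.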
The advancements in Information and Communication Technology (ICT) within the previous decades have significantly changed the way people transmit or store their information over the Internet or networks. So, one of the main challenges is to keep this information safe against attacks. Many researchers and institutions have realized the importance and benefits of cryptography in achieving the efficiency and effectiveness of various aspects of secure communication. This work adopts a novel technique for a secure data cryptosystem based on chaos theory. The proposed algorithm generates a 2-dimensional key matrix having the same dimensions as the original image, containing random numbers obtained from the 1-dimensional logistic chaotic map for given con…
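The logistic-map key generation can be sketched with a toy byte-stream cipher. This is an assumption-laden illustration, not the paper's scheme: the quantization to bytes, the XOR combining step, and the parameter values `x0` and `r` are all chosen here for demonstration.

```python
# Toy keystream from the 1-D logistic map x_{n+1} = r * x_n * (1 - x_n);
# the control parameter r and seed x0 act as the secret key. The chaotic
# orbit is quantised to bytes and XORed with the data.
def logistic_keystream(x0, r, n):
    x, stream = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)               # one chaotic iteration
        stream.append(int(x * 256) % 256)   # quantise orbit value to a byte
    return stream

def xor_cipher(data, x0=0.4, r=3.99):
    key = logistic_keystream(x0, r, len(data))
    return bytes(b ^ k for b, k in zip(data, key))
```

Because XOR is its own inverse, applying `xor_cipher` twice with the same key parameters recovers the original data; sensitivity to `x0` and `r` is what makes the chaotic map attractive as a key generator.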
In data transmission, a change in a single bit of the received data may lead to misunderstanding or a disaster. Each bit in the sent information has high priority, especially information such as the address of the receiver. Detecting every single-bit change is therefore a key issue in the data-transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently, but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, showed better results but still failed to cope with an increasing number of errors.
Two novel methods were suggested to detect binary bit-change errors when transmitting data over a noisy medium. Those methods were: the 2D-Checksum method …
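The two-dimensional idea the abstract builds on can be sketched as follows: arrange the data bits in a grid and keep one parity bit per row and per column, so a single flipped bit shows up at the intersection of the failing row and column checks. This is a sketch of plain 2D parity, not the paper's proposed 2D-Checksum variant.

```python
# 2D parity sketch: per-row and per-column parity bits over a bit grid.
# A single-bit error fails exactly one row check and one column check,
# locating (and hence correcting) the flipped bit.
def parity_2d(bits, width):
    rows = [bits[i:i + width] for i in range(0, len(bits), width)]
    row_par = [sum(r) % 2 for r in rows]
    col_par = [sum(r[c] for r in rows) % 2 for c in range(width)]
    return row_par, col_par

def locate_single_error(received, width, row_par, col_par):
    rp, cp = parity_2d(received, width)
    bad_rows = [i for i, (a, b) in enumerate(zip(rp, row_par)) if a != b]
    bad_cols = [i for i, (a, b) in enumerate(zip(cp, col_par)) if a != b]
    if len(bad_rows) == 1 and len(bad_cols) == 1:
        return bad_rows[0], bad_cols[0]   # coordinates of the flipped bit
    return None                           # no error, or not a single-bit error
```

The known weakness mentioned above is also visible in this scheme: four errors forming a rectangle in the grid leave every row and column parity unchanged and go undetected.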
Improvements in manufacturing technology have provided machines of higher capacity with better resistance and controlled behaviour. This machinery leads to remarkably higher dynamic forces and therefore higher stresses. In this paper, a dynamic investigation of a rectangular machine foundation for diesel and gas engines at the top surface of one-layer dry sand with various states (i.e., loose, medium and dense) was carried out. The dynamic investigation is performed numerically using the finite element software PLAXIS 3D. The soil is assumed to be an elastic, perfectly plastic material obeying the Mohr-Coulomb yield criterion. A harmonic load is applied at the foundation with an amplitude of 10 kPa at frequencies of 10, 15 and 20 Hz and se…
Two samples of (Ag NPs-zeolite) nanocomposite thin films were prepared by an easy hydrothermal method for 4 hours and 8 hours inside a hydrothermal autoclave at a temperature of 100 °C. The two samples were used as photocatalysts in a photoelectrochemical cell consisting of three electrodes: the working photoanode electrode (AgNPs-zeolite), platinum as the cathode electrode, and Ag/AgCl as the reference electrode, to study the performance of AgNPs-zeolite under dark current and 473 nm laser light for water splitting. The results show the higher performance of the eight-hour sample, with high crystallinity compared with the four-hour sample, as a reliable photocatalyst to generate hydrogen for renewable energy.
Light naphtha, one of the products of the distillation column in oil refineries, is used as feedstock for gasoline production. The major constituents of light naphtha are normal paraffins, isoparaffins, naphthenes, and aromatics. In this paper, we used zeolite 5A with a uniform pore size (5 Å) to separate normal paraffin from light naphtha, due to its suitable pore size for this process, and compared its adsorption behaviour with that of activated carbon, which has a wide range of pore sizes (micropores and mesopores) and a high surface area. The process is carried out in a continuous system, a fixed-bed reactor, in the vapor phase under constant conditions of 5 ml/min flow rate, 180 °C temperature, 1.6 bar pressure, and a 100-gram weight o…
This paper aims to determine the best parameter estimation methods for the parameters of the Gumbel type-I distribution under the type-II censorship scheme. For this purpose, classical and Bayesian parameter estimation procedures are considered. Maximum likelihood estimators are used for the classical procedure, and their asymptotic distributions are also derived. It is not possible to obtain explicit solutions for the Bayesian estimators; therefore, Markov Chain Monte Carlo and Lindley techniques are used to estimate the unknown parameters. In Bayesian analysis, it is very important to determine an appropriate combination of a prior distribution and a loss function. Therefore, two different …
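The classical branch of the comparison can be illustrated for a complete (uncensored) sample: the Gumbel type-I scale estimate solves the fixed-point equation β = x̄ − Σxᵢe^(−xᵢ/β)/Σe^(−xᵢ/β), after which μ follows in closed form. This sketch assumes no censoring; the type-II censored likelihood the paper actually maximizes has different estimating equations.

```python
# Gumbel type-I (maximum) MLE on a complete sample via the standard
# fixed-point iteration for the scale beta, then the closed form for
# the location mu. Type-II censoring is NOT handled here.
import math

def gumbel_mle(xs, iters=200):
    mean = sum(xs) / len(xs)
    beta = max(1e-6, (max(xs) - min(xs)) / 4)      # rough starting value
    for _ in range(iters):
        w = [math.exp(-x / beta) for x in xs]      # weights favouring small x
        beta = mean - sum(x * wi for x, wi in zip(xs, w)) / sum(w)
    mu = -beta * math.log(sum(math.exp(-x / beta) for x in xs) / len(xs))
    return mu, beta
```

On data simulated from a standard Gumbel distribution (μ = 0, β = 1), the estimates should land near the true values, which is the kind of sanity check the paper's simulation study performs at scale.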