Most medical datasets suffer from missing data, due to the expense of some tests or human error while recording them. This issue degrades the performance of machine learning models because the values of some features are missing. Therefore, there is a need for dedicated methods for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used for generating and imputing the missing values in the Pima Indian Diabetes Disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, support vector machine (SVM), K-nearest neighbour (KNN), and naïve Bayesian classifier (NBC), was enhanced compared to the dataset before applying the proposed method. Moreover, the results indicated that ISSA performed better than statistical imputation techniques such as deleting the samples with missing values or replacing the missing values with zeros, the mean, or random values.
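The statistical baselines the abstract compares ISSA against (zero, mean, and random-value imputation) can be sketched as follows. This is a minimal illustration of those baselines only, not of the ISSA algorithm itself; the helper name `impute` is hypothetical.

```python
import numpy as np

def impute(X, strategy="mean", rng=None):
    """Fill NaN entries column-wise with zeros, the column mean, or random
    draws from the observed values -- the baseline strategies the study
    compares its swarm-based imputation against."""
    X = X.astype(float).copy()
    rng = rng or np.random.default_rng(0)
    for j in range(X.shape[1]):
        mask = np.isnan(X[:, j])
        if not mask.any():
            continue
        observed = X[~mask, j]
        if strategy == "zero":
            X[mask, j] = 0.0
        elif strategy == "mean":
            X[mask, j] = observed.mean()
        elif strategy == "random":
            X[mask, j] = rng.choice(observed, size=mask.sum())
    return X
```

Deleting samples with missing values, the fourth baseline mentioned, amounts to `X[~np.isnan(X).any(axis=1)]`.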
Survival analysis is widely applied to data describing the lifetime of an item until the occurrence of an event of interest, such as death or another event under study. The purpose of this paper is to use a dynamic approach in the deep learning neural network method, building a dynamic neural network that suits the nature of discrete survival data and time-varying effects. This neural network is trained with the Levenberg-Marquardt (L-M) algorithm, and the method is called the Proposed Dynamic Artificial Neural Network (PDANN). A comparison was then made with another method that depends entirely on Bayesian methodology, called the Maximum A Posteriori (MAP) method. This method was carried out using numerical algorithms re
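Discrete survival data of the kind described are conventionally expanded into person-period records (one row per subject per interval at risk) before a discrete-time hazard model such as a neural network is fitted. A minimal sketch of that expansion, assuming integer event times; the function name `person_period` is illustrative, not from the paper.

```python
import numpy as np

def person_period(times, events):
    """Expand (time, event-indicator) survival records into discrete
    person-period rows (subject id, interval, event-in-interval indicator),
    the standard input format for a discrete-time hazard model."""
    rows = []
    for i, (t, d) in enumerate(zip(times, events)):
        for k in range(1, t + 1):
            # the event indicator is 1 only in the terminal interval
            # of a subject who experienced the event (d == 1)
            rows.append((i, k, int(bool(d) and k == t)))
    return np.array(rows)
```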
Shear and compressional wave velocities, coupled with other petrophysical data, are vital for determining the magnitudes of the dynamic moduli in geomechanical studies and hydrocarbon reservoir characterization. However, due to field practices and high running costs, shear wave velocity may not be available in all wells. In this paper, a statistical multivariate regression method is presented to predict the shear wave velocity for the Khasib Formation in the Amara oil field, located in south-east Iraq, using well-log compressional wave velocity, neutron porosity, and density. The accuracy of the proposed correlation has been compared to other correlations. The results show that the presented model provides accurate
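A multivariate correlation of this kind can be fitted by ordinary least squares. The sketch below assumes a simple linear form Vs = a0 + a1·Vp + a2·φn + a3·ρ; the actual published correlation and its coefficients may differ.

```python
import numpy as np

def fit_vs_model(vp, phi_n, rho, vs):
    """Least-squares fit of Vs = a0 + a1*Vp + a2*phi_n + a3*rho
    from well-log measurements (illustrative linear form)."""
    A = np.column_stack([np.ones_like(vp), vp, phi_n, rho])
    coef, *_ = np.linalg.lstsq(A, vs, rcond=None)
    return coef

def predict_vs(coef, vp, phi_n, rho):
    """Apply the fitted correlation to new log readings."""
    return coef[0] + coef[1] * vp + coef[2] * phi_n + coef[3] * rho
```

Model accuracy against measured Vs in validation wells would then be reported with, e.g., the coefficient of determination or RMS error, as is usual for such correlations.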
The current research discusses the topic of formal data within the methodological framework by defining the research problem, limits, and objectives, and by defining the most important terms mentioned in this research. The theoretical framework in the first section addressed the concept of the Bauhaus school, the philosophy of the Bauhaus school, and the logical bases of this school. The second section dealt with the most important elements and structural bases of the Bauhaus school, which are considered the most important formal data of this school, and their implications for fabric and costume design. The research came up with the most important indicators resulting from the theoretical framework.
Chapter three defined the
Traumatic spinal cord injury is a serious neurological disorder. Patients experience a plethora of symptoms that can be attributed to the compromised nerve fiber tracts, including limb weakness, sensory impairment, and truncal instability, as well as a variety of autonomic abnormalities. This article discusses how machine learning classification can be used to characterize the initial impairment and subsequent recovery of electromyography signals in a non-human primate model of traumatic spinal cord injury. The ultimate objective is to identify potential treatments for traumatic spinal cord injury. This work focuses specifically on finding a suitable classifier that differentiates between two distinct experimental
Discriminant analysis is a technique used to distinguish and classify an individual into one of a number of groups based on a linear combination of a set of relevant variables, known as the discriminant function. In this research, discriminant analysis is used to analyze data from a repeated measurements design. We deal with the problem of discrimination and classification in the case of two groups, assuming a compound symmetry covariance structure under the assumption of normality for univariate repeated measures data.
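For the two-group case, the linear discriminant function mentioned above takes the classical Fisher form w = S⁻¹(m₁ − m₂) with S the pooled covariance. A minimal sketch with an unstructured pooled covariance (the paper's compound-symmetry structure would constrain S further):

```python
import numpy as np

def lda_two_group(X1, X2):
    """Fisher linear discriminant for two groups:
    w = S_pooled^{-1} (m1 - m2); classify x to group 1 when
    w . (x - midpoint) > 0, where midpoint = (m1 + m2) / 2."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    n1, n2 = len(X1), len(X2)
    S1 = np.cov(X1, rowvar=False)
    S2 = np.cov(X2, rowvar=False)
    Sp = ((n1 - 1) * S1 + (n2 - 1) * S2) / (n1 + n2 - 2)  # pooled covariance
    w = np.linalg.solve(Sp, m1 - m2)
    mid = (m1 + m2) / 2
    return lambda x: 1 if w @ (x - mid) > 0 else 2
```

Under compound symmetry, S would be restricted to equal variances on the diagonal and a single common covariance off-diagonal, which reduces the number of parameters to estimate from repeated measures.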
To maintain the security and integrity of data, with the growth of the Internet and the increasing prevalence of transmission channels, it is necessary to strengthen security and develop new algorithms. The Playfair cipher is a substitution scheme. The traditional Playfair scheme uses a small 5*5 matrix containing only uppercase letters, making it vulnerable to hackers and cryptanalysis. In this study, a new encryption and decryption approach is proposed to enhance the resistance of the Playfair cipher. For this purpose, a symmetric cryptosystem based on shared secrets is developed. The proposed Playfair method uses a 5*5 keyword matrix for English and a 6*6 keyword matrix for Arabic to encrypt the alphabets of
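For reference, the traditional 5*5 English Playfair that the proposal extends works as sketched below; the 6*6 Arabic variant is not shown, and the function names are illustrative.

```python
def playfair_table(key):
    """Build the classical 5x5 Playfair square (I/J merged) from a keyword:
    keyword letters first, then the remaining alphabet, no repeats."""
    seen, cells = set(), []
    for ch in key.upper().replace("J", "I") + "ABCDEFGHIKLMNOPQRSTUVWXYZ":
        if ch.isalpha() and ch not in seen:
            seen.add(ch)
            cells.append(ch)
    return [cells[i * 5:(i + 1) * 5] for i in range(5)]

def encrypt_pair(table, a, b):
    """Encrypt one digraph by the three classical Playfair rules:
    same row -> take letters to the right; same column -> letters below;
    otherwise -> swap column indices (rectangle rule)."""
    pos = {ch: (r, c) for r, row in enumerate(table) for c, ch in enumerate(row)}
    (ra, ca), (rb, cb) = pos[a], pos[b]
    if ra == rb:
        return table[ra][(ca + 1) % 5], table[rb][(cb + 1) % 5]
    if ca == cb:
        return table[(ra + 1) % 5][ca], table[(rb + 1) % 5][cb]
    return table[ra][cb], table[rb][ca]
```

The small fixed square and letter-frequency leakage through digraphs are exactly the weaknesses (digraph frequency analysis) that motivate strengthening the scheme.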
In this study, a fast block matching search algorithm based on block descriptors and multilevel block filtering is introduced. The descriptors used are the mean and a set of centralized low-order moments. Hierarchical filtering and the MAE similarity measure were adopted to nominate the best similar blocks lying within the pool of neighbor blocks. As a next step after block nomination, the similarity of the mean and moments is used to classify the nominated blocks into one of three sub-pools, each representing a certain nomination priority level (i.e., most, less, and least). The main reason for introducing the nomination and classification steps is a significant reduction in the number of matching instances of the pixels belong to the c
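The descriptors and the final similarity measure named above can be sketched as follows; the moment orders and function names are illustrative, not the paper's exact configuration.

```python
import numpy as np

def block_descriptor(block, orders=(2, 3)):
    """Mean plus centralized low-order moments of a pixel block; such
    scalar descriptors let candidate blocks be filtered cheaply before
    any full pixel-by-pixel comparison."""
    b = block.astype(float).ravel()
    m = b.mean()
    return np.array([m] + [np.mean((b - m) ** p) for p in orders])

def mae(a, b):
    """Mean absolute error between two blocks (the final similarity
    measure applied only to nominated candidates)."""
    return np.mean(np.abs(a.astype(float) - b.astype(float)))
```

Because the central moments are invariant to a uniform brightness shift while the mean is not, comparing both separates blocks that differ in texture from blocks that differ only in intensity, which supports the priority-level classification.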
In this paper, we present a proposed enhancement of image compression using the RLE algorithm. The proposal decreases the size of the compressed image, whereas the original method is used primarily for compressing binary images [1] and mostly increases the size of the original image when used for color images. The test of the enhanced algorithm was performed on a sample of ten BMP 24-bit true color images, building an application in Visual Basic 6.0 to show the size before and after the compression process and to compute the compression ratio for RLE and for the enhanced RLE algorithm.
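The baseline behaviour described above is easy to see from classical RLE itself, sketched here for illustration (the paper's enhanced variant is not reproduced):

```python
def rle_encode(data):
    """Classical run-length encoding: collapse each run of equal values
    into a (value, run-length) pair."""
    if not data:
        return []
    out, prev, count = [], data[0], 1
    for x in data[1:]:
        if x == prev:
            count += 1
        else:
            out.append((prev, count))
            prev, count = x, 1
    out.append((prev, count))
    return out

def rle_decode(pairs):
    """Inverse of rle_encode: expand each (value, run-length) pair."""
    return [v for v, n in pairs for _ in range(n)]
```

On binary images, long runs of identical pixels make the pair list short; on 24-bit color images, runs of length 1 dominate, so each pixel expands into a (value, 1) pair and the "compressed" stream grows, which is the expansion problem the enhancement targets.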