Most medical datasets suffer from missing data, caused by the expense of some tests or by human error while recording them. This issue degrades the performance of machine learning models because the values of some features are missing. Therefore, specialized methods are needed for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used to generate and impute the missing values in the Pima Indian diabetes disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, namely support vector machine (SVM), K-nearest neighbour (KNN), and naïve Bayesian classifier (NBC), was enhanced compared to the dataset before applying the proposed method. Moreover, the results indicated that ISSA performed better than statistical imputation techniques such as deleting the samples with missing values or replacing the missing values with zeros, the mean, or random values.
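The salp-swarm-based imputation (ISSA) itself is not reproduced here, but the statistical baselines the paper compares against are easy to sketch with scikit-learn. The file path pima.csv, the column name Outcome, and the blanket zeros-as-missing rule are assumptions for illustration; in PIDD only physiologically impossible zeros (e.g., glucose or BMI) actually mark missing entries.

```python
# Baseline imputation strategies the paper compares ISSA against;
# ISSA (salp-swarm-based imputation) is not shown. Paths and column
# names are assumptions, not taken from the paper.
import numpy as np
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

df = pd.read_csv("pima.csv")                      # hypothetical path to PIDD
X = df.drop(columns="Outcome").to_numpy(dtype=float)
y = df["Outcome"].to_numpy()
# Crude rule for this sketch: treat every zero feature value as missing.
# In PIDD only impossible zeros (glucose, BMI, ...) really are.
X[X == 0] = np.nan

clf = KNeighborsClassifier(n_neighbors=5)
for name, imputer in [("zero", SimpleImputer(strategy="constant", fill_value=0)),
                      ("mean", SimpleImputer(strategy="mean"))]:
    acc = cross_val_score(clf, imputer.fit_transform(X), y, cv=5).mean()
    print(f"{name}-imputed accuracy: {acc:.3f}")

keep = ~np.isnan(X).any(axis=1)                   # baseline: delete incomplete samples
acc = cross_val_score(clf, X[keep], y[keep], cv=5).mean()
print(f"sample-deletion accuracy: {acc:.3f}")
```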
Electrical Discharge Machining (EDM) is a widespread Nontraditional Machining (NTM) process for manufacturing parts with complicated geometry, or parts made of very hard metals, that are difficult to machine by traditional operations. EDM is a material removal (MR) process characterized by electrical discharge erosion. This paper discusses the optimal parameters of EDM on high-speed steel (HSS) AISI M2 as a workpiece, using copper and brass as electrodes. The input parameters used for the experimental work are current (10, 24, and 42 A), pulse-on time (100, 150, and 200 µs), and pulse-off time (4, 12, and 25 µs), which affect the material removal rate (MRR), electrode wear rate (EWR), and wear ratio (WR). …
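For context, the three responses mentioned are conventionally computed from weight loss in the EDM literature; a minimal sketch follows, using illustrative masses and times, approximate handbook densities, and the wear ratio taken here as EWR/MRR. None of the numbers are the paper's measurements.

```python
# Standard EDM response definitions via the weight-loss method; all
# values below are illustrative, not the paper's experimental data.
def removal_rate(w_before_g, w_after_g, density_g_mm3, time_min):
    """Volumetric rate in mm^3/min computed from weight loss."""
    return (w_before_g - w_after_g) / (density_g_mm3 * time_min)

rho_hss, rho_cu = 8.16e-3, 8.96e-3        # g/mm^3, approx. AISI M2 and copper
mrr = removal_rate(100.000, 99.750, rho_hss, 10)   # workpiece loss over 10 min
ewr = removal_rate(50.000, 49.990, rho_cu, 10)     # electrode loss over 10 min
print(f"MRR = {mrr:.3f} mm^3/min, EWR = {ewr:.4f} mm^3/min, "
      f"WR = {ewr / mrr:.4f}")
```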
This research deals with a shrinkage method for principal components, similar to the Least Absolute Shrinkage and Selection Operator (LASSO) used in multiple regression. The goal is to build uncorrelated linear combinations from only a subset of the explanatory variables, which may suffer from a multicollinearity problem, instead of taking the whole number, say (K), of them. The shrinkage forces some coefficients to equal zero by placing a restriction on them through a tuning parameter, say (t), which balances the amounts of bias and variance on one side while keeping the percent of explained variance of these components within an acceptable level. This is demonstrated by the MSE criterion in the regression case and by the percent of explained variance.
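The paper's exact estimator is not reproduced here; as a rough off-the-shelf stand-in, sparse PCA drives some component loadings exactly to zero through an L1 penalty, so each component is built from only a subset of the variables. In this sketch the penalty weight alpha plays the role of the tuning parameter (t), and the data are synthetic.

```python
# Sparse PCA as a stand-in for the paper's LASSO-like shrinkage of
# principal components: L1-penalized loadings become exactly zero.
import numpy as np
from sklearn.decomposition import PCA, SparsePCA

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))
X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=200)   # induce multicollinearity

spca = SparsePCA(n_components=3, alpha=1.0, random_state=0).fit(X)
print("zero loadings per component:",
      (spca.components_ == 0).sum(axis=1))
# Explained-variance check against ordinary PCA on the same data:
print("PCA explained variance ratio:",
      PCA(n_components=3).fit(X).explained_variance_ratio_.round(3))
```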
Nano gamma alumina was prepared by a double hydrolysis process using aluminum nitrate nonahydrate and sodium aluminate as aluminum sources, with a polyhydroxy acid and CTAB (cetyltrimethylammonium bromide) as templates. Different crystallization temperatures (120, 140, 160, and 180 °C) and calcination temperatures (500, 550, 600, and 650 °C) were applied. All the batches were prepared at a pH equal to 9. X-ray diffraction (XRD) and Fourier transform infrared (FTIR) spectroscopy were used to investigate the phase formation and the optical properties of the nano gamma alumina. N2 adsorption-desorption (BET) was used to measure the surface area and pore volume of the prepared nano alumina, the particle size, and the …
In modern technology, ownership of electronic data is the key to securing one's privacy and identity against any tracing or interference. Therefore, a new identity management approach called Digital Identity Management, implemented throughout recent years, acts as a holder of identity data to maintain the holder's privacy and prevent identity theft. An overwhelming number of users face two major problems: users who own data that third-party applications will handle, and users who have no ownership of their data at all. Maintaining these identities is a challenge these days. This paper proposes a system that solves the problem using blockchain technology for Digital Identity Management systems. Blockchain is a powerful technique …
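The proposed system itself is not detailed in the excerpt; as a minimal sketch of the hash-chaining idea that gives blockchain its tamper evidence, the following toy example links identity claims by SHA-256 digests. All field names (holder, attr, the did:example identifiers) are illustrative, not the paper's design.

```python
# Toy hash-chained identity ledger: each block commits to the previous
# block's digest, so editing an earlier claim invalidates the chain.
import hashlib, json, time

def make_block(identity_claim: dict, prev_hash: str) -> dict:
    block = {"claim": identity_claim, "prev": prev_hash, "ts": time.time()}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

genesis = make_block({"holder": "did:example:alice"}, prev_hash="0" * 64)
update  = make_block({"holder": "did:example:alice",
                      "attr": "email-verified"}, genesis["hash"])

# Tampering with the first block breaks the chain:
genesis["claim"]["holder"] = "did:example:mallory"
recomputed = hashlib.sha256(json.dumps(
    {k: genesis[k] for k in ("claim", "prev", "ts")},
    sort_keys=True).encode()).hexdigest()
print("chain valid:", recomputed == genesis["hash"]
      and update["prev"] == genesis["hash"])   # -> False after tampering
```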
A major goal of next-generation wireless communication systems is the development of a reliable high-speed wireless communication system that supports high user mobility. Such systems must focus on increasing the link throughput and the network capacity. In this paper, a novel, spectrally efficient system is proposed for generating and transmitting two-dimensional (2-D) orthogonal frequency division multiplexing (OFDM) symbols through a 2-D inter-symbol interference (ISI) channel. Instead of conventional data mapping techniques, the discrete finite Radon transform (FRAT) is used as a data mapping technique due to the increased orthogonality it offers. As a result, the proposed structure gives a significant improvement in bit error rate (BER) performance. …
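FRAT mapping is not reproduced here; as a baseline, the conventional OFDM chain that the paper's mapper would slot into can be sketched in a few lines of NumPy. The sizes (64 subcarriers, length-16 cyclic prefix) and QPSK mapping are illustrative assumptions.

```python
# Conventional OFDM symbol generation (the baseline whose data mapper
# the paper replaces with a finite Radon transform). Sizes illustrative.
import numpy as np

N, CP = 64, 16                                    # subcarriers, cyclic prefix
rng = np.random.default_rng(1)
bits = rng.integers(0, 2, size=2 * N)
# QPSK mapping: 2 bits -> one complex subcarrier symbol
symbols = ((1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])) / np.sqrt(2)

time_signal = np.fft.ifft(symbols, N)             # multicarrier modulation
tx = np.concatenate([time_signal[-CP:], time_signal])  # prepend cyclic prefix

# Receiver over an ideal channel: strip CP, FFT back to subcarrier symbols
rx = np.fft.fft(tx[CP:], N)
print("max mapping error:", np.abs(rx - symbols).max())
```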
Orthogonal Frequency Division Multiplexing (OFDM) is an efficient multi-carrier technique. The core operation in OFDM systems is the FFT/IFFT unit, which requires a large amount of hardware resources and processing delay. Developments in implementation technologies such as Field Programmable Gate Arrays (FPGAs) have made OFDM a feasible option. The goal of this paper is to design and implement an OFDM transmitter on an Altera FPGA using the Quartus software. The proposed transmitter simplifies the Fourier transform calculation by using a decoder instead of multipliers. After programming the Altera DE2 FPGA kit with the implemented project, several practical tests were performed, starting with monitoring all the results of …
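The decoder-instead-of-multipliers idea can be illustrated in software: when subcarrier symbols come from a small constellation, every symbol-times-twiddle product can be precomputed once and merely selected at run time, so the transform needs only lookups and additions. This sketch assumes QPSK and an 8-point IDFT; it mimics the hardware concept, not the paper's actual Quartus design.

```python
# Multiplier-free IDFT sketch: products are precomputed in a lookup
# table (the "decoder"), so run time uses only selection and addition.
import numpy as np

N = 8
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
twiddle = np.exp(2j * np.pi * np.outer(np.arange(N), np.arange(N)) / N)

# lut[s, k, n] = qpsk[s] * twiddle[k, n], filled once "at synthesis time"
lut = qpsk[:, None, None] * twiddle[None, :, :]

sym_idx = np.random.default_rng(2).integers(0, 4, size=N)
# Each output sample is a sum of table lookups; no runtime multiplies.
x = np.array([lut[sym_idx, np.arange(N), n].sum() / N for n in range(N)])

ref = np.fft.ifft(qpsk[sym_idx], N)
print("max deviation from numpy ifft:", np.abs(x - ref).max())
```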
The researchers of the present study conducted a genre analysis of two political debates between American presidential nominees in the 2016 and 2020 elections. The study seeks to analyze the cognitive construction of political debates in order to evaluate the typical moves and strategies politicians use to express their communicative intentions and to reveal the language manifestations of those moves and strategies. To achieve the study's aims, the researchers adopt Bhatia's (1993) framework of cognitive construction supported by van Eemeren's (2010) pragma-dialectic framework. The study demonstrates that both nominees adhere to this genre structuring to further their political agendas. For a positive and promising image …
For several applications, it is very important to have an edge detection technique that matches human visual contour perception and is less sensitive to noise. The edge detection algorithm described in this paper is based on the results obtained by the maximum a posteriori (MAP) and maximum entropy (ME) deblurring algorithms. The technique makes a trade-off between sharpening and smoothing the noisy image. One advantage of the described algorithm is that it is less sensitive to noise than the Marr and Geuen techniques, which are considered among the best edge detection algorithms in terms of matching human visual contour perception.
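The MAP/ME deblurring stage is not reproduced here; as a minimal sketch of the sharpening-versus-smoothing trade-off the abstract describes, a Marr-Hildreth-style detector applies Gaussian smoothing (noise suppression) followed by a Laplacian (sharpening), with sigma controlling the balance. The test image and noise level are illustrative.

```python
# Smoothing-vs-sharpening trade-off: Laplacian-of-Gaussian edges.
# sigma trades noise suppression (smoothing) against edge sharpness.
import numpy as np
from scipy import ndimage

def edges(image: np.ndarray, sigma: float = 2.0) -> np.ndarray:
    """Zero-crossings of the Laplacian of the Gaussian-smoothed image."""
    log = ndimage.gaussian_laplace(image.astype(float), sigma=sigma)
    zc = (np.sign(log[:-1, :-1]) != np.sign(log[1:, :-1])) | \
         (np.sign(log[:-1, :-1]) != np.sign(log[:-1, 1:]))
    return zc

img = np.zeros((64, 64))
img[16:48, 16:48] = 1.0                                     # bright square
img += np.random.default_rng(3).normal(0, 0.1, img.shape)  # additive noise
print("edge pixels found:", int(edges(img).sum()))
```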