Articles
A missing data imputation method based on salp swarm algorithm for diabetes disease

Most medical datasets suffer from missing data, owing to the expense of some tests or to human error in recording them. This issue degrades the performance of machine learning models because the values of some features are missing, so methods specifically designed to impute these missing data are needed. In this research, the salp swarm algorithm (SSA) is used to generate and impute the missing values in the Pima Indian diabetes disease (PIDD) dataset; the proposed algorithm is called ISSA. The results showed that the classification performance of three different classifiers, support vector machine (SVM), K-nearest neighbour (KNN), and naïve Bayesian classifier (NBC), was enhanced compared with the dataset before applying the proposed method. Moreover, the results indicated that ISSA performed better than statistical imputation techniques such as deleting the samples with missing values or replacing the missing values with zeros, the mean, or random values.
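The paper's exact ISSA procedure is not reproduced above; the sketch below only illustrates the general idea under one plausible reading, that the missing entries are treated as decision variables which a standard salp swarm optimizes against cross-validated classifier accuracy. All function names, bounds, and parameter values here are our assumptions, not the authors'.

```python
# Minimal sketch: impute missing entries by letting a salp swarm search
# for the values that maximize cross-validated KNN accuracy. Parameter
# choices are illustrative, not the paper's exact ISSA.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def ssa_impute(X, y, n_salps=20, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    miss = np.isnan(X)                         # entries to impute
    idx = np.argwhere(miss)
    lb = np.nanmin(X, axis=0)[idx[:, 1]]       # per-entry feature bounds
    ub = np.nanmax(X, axis=0)[idx[:, 1]]
    dim = len(idx)

    def fitness(v):                            # accuracy with candidate values
        Xc = X.copy()
        Xc[miss] = v
        knn = KNeighborsClassifier(n_neighbors=5)
        return cross_val_score(knn, Xc, y, cv=3).mean()

    salps = rng.uniform(lb, ub, size=(n_salps, dim))
    scores = np.array([fitness(s) for s in salps])
    food, best = salps[scores.argmax()].copy(), scores.max()

    for l in range(1, n_iter + 1):
        c1 = 2 * np.exp(-(4 * l / n_iter) ** 2)        # standard SSA schedule
        for i in range(n_salps):
            if i == 0:                         # leader explores around the food
                c2, c3 = rng.uniform(size=dim), rng.uniform(size=dim)
                step = c1 * ((ub - lb) * c2 + lb)
                salps[0] = np.where(c3 < 0.5, food + step, food - step)
            else:                              # followers chain behind the leader
                salps[i] = (salps[i] + salps[i - 1]) / 2
            salps[i] = np.clip(salps[i], lb, ub)
        scores = np.array([fitness(s) for s in salps])
        if scores.max() > best:
            best, food = scores.max(), salps[scores.argmax()].copy()

    X_imp = X.copy()
    X_imp[miss] = food                         # best imputation found
    return X_imp, best
```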

Publication Date: Mon Oct 24 2022
Journal Name: Energies
Double-Slope Solar Still Productivity Based on the Number of Rubber Scraper Motions

In low-latitude areas (latitude angles below 10°), the solar radiation entering the solar still increases as the cover slope approaches the latitude angle. However, the amount of water that condenses and then falls back toward the solar-still basin also increases in this case. Consequently, the solar-still yield is significantly decreased and the accuracy of the prediction method is affected. This reduction in yield and in prediction accuracy is inversely proportional to the time the condensed water stays on the inner side of the condensing cover without being collected, because more drops fall back into the basin of the solar still. Different numbers of scraper motions per hour (NSM), that is, …

Publication Date: Sun Sep 04 2016
Journal Name: Baghdad Science Journal
The Importance and Interaction Indices of Bi-Capacities Based on Ternary-Element Sets

Grabisch and Labreuche have proposed a generalization of capacities, called bi-capacities. Recently, the author proposed a new approach to studying bi-capacities through the notion of ternary-element sets. In this paper, we present several results based on this approach, such as the bipolar Möbius transform, the importance index, and the interaction index of bi-capacities.
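For reference, a bi-capacity and the bipolar Möbius transform mentioned above are usually written as follows; these are the standard definitions from the bi-capacity literature, and the paper's ternary-element-set notation may differ.

```latex
% A bi-capacity is a function v on the disjoint pairs
%   Q(N) = { (A,B) : A, B \subseteq N, A \cap B = \emptyset },
% with v(\emptyset,\emptyset) = 0, v(N,\emptyset) = 1, v(\emptyset,N) = -1,
% nondecreasing in A and nonincreasing in B.
% Its bipolar Mobius transform is
\[
  b(A,B) \;=\; \sum_{C \subseteq A} \sum_{D \subseteq B}
    (-1)^{\lvert A \setminus C \rvert + \lvert B \setminus D \rvert}\, v(C,D),
  \qquad (A,B) \in Q(N).
\]
```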

Publication Date: Sun Apr 30 2023
Journal Name: Ingénierie Des Systèmes D Information
Performance Evaluation of Multi-Organization E-Government Based on Hyperledger Fabric Blockchain Platform

Publication Date: Fri Mar 01 2019
Journal Name: Al-khwarizmi Engineering Journal
Improve Akaike’s Information Criterion Estimation Based on Denoising of Quadrature Mirror Filter Bank

Akaike’s Information Criterion (AIC) is a popular method for estimating the number of sources impinging on an array of sensors, a problem of great interest in several applications. The performance of AIC degrades under low signal-to-noise ratio (SNR). This paper is concerned with the development and application of quadrature mirror filters (QMF) for improving the performance of AIC: a new system is proposed that estimates the number of sources by applying AIC to the outputs of a filter bank consisting of quadrature mirror filters, and it remains effective under low SNR.
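For context, a minimal sketch of the classical eigenvalue-based AIC estimator of Wax and Kailath, which systems like the one above build on, is given below; the QMF filter-bank preprocessing the paper proposes is not shown, and the snapshot model (M sensors, N samples) is an assumption of the sketch.

```python
# Classical Wax-Kailath AIC source-number estimator (sketch); the
# paper's QMF filter-bank preprocessing stage is not reproduced here.
import numpy as np

def aic_num_sources(snapshots):
    """snapshots: complex array of shape (M sensors, N samples)."""
    M, N = snapshots.shape
    R = snapshots @ snapshots.conj().T / N       # sample covariance matrix
    eig = np.sort(np.linalg.eigvalsh(R))[::-1]   # eigenvalues, descending
    aic = np.empty(M)
    for k in range(M):                           # candidate source counts
        tail = eig[k:]                           # M - k smallest eigenvalues
        geo = np.exp(np.mean(np.log(tail)))      # geometric mean
        arith = np.mean(tail)                    # arithmetic mean
        aic[k] = -2 * N * (M - k) * np.log(geo / arith) + 2 * k * (2 * M - k)
    return int(np.argmin(aic))                   # estimated number of sources
```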

Publication Date: Sun Nov 19 2017
Journal Name: Journal Of Al-qadisiyah For Computer Science And Mathematics
Image Compression based on Fixed Predictor Multiresolution Thresholding of Linear Polynomial Nearlossless Techniques

Image compression is a serious issue in computer storage and transmission; it simply makes efficient use of the redundancy embedded within an image itself, and it may additionally exploit the limitations of human vision or perception to discard imperceivable information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy embedded within the image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage technique is proposed: the first stage utilizes the lossy predictor model along with a multiresolution base and thresholding techniques, and the second incorporates near-lossless compression …
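Because the abstract is cut off, the sketch below shows only the generic predictor-plus-residual idea it describes: a fixed causal linear predictor followed by residual quantisation with a bounded per-pixel error, which is the usual sense of near-lossless coding. The predictor and tolerance are our assumptions, not the paper's two-stage scheme.

```python
# Fixed linear prediction with near-lossless residual quantisation
# (generic sketch; open-loop prediction is used for brevity, whereas a
# real codec predicts from reconstructed pixels so the decoder can
# mirror the encoder).
import numpy as np

def predict_residual(img, tol=2):
    """img: 2-D uint8 array; tol: maximum absolute reconstruction error."""
    img = img.astype(np.int32)
    pred = np.zeros_like(img)
    pred[1:, 1:] = (img[1:, :-1] + img[:-1, 1:]) // 2  # mean of left and upper
    pred[0, 1:] = img[0, :-1]                  # first row: left neighbour
    pred[1:, 0] = img[:-1, 0]                  # first column: upper neighbour
    res = img - pred
    step = 2 * tol + 1                         # step size bounds error by tol
    q = np.round(res / step).astype(np.int32)  # quantised residual to encode
    recon = pred + q * step                    # |img - recon| <= tol everywhere
    return q, recon
```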

Publication Date: Mon Feb 01 2016
Journal Name: Journal Of Engineering
Valuation of Construction Projects Based on Quantity Scale by Using Expert System

Evaluating the quality of construction projects is a topic that has become necessary because of the absence both of quantitative standards for measuring control works and of quality-evaluation standards in construction projects. At present, evaluation depends on the experience of the workers, which leads to apparent differences in the results.

The idea of this research is to establish standards for evaluating project quality in a dedicated system based on a quantitative scale rather than qualitative specification, and to build an expert system, “Crystal”, that applies this system so that engineers can evaluate the quality of their projects easily and more accurately.

Publication Date: Tue Aug 10 2021
Journal Name: Design Engineering
Lossy Image Compression Using Hybrid Deep Learning Autoencoder Based On K-mean Clustering

Image compression plays an important role in reducing the size and storage of data while significantly increasing the speed of its transmission through the Internet. It has been an important research topic for several decades, and recently, with the great successes achieved by deep learning in many areas of image processing, the use of deep learning in image compression has been increasing gradually. Deep neural networks have likewise achieved great success in processing and compressing images of various sizes. In this paper, we present a structure for image compression based on a Convolutional AutoEncoder (CAE) for deep learning, inspired by the diversity of the human eye …
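The abstract breaks off before the architecture details; the following is only a minimal convolutional autoencoder of the general kind described, written in PyTorch. All layer sizes are illustrative assumptions, and the k-means stage suggested by the title is indicated only by a comment.

```python
# Minimal convolutional autoencoder (CAE) for lossy image compression;
# layer sizes are illustrative assumptions, not the paper's design.
import torch
import torch.nn as nn

class CAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(            # image -> compact latent code
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 8, 3, stride=1, padding=1),   # 8-channel bottleneck
        )
        self.decoder = nn.Sequential(            # latent code -> reconstruction
            nn.ConvTranspose2d(8, 64, 3, stride=1, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        z = self.encoder(x)
        # A k-means codebook could quantise z here before decoding, which is
        # roughly where the clustering in the title would fit.
        return self.decoder(z)

model = CAE()
x = torch.rand(1, 3, 64, 64)                     # dummy RGB image batch
loss = nn.functional.mse_loss(model(x), x)       # reconstruction objective
```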

Publication Date: Tue Jun 01 2021
Journal Name: Baghdad Science Journal
Synthesis, Characterization and Gas Sensor Application of New Composite Based on MWCNTs:CoPc:Metal Oxide

The synthesis of a new substituted cobalt phthalocyanine (CoPc) was carried out from the starting material naphthalene-1,4,5,8-tetracarboxylic acid dianhydride (NDI) using a dry-process method. A metal-oxide (MO) alloy of 60% Ni3O4 - 40% Co3O4 was functionalized with multiwall carbon nanotubes (F-MWCNTs) to produce an F-MWCNTs/MO nanocomposite (E2), which was mixed with CoPc to yield F-MWCNT/CoPc/MO (E3). These composites were investigated using different analytical and spectrophotometric methods, such as 1H-NMR (0-18 ppm), FTIR spectroscopy in the range 400-4000 cm-1, powder X-ray diffraction (PXRD, 2θ = 10-80°), Raman spectroscopy (0-4000 cm-1), and UV-Visible …

Publication Date: Fri May 04 2018
Journal Name: Wireless Personal Communications
IFRS: An Indexed Face Recognition System Based on Face Recognition and RFID Technologies

Publication Date: Sun Feb 25 2024
Journal Name: Baghdad Science Journal
An exploratory study of history-based test case prioritization techniques on different datasets

In regression testing, test case prioritization (TCP) is a technique for ordering all the available test cases. TCP techniques can improve fault-detection performance, which is measured by the average percentage of fault detection (APFD). History-based TCP is one of the TCP techniques; it considers the history of past executions to prioritize test cases. Allocating equal priority to several test cases is a common problem for most TCP techniques, yet it has not been explored for history-based TCP. To work around it, most researchers resort to randomly ordering such test cases. This study investigates equal priority in history-based TCP techniques. The first objective is to implement …
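APFD, the metric named above, has a standard closed form: for n test cases and m faults, APFD = 1 − (TF1 + … + TFm)/(n·m) + 1/(2n), where TFi is the 1-based position of the first test that reveals fault i. A small self-contained sketch (the fault matrix is toy data, not from the study):

```python
# APFD for a prioritized test suite; standard formula, independent of
# any particular prioritization technique. Assumes every fault is
# detected by at least one test.
def apfd(order, fault_matrix):
    """order: test indices in execution order;
    fault_matrix[t][f]: True if test t detects fault f."""
    n, m = len(order), len(fault_matrix[0])
    first = [
        next(i + 1 for i, t in enumerate(order) if fault_matrix[t][f])
        for f in range(m)                      # first position revealing fault f
    ]
    return 1 - sum(first) / (n * m) + 1 / (2 * n)

# Toy example: 3 tests, 2 faults; running test 2 first reveals both early.
faults = [[False, False], [True, False], [True, True]]
print(apfd([2, 1, 0], faults))                 # 1 - 2/6 + 1/6 = 0.8333...
```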