Articles
A missing data imputation method based on salp swarm algorithm for diabetes disease

Most medical datasets suffer from missing data, due to the expense of some tests or human error while recording them. This issue degrades the performance of machine learning models because the values of some features will be missing. Therefore, specific methods are needed for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used for generating and imputing the missing values in the Pima Indian diabetes disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, support vector machine (SVM), K-nearest neighbour (KNN), and naïve Bayesian classifier (NBC), was enhanced compared to the dataset before applying the proposed method. Moreover, the results indicated that ISSA performed better than statistical imputation techniques such as deleting the samples with missing values or replacing the missing values with zeros, the mean, or random values.

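As an illustration of the idea, the sketch below uses a salp-swarm-style search to choose fill-in values for the missing cells of a feature matrix, scoring each candidate by the cross-validated accuracy of a KNN classifier. This is a minimal sketch, not the paper's exact ISSA; the population size, iteration count, and bounds handling are assumptions.

```python
# A minimal SSA-style imputation sketch (not the paper's exact ISSA):
# each salp encodes candidate values for all missing cells, and fitness is
# the cross-validated accuracy of a KNN classifier on the imputed data.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def impute_with_ssa(X, y, n_salps=20, n_iter=50):
    mask = np.isnan(X)                                # cells to impute
    cols = np.where(mask)[1]                          # column of each cell
    lb, ub = np.nanmin(X, axis=0)[cols], np.nanmax(X, axis=0)[cols]
    dim = mask.sum()
    salps = rng.uniform(lb, ub, size=(n_salps, dim))

    def fitness(v):
        Xf = X.copy()
        Xf[mask] = v
        return cross_val_score(KNeighborsClassifier(), Xf, y, cv=3).mean()

    scores = np.array([fitness(s) for s in salps])
    best, best_score = salps[scores.argmax()].copy(), scores.max()
    for t in range(n_iter):
        c1 = 2 * np.exp(-((4 * (t + 1) / n_iter) ** 2))   # exploration decay
        # Leader salp explores around the best solution found so far ("food").
        step = c1 * ((ub - lb) * rng.random(dim) + lb)
        signs = np.where(rng.random(dim) < 0.5, 1.0, -1.0)
        salps[0] = np.clip(best + signs * step, lb, ub)
        # Follower salps drift toward the salp ahead of them in the chain.
        for i in range(1, n_salps):
            salps[i] = np.clip((salps[i] + salps[i - 1]) / 2, lb, ub)
        scores = np.array([fitness(s) for s in salps])
        if scores.max() > best_score:
            best, best_score = salps[scores.argmax()].copy(), scores.max()
    Xf = X.copy()
    Xf[mask] = best
    return Xf
```

The same imputed matrix can then be fed to SVM and NBC to reproduce the kind of before/after comparison the abstract describes.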
Publication Date: Thu Dec 01 2016
Journal Name: Journal of Engineering
A Hybrid Coefficient Decimation-Interpolation Based Reconfigurable Low Complexity Filter Bank for Cognitive Radio

Non-uniform channelization is a crucial task in cognitive radio receivers for obtaining separate channels from the digitized wideband input signal at different intervals of time. The two main requirements in the channelizer are reconfigurability and low complexity. In this paper, a reconfigurable architecture based on a combination of the Improved Coefficient Decimation Method (ICDM) and the Coefficient Interpolation Method (CIM) is proposed. The proposed Hybrid Coefficient Decimation-Interpolation Method (HCDIM) based filter bank (FB) is able to realize the same number of channels realized using ICDM, but with the maximum decimation factor divided by the interpolation factor (L), which leads to less deterioration in the stop band at …

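To illustrate the building blocks, the sketch below applies coefficient decimation and coefficient interpolation to a prototype lowpass FIR filter and reports how the response edge moves. The prototype length and the factors D and L are assumed values, and this shows the textbook CDM/CIM behaviour rather than the paper's HCDIM architecture.

```python
# Coefficient decimation/interpolation on an assumed 129-tap prototype
# lowpass filter with assumed factors D = 4 and L = 2 (illustrative only).
import numpy as np
from scipy.signal import firwin, freqz

h = firwin(numtaps=129, cutoff=0.1)  # prototype lowpass (cutoff = 0.1*Nyquist)
D, L = 4, 2

# CDM-I: zero all but every D-th coefficient -> multiband response with
# passbands repeating every 2*pi/D rad/sample.
h_cdm1 = np.where(np.arange(h.size) % D == 0, h, 0.0)

# CDM-II: keep only every D-th coefficient -> passband width stretched by D.
h_cdm2 = h[::D]

# CIM: insert L-1 zeros between coefficients -> response compressed by L,
# with spectral images that the hybrid method must mask.
h_cim = np.zeros(h.size * L)
h_cim[::L] = h

for name, taps in [("prototype", h), ("CDM-I", h_cdm1),
                   ("CDM-II", h_cdm2), ("CIM", h_cim)]:
    w, H = freqz(taps, worN=4096)
    mag = np.abs(H) / np.abs(H).max()
    edge = w[np.argmax(mag < 0.5)] / np.pi   # first half-magnitude crossing
    print(f"{name:9s} taps={taps.size:4d}  first 0.5-mag crossing at {edge:.3f}*pi")
```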
Publication Date: Wed Jan 30 2013
Journal Name: Al-Kindy College Medical Journal
Chronic Kidney Disease and Risk of Coronary Artery Disease: A Prospective Study

Abstract. Background: Reduced glomerular filtration rate is associated with increased morbidity in patients with coronary artery disease. Objectives: To analyze declining eGFR and mortality risks in patients with chronic kidney disease who have had coronary artery disease, including risk factors. Patients and Methods: The study included 160 patients between the ages of 16 and 87 years. Glomerular filtration rate was estimated (eGFR) using the Modification of Diet in Renal Disease equation and was categorized into the ranges <60 mL/min/1.73 m² and ≥60 mL/min/1.73 m². Baseline risk factors were analyzed by category of eGFR. The studied patients in the emergency department were investigated using Cox proportional hazard models adjusting for traditional …

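For concreteness, the snippet below sketches the four-variable MDRD estimate and the <60 / ≥60 mL/min/1.73 m² split used in the study. The coefficients are the commonly cited IDMS-traceable values, stated here as an assumption since the abstract does not list them.

```python
# Hedged sketch of the four-variable MDRD eGFR estimate; coefficients are
# the commonly cited IDMS-traceable values, assumed rather than taken from
# the paper. Serum creatinine is in mg/dL.
def egfr_mdrd(scr_mg_dl: float, age: float, female: bool, black: bool = False) -> float:
    egfr = 175.0 * scr_mg_dl ** -1.154 * age ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

def egfr_category(egfr: float) -> str:
    """Categorize into the study's two eGFR ranges (mL/min/1.73 m^2)."""
    return "<60" if egfr < 60.0 else ">=60"

print(egfr_category(egfr_mdrd(scr_mg_dl=1.8, age=67, female=True)))  # "<60"
```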
Publication Date: Sun Feb 28 2021
Journal Name: Journal of Economics and Administrative Sciences
Using the jackknife to estimate a logistic regression model for breast cancer disease

Logistic regression is one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X), where the dependent variable is a binary response taking two values (one when a specific event occurred and zero when it did not), such as (injured and uninjured, married and unmarried). A large number of explanatory variables leads to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and the ridge regression method were used in estimating a binary-response logistic regression model by adopting the jackknife …

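A minimal sketch of the jackknife applied to logistic regression coefficients follows, assuming scikit-learn's LogisticRegression as the base estimator; the paper's own pairing of the jackknife with maximum likelihood and ridge estimation may differ in detail.

```python
# Jackknife (leave-one-out) estimation for logistic regression coefficients.
# Illustrative sketch; base estimator and solver settings are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

def jackknife_logit(X, y):
    n, p = X.shape
    full = LogisticRegression(max_iter=1000).fit(X, y).coef_.ravel()
    loo = np.empty((n, p))
    for i in range(n):                     # refit with observation i held out
        keep = np.arange(n) != i           # (assumes both classes survive)
        loo[i] = LogisticRegression(max_iter=1000).fit(X[keep], y[keep]).coef_.ravel()
    mean_loo = loo.mean(axis=0)
    bias = (n - 1) * (mean_loo - full)                 # jackknife bias estimate
    se = np.sqrt((n - 1) / n * ((loo - mean_loo) ** 2).sum(axis=0))
    return full - bias, se       # bias-corrected coefficients and their SEs
```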
Publication Date: Wed Jan 01 2020
Journal Name: International Journal of Computational Intelligence Systems
Evolutionary Feature Optimization for Plant Leaf Disease Detection by Deep Neural Networks

Publication Date: Sun Oct 01 2017
Journal Name: Journal of the Faculty of Medicine Baghdad
Pre-operative serum TSH level estimation for predicting malignant nodular thyroid disease

Publication Date: Sat Aug 01 2015
Journal Name: 2015 IEEE Conference on Computational Intelligence in Bioinformatics and Computational Biology (CIBCB)
Granular computing approach for the design of medical data classification systems

Publication Date: Tue Nov 01 2016
Journal Name: IOSR Journal of Computer Engineering
Implementation of a New Secure Mechanism for Data Deduplication in Hybrid Cloud

Cloud computing provides a huge amount of space for data storage, but as the number of users and the size of their data increase, the cloud storage environment faces serious problems such as saving storage space, managing this large volume of data, and ensuring the security and privacy of data. One important method of saving space in cloud storage is data deduplication, a compression technique that allows only one copy of the data to be saved and eliminates the extra copies. To offer security and privacy for sensitive data while supporting deduplication, in this work attacks that exploit hybrid cloud deduplication have been identified, allowing an attacker to gain access to the files of other users based on very small hash signatures of …

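The sketch below shows the baseline being attacked: a content-addressed store that keeps one physical copy per content hash. The class and method names are hypothetical, and the paper's scheme additionally hardens this against hash-manipulation attacks, which a bare lookup like this does not.

```python
# Minimal content-addressed deduplication: one stored copy per content hash.
# Hypothetical names; illustrative only, not the paper's secure mechanism.
import hashlib

class DedupStore:
    def __init__(self):
        self.blocks = {}   # sha256 digest -> file bytes (single copy)
        self.owners = {}   # digest -> set of user ids with access

    def upload(self, user: str, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()  # full-content hash, not the
        if digest not in self.blocks:              # very small signature the
            self.blocks[digest] = data             # described attack exploits
        self.owners.setdefault(digest, set()).add(user)
        return digest

    def download(self, user: str, digest: str) -> bytes:
        if user not in self.owners.get(digest, set()):
            raise PermissionError("user does not own this content")
        return self.blocks[digest]

store = DedupStore()
d1 = store.upload("alice", b"report v1")
d2 = store.upload("bob", b"report v1")   # deduplicated: same digest, one copy
assert d1 == d2 and len(store.blocks) == 1
```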
Publication Date: Fri Apr 26 2019
Journal Name: Journal of Contemporary Medical Sciences
Breast Cancer Decisive Parameters for Iraqi Women via Data Mining Techniques

Objective: This research investigates real breast cancer data for Iraqi women, acquired manually from several Iraqi hospitals of early detection for breast cancer. Data mining techniques are used to discover hidden knowledge, unexpected patterns, and new rules from the dataset, which contains a large number of attributes. Methods: Data mining techniques handle the redundant or simply irrelevant attributes to discover interesting patterns. The dataset is processed via the Weka (Waikato Environment for Knowledge Analysis) platform. The OneR technique is used as a machine learning classifier to evaluate the worth of each attribute according to the class value. Results: The evaluation is performed using …

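As a sketch of the OneR idea the study relied on, the snippet below picks the single attribute whose per-value majority rule makes the fewest training errors. The attribute names and rows are hypothetical toy data, and this is not Weka's implementation.

```python
# Minimal OneR ("one rule") learner for categorical data: for each attribute,
# build a value -> majority-class rule and keep the attribute with the fewest
# training errors. Hypothetical toy dataset; illustrative only.
from collections import Counter

def one_r(rows, target):
    """rows: list of dicts of categorical attributes including `target`."""
    best = None
    for attr in rows[0]:
        if attr == target:
            continue
        by_value = {}                      # majority class per attribute value
        for r in rows:
            by_value.setdefault(r[attr], Counter())[r[target]] += 1
        rule = {v: c.most_common(1)[0][0] for v, c in by_value.items()}
        errors = sum(r[target] != rule[r[attr]] for r in rows)
        if best is None or errors < best[2]:
            best = (attr, rule, errors)
    return best    # (chosen attribute, value -> class rule, training errors)

rows = [
    {"tumor_size": "large", "family_history": "yes", "class": "malignant"},
    {"tumor_size": "large", "family_history": "no",  "class": "malignant"},
    {"tumor_size": "small", "family_history": "yes", "class": "benign"},
    {"tumor_size": "small", "family_history": "no",  "class": "benign"},
]
attr, rule, errs = one_r(rows, "class")
print(attr, rule, errs)  # tumor_size {'large': 'malignant', 'small': 'benign'} 0
```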
Publication Date: Wed Jun 01 2022
Journal Name: Bulletin of Electrical Engineering and Informatics
Proposed model for data protection in information systems of government institutions

Information systems and data exchange between government institutions are growing rapidly around the world, and with them, the threats to information within government departments are growing. In recent years, research into the development and construction of secure information systems in government institutions appears to have been very active. Based on information system principles, this study proposes a model for providing and evaluating security for all departments of government institutions. The requirements of any information system begin with the organization's surroundings and objectives. Most prior techniques did not take into account the organizational component on which the information system runs, despite the relevance of …
