A proposal to detect computer worms (malicious codes) using data mining classification algorithms

Malicious software (malware) performs a malicious function that compromises a computer system's security. Many methods have been developed to improve the security of computer system resources, among them firewalls, encryption, and Intrusion Detection Systems (IDS). An IDS can detect a newly encountered, unrecognized attack attempt and raise an early alarm to inform the system about the suspicious intrusion. This paper proposes a hybrid IDS for detecting intrusions, especially malware, considering both network-packet and host features. The hybrid IDS is designed using Data Mining (DM) classification methods, chosen for their ability to detect new, previously unseen intrusions accurately and automatically. It uses both anomaly and misuse detection techniques with two DM classifiers, the Interactive Dichotomizer 3 (ID3) classifier and the Naïve Bayesian (NB) classifier, to verify the validity of the proposed system in terms of accuracy. A proposed HybD dataset was used in training and testing the hybrid IDS. Feature selection is used to restrict the classification decision to intrinsic features; this was accomplished with three different measures: the Association Rules (AR) method, the ReliefF measure, and the Gain Ratio (GR) measure. The NB classifier with the AR method gave the most accurate classification results (99%), with a false positive (FP) rate of 0% and a false negative (FN) rate of 1%.
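The Naïve Bayesian classifier named above is standard, so it can be sketched in a few lines. The following is a minimal categorical NB with Laplace smoothing on invented packet/host features; the feature names and toy rows are illustrative assumptions, not the HybD dataset:

```python
from collections import Counter, defaultdict
import math

def train_nb(rows, labels):
    """Categorical Naive Bayes with Laplace smoothing.
    rows: list of feature tuples; labels: parallel class labels."""
    classes = Counter(labels)
    # counts[c][i][v] = times feature i took value v within class c
    counts = {c: defaultdict(Counter) for c in classes}
    for row, y in zip(rows, labels):
        for i, v in enumerate(row):
            counts[y][i][v] += 1
    n_features = len(rows[0])
    values = [{r[i] for r in rows} for i in range(n_features)]
    def predict(row):
        best, best_lp = None, float("-inf")
        total = sum(classes.values())
        for c, nc in classes.items():
            lp = math.log(nc / total)          # class prior
            for i, v in enumerate(row):        # smoothed likelihoods
                lp += math.log((counts[c][i][v] + 1) / (nc + len(values[i])))
            if lp > best_lp:
                best, best_lp = c, lp
        return best
    return predict

# Toy packet/host features: (protocol, port_class, payload_entropy_bin)
rows = [("tcp", "high", "low"), ("tcp", "low", "high"), ("udp", "high", "low"),
        ("tcp", "low", "high"), ("udp", "low", "high"), ("tcp", "high", "low")]
labels = ["normal", "worm", "normal", "worm", "worm", "normal"]
predict = train_nb(rows, labels)
print(predict(("tcp", "low", "high")))   # → worm
```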

Publication Date
Sat Sep 27 2025
Journal Name
Journal Of Administration And Economics
Bayesian Method in Classification Regression Tree to estimate nonparametric additive model compared with Logistic Model with Application

The Bayesian approach holds promise for the classification regression tree model: it can exploit prior information, both when assembling the trees over all the explanatory variables together and at every individual stage, and it yields posterior information at each node during construction of the classification tree. Although Bayesian estimates are generally accurate, the logistic model remains a good competitor for binary responses thanks to its flexibility and mathematical representation. Three methods are therefore applied to the data: the logistic model, the classification regression tree model, and the Bayesian regression tree model.
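For reference, the logistic competitor mentioned above can be sketched with plain gradient descent; the single covariate and binary labels below are invented for illustration, and the paper's Bayesian CART fitting is not reproduced:

```python
import math

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Plain logistic regression fitted by batch gradient descent."""
    w = [0.0] * (len(X[0]) + 1)                # bias + weights
    for _ in range(steps):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1 / (1 + math.exp(-z))         # sigmoid probability
            err = p - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / len(X) for wj, g in zip(w, grad)]
    return w

def predict_prob(w, xi):
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1 / (1 + math.exp(-z))

# Toy binary-response data: single covariate with a separable trend
X = [[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]]
y = [0, 0, 0, 1, 1, 1]
w = fit_logistic(X, y)
print(round(predict_prob(w, [0.5]), 2), round(predict_prob(w, [4.5]), 2))
```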

Publication Date
Fri Mar 15 2024
Journal Name
Journal Of Baghdad College Of Dentistry
A clinicopathological analysis of 151 odontogenic tumors based on new WHO classification 2022: A retrospective cross-sectional study

Background: Odontogenic tumors are a diverse group of lesions with a variety of clinical behaviors and histopathologic subtypes, ranging from hamartomatous and benign to malignant. The study aimed to examine the clinical and pathological features of odontogenic tumors in Baghdad over the last 11 years (2011–2021). Materials and Methods: This retrospective study analyzed all formalin-fixed, paraffin-embedded tissue blocks of patients diagnosed with an odontogenic tumor that were retrieved from the archives of a teaching hospital/College of Dentistry at the University of Baghdad, Iraq, between 2011 and 2021. The diagnosis of each case was confirmed by examination of the hematoxylin- and eosin-stained sections by two expert pathologists. Data from pati…

Publication Date
Wed Aug 12 2020
Journal Name
International Journal On Advanced Science, Engineering And Information Technology
Developing of a 3D Printer to Produce Parts Using Powder Metal

Publication Date
Sun Jun 01 2014
Journal Name
Baghdad Science Journal
Computer aided photographic memory enhancement and speed reading (case study)

This work aimed at the design and testing of a computer program for eyeQ improvement, photographic-memory enhancement, and speed reading, to bring the common reading speed of 150–250 words per minute (WPM) closer to the mind's processing and eye-snapshot capacity of 5000 WPM. The package was built with Visual Basic 6. The efficiency of the designed program was tested on 10 persons of different ages and education levels. The results show an increase in reading speed of approximately 25% in the first month of training, with noticeable enhancement of memory, as well as an increased ability to read for longer periods without feeling nervous or bored; a nonlinear, continuous increase in reading speed is assured after the first month…

Publication Date
Mon Mar 13 2017
Journal Name
Journal Of Baghdad College Of Dentistry
Computer Assisted Immunohistochemical Score Prediction Via Simplified Image Acquisition Technique

Background: Techniques of image analysis have been used extensively to minimize interobserver variation in immunohistochemical scoring; yet image acquisition procedures are often demanding, expensive, and laborious. This study aims to assess the validity of image analysis for predicting a human observer's score with a simplified image acquisition technique. Materials and methods: Formalin-fixed, paraffin-embedded tissue sections of ameloblastomas and basal cell carcinomas were immunohistochemically stained with monoclonal antibodies to MMP-2 and MMP-9. The extent of antibody positivity was quantified using an ImageJ®-based application on low-power photomicrographs obtained with a conventional camera. Results of the software were employed…
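As a rough illustration of this kind of quantification, the sketch below computes a positivity percentage by simple thresholding of a synthetic image. The threshold value, the toy image, and the "darker pixel = stronger stain" convention are all assumptions; real IHC analysis typically involves colour deconvolution, which is not shown:

```python
import numpy as np

def positivity_percent(stain_channel, threshold=100):
    """Percent of pixels counted positive by thresholding a
    stain-intensity channel (darker than `threshold` = positive)."""
    positive = stain_channel < threshold
    return 100.0 * positive.sum() / stain_channel.size

# Synthetic 10x10 "photomicrograph": one strongly stained quadrant
img = np.full((10, 10), 200, dtype=np.uint8)  # pale background
img[:5, :5] = 60                              # dark, stained region
print(positivity_percent(img))                # → 25.0
```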

Publication Date
Sun Sep 24 2023
Journal Name
Journal Of Al-qadisiyah For Computer Science And Mathematics
Iris Data Compression Based on Hexa-Data Coding

Iris research focuses on developing techniques for identifying and locating relevant biometric features, and on accurate segmentation and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method proceeds in a number of steps: converting the grayscale eye image to a bit-plane representation, selecting the most significant bit planes, and then parameterizing the iris location, resulting in an accurate segmentation of the iris from the origin…
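The first two steps, bit-plane decomposition and selection of the most significant plane, can be sketched as follows. The tiny synthetic "eye" image is invented, and the paper's parameterized iris localisation is not reproduced:

```python
import numpy as np

def bit_planes(gray):
    """Decompose an 8-bit grayscale image into its 8 bit planes.
    Plane 7 is the most significant."""
    return [((gray >> b) & 1).astype(np.uint8) for b in range(8)]

# Tiny synthetic image: dark "pupil" block inside a brighter region
img = np.full((8, 8), 180, dtype=np.uint8)   # bright surround
img[2:6, 2:6] = 40                           # dark central block
planes = bit_planes(img)
msb = planes[7]                              # most significant plane
print(msb)   # the dark block already stands out as zeros
```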

Publication Date
Sun Oct 01 2017
Journal Name
International Journal Of Computer Science And Information Security (ijcsis)
Finite State Automata Generator for DNA Motif Template as Preparation Step for Motif Mining

There are many tools and software systems for generating finite state automata (FSA), owing to their importance in modeling and simulation and their wide variety of applications. However, no appropriate tool exists that can generate an FSA for a DNA motif template, because of the huge size of the motif template and because of the optional paths in the motif structure, which are represented by the gap. For these reasons, the specifications of the automata to be generated are unavailable, and this absence of specifications makes the generation process very difficult. This paper presents a novel algorithm to construct FSAs for DNA motif templates; it is the first work to address the problem of generating FSAs for DNA motif templates.
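A toy version of such a motif automaton, with the gap modelled as optional wildcard positions and matching done by NFA-style state-set simulation, might look like the sketch below. The template syntax and the example sequences are assumptions for illustration, not the paper's algorithm:

```python
def match_template(template, text):
    """NFA simulation for a motif template. Each element is a base
    ('A','C','G','T'), 'N' (any base), or 'N?' (optional any base,
    modelling a gap position). True if some window of text matches."""
    n = len(template)
    def eps_closure(states):
        out = set(states)
        changed = True
        while changed:
            changed = False
            for s in list(out):
                # skipping an optional position is an epsilon move
                if s < n and template[s] == 'N?' and s + 1 not in out:
                    out.add(s + 1)
                    changed = True
        return out
    for start in range(len(text)):
        active = eps_closure({0})
        for ch in text[start:]:
            nxt = set()
            for s in active:
                if s < n and (template[s] in ('N', 'N?') or template[s] == ch):
                    nxt.add(s + 1)
            active = eps_closure(nxt)
            if n in active:          # reached the accepting state
                return True
            if not active:
                break
    return False

# Motif: "AC", then a gap of 0-2 arbitrary bases, then "GT"
template = ['A', 'C', 'N?', 'N?', 'G', 'T']
print(match_template(template, "TTACAGTT"))   # gap of 1 → True
print(match_template(template, "ACTTTGT"))    # gap of 3 → False
```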

Publication Date
Sat Dec 30 2023
Journal Name
Journal Of Economics And Administrative Sciences
The Cluster Analysis by Using Nonparametric Cubic B-Spline Modeling for Longitudinal Data

Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.

In this research, the focus was on compiling and analyzing such data: cluster analysis plays an important role in identifying and grouping co-expressed profiles over time, and it is employed here with the nonparametric smoothing cubic B-spline model. That model provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope; it is also more flexible and can pick up more complex patterns and fluctuations in the data.
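The smoothness property described above (continuous first and second derivatives) can be illustrated with a natural cubic spline fitted to one toy longitudinal profile. This is a sketch only: the paper uses nonparametric cubic B-splines, and the data below are invented:

```python
def natural_cubic_spline(xs, ys):
    """Natural cubic spline through (xs, ys) with equal spacing h.
    Continuous first and second derivatives; M_0 = M_n = 0."""
    n = len(xs) - 1
    h = xs[1] - xs[0]
    # Tridiagonal system for interior second derivatives M_1..M_{n-1}
    a = [1.0] * (n - 1)                  # sub-diagonal
    b = [4.0] * (n - 1)                  # diagonal
    c = [1.0] * (n - 1)                  # super-diagonal
    d = [6.0 * (ys[i - 1] - 2 * ys[i] + ys[i + 1]) / h**2
         for i in range(1, n)]
    # Thomas algorithm: forward sweep, then back substitution
    for i in range(1, n - 1):
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    M = [0.0] * (n + 1)
    if n > 1:
        M[n - 1] = d[-1] / b[-1]
        for i in range(n - 2, 0, -1):
            M[i] = (d[i - 1] - c[i - 1] * M[i + 1]) / b[i - 1]
    def S(x):
        i = min(max(int((x - xs[0]) // h), 0), n - 1)
        t0, t1 = xs[i + 1] - x, x - xs[i]
        return (M[i] * t0**3 / (6 * h) + M[i + 1] * t1**3 / (6 * h)
                + (ys[i] / h - M[i] * h / 6) * t0
                + (ys[i + 1] / h - M[i + 1] * h / 6) * t1)
    return S

# One longitudinal profile measured at 5 equally spaced times
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 2.2, 2.9, 3.1, 4.3]
S = natural_cubic_spline(xs, ys)
print(round(S(2.0), 2))   # interpolates the knot value: 2.9
```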

The longitudinal balanced data profile was compiled into subgroup…

Publication Date
Wed Jul 01 2020
Journal Name
Indonesian Journal Of Electrical Engineering And Computer Science
Fast and robust approach for data security in communication channel using pascal matrix

This paper presents a fast and robust approach to English text encryption and decryption based on the Pascal matrix. The technique encrypts Arabic or English text, or both, and shows the result of applying the method to the plain text (the original message): the intelligible plaintext is made unintelligible in order to secure the information from unauthorized access and theft. The encryption scheme uses a pseudo-random encryption key generated by an algorithm, and all of this is done using the Pascal matrix. Encryption and decryption are implemented in MATLAB as the programming language, with Notepad++ used to write the input text.
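The round-trip idea can be sketched with the lower-triangular Pascal matrix, whose exact integer inverse makes decryption a second matrix multiplication. This is a minimal sketch, not the paper's MATLAB scheme; the modulus 256 and the toy message are assumptions:

```python
import math

def pascal_lower(n):
    """Lower-triangular Pascal matrix: P[i][j] = C(i, j)."""
    return [[math.comb(i, j) if j <= i else 0 for j in range(n)]
            for i in range(n)]

def pascal_lower_inv(n):
    """Exact integer inverse: entries (-1)^(i-j) * C(i, j)."""
    return [[(-1) ** (i - j) * math.comb(i, j) if j <= i else 0
             for j in range(n)] for i in range(n)]

def matvec_mod(M, v, m=256):
    return [sum(M[i][j] * v[j] for j in range(len(v))) % m
            for i in range(len(v))]

def encrypt(text):
    v = [ord(c) for c in text]
    return matvec_mod(pascal_lower(len(v)), v)

def decrypt(codes):
    v = matvec_mod(pascal_lower_inv(len(codes)), codes)
    return "".join(chr(x) for x in v)

ct = encrypt("HELLO")
print(ct)
print(decrypt(ct))   # round-trips to "HELLO"
```

Because the inverse is exact over the integers, the multiplication modulo 256 cancels perfectly for any byte-range plaintext.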

Publication Date
Sun Jan 01 2017
Journal Name
Iraqi Journal Of Science
Strong Triple Data Encryption Standard Algorithm using Nth Degree Truncated Polynomial Ring Unit

Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important part of cryptography with secret-key algorithms, is the key: for a high level of secure communication, the key plays an essential role. To increase the level of security in any communication, both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make this algorithm more secure, effective, and strong. The enhanced encryption key strengthens the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms to…
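One simple way to avoid weak or repeated 3DES subkeys, deriving three distinct keys from a single master secret with a hash, can be sketched as below. This is illustrative only; the paper instead reconfigures the key using an Nth-degree truncated polynomial ring unit, which is not reproduced here:

```python
import hashlib

def derive_3des_keys(master_secret: bytes):
    """Derive three distinct 8-byte subkeys for Triple DES from one
    master secret using SHA-256 with a counter (an illustrative
    key-strengthening step, not the paper's NTRU-based method)."""
    keys = []
    for i in range(3):
        digest = hashlib.sha256(master_secret + bytes([i])).digest()
        keys.append(digest[:8])
    # K1 == K2 or K2 == K3 would collapse 3DES to single DES
    assert len({keys[0], keys[1], keys[2]}) == 3
    return keys

k1, k2, k3 = derive_3des_keys(b"shared secret")
print(len(k1), k1 != k2, k2 != k3)   # 8 True True
```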
