Intrusion Detection System Using Data Stream Classification

Secure data communication across networks is constantly threatened by intrusion and abuse. A network Intrusion Detection System (IDS) is a valuable tool for in-depth defense of computer networks. Most research and applications in the field of intrusion detection have been built by analysing datasets of known attack types with batch machine-learning classification. The present study presents an intrusion detection system based on data stream classification instead. Several data stream algorithms were applied to the CICIDS2017 dataset, which contains several new types of attack, and the results were evaluated to choose the algorithm that best combines high accuracy with low computation time.
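As a minimal illustration of the data-stream setting (not one of the algorithms benchmarked in the paper), the sketch below implements an incremental Gaussian naive Bayes classifier that learns one labelled flow record at a time, as a stream classifier must; the feature name and values are hypothetical.

```python
import math
from collections import defaultdict

class StreamingGaussianNB:
    """Incremental Gaussian naive Bayes: learns one labelled example at a
    time, with no batch retraining, as required in the stream setting."""

    def __init__(self):
        self.counts = defaultdict(int)                      # examples seen per class
        self.mean = defaultdict(lambda: defaultdict(float)) # running feature means
        self.m2 = defaultdict(lambda: defaultdict(float))   # sum of squared deviations

    def learn_one(self, x, y):
        self.counts[y] += 1
        n = self.counts[y]
        for i, v in x.items():
            delta = v - self.mean[y][i]
            self.mean[y][i] += delta / n                    # Welford's online update
            self.m2[y][i] += delta * (v - self.mean[y][i])

    def predict_one(self, x):
        total = sum(self.counts.values())
        best, best_score = None, float("-inf")
        for y, n in self.counts.items():
            score = math.log(n / total)                     # log prior
            for i, v in x.items():
                var = self.m2[y][i] / n + 1e-9              # smoothed variance
                score += (-0.5 * math.log(2 * math.pi * var)
                          - (v - self.mean[y][i]) ** 2 / (2 * var))
            if score > best_score:
                best, best_score = y, score
        return best
```

The same `learn_one`/`predict_one` loop is the interface shape used by stream-learning libraries, so the batch-vs-stream contrast the abstract draws is visible directly in the API.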

Publication Date
Mon Oct 01 2018
Journal Name
International Journal Of Electrical And Computer Engineering
Load balance in data center SDN networks

In the last two decades, networks have changed rapidly as their requirements evolved. Current data center networks host large numbers of machines (tens of thousands) with special bandwidth needs, as cloud networking and multimedia content computing keep growing. Conventional Data Center Networks (DCNs) are strained by the increasing numbers of users and bandwidth requirements, which expose many implementation limitations. Current networking devices, with their coupled control and forwarding planes, yield network architectures that are not suitable for dynamic computing and storage needs. Software Defined Networking (SDN) was introduced to change this notion of traditional networks by decoupling control and …
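The abstract is truncated, but the decoupling it describes is what lets a logically centralized controller place flows with a global view of load. A toy sketch of controller-side least-loaded flow assignment (the server names and the flow abstraction are hypothetical):

```python
class LeastLoadedBalancer:
    """Toy SDN-controller view: with global visibility of server load,
    each new flow is steered to the currently least-loaded server."""

    def __init__(self, servers):
        self.load = {s: 0 for s in servers}          # active flows per server

    def assign(self):
        server = min(self.load, key=self.load.get)   # global least-loaded pick
        self.load[server] += 1
        return server

    def release(self, server):
        self.load[server] -= 1                       # flow finished
```

A traditional switch, seeing only its own ports, cannot make this globally least-loaded choice; the controller's global `load` map is the point of the SDN architecture.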

Publication Date
Wed Feb 01 2017
Journal Name
Journal Of Economics And Administrative Sciences
A comparison between the logistic regression model and Linear Discriminant analysis using Principal Component unemployment data for the province of Baghdad

     The objective of the study is to compare the predictive ability of the logistic regression model and the linear discriminant function, first on the original data and then after using principal components to reduce the dimensionality of the variables. The data come from the 2012 socio-economic survey of families in the province of Baghdad and comprise a sample of 615 observations on 13 variables, 12 of them explanatory; the dependent variable is the number of workers and unemployed.

     The comparison of the two methods showed that the logistic regression model performs better than the linear discriminant function …
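On a toy one-dimensional dataset (not the Baghdad survey data) the two competing methods can be sketched side by side; `fit_logistic` uses plain stochastic gradient descent and `fit_lda` a midpoint-of-means discriminant with equal priors assumed:

```python
import math

def fit_logistic(xs, ys, lr=0.1, epochs=500):
    """Logistic regression on one feature via stochastic gradient descent."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # predicted P(y=1 | x)
            w += lr * (y - p) * x                     # ascend the log-likelihood
            b += lr * (y - p)
    return w, b

def fit_lda(xs, ys):
    """Two-class linear discriminant on one feature: with equal priors the
    decision threshold is the midpoint of the two class means."""
    g0 = [x for x, y in zip(xs, ys) if y == 0]
    g1 = [x for x, y in zip(xs, ys) if y == 1]
    m0, m1 = sum(g0) / len(g0), sum(g1) / len(g1)
    return (m0 + m1) / 2.0                            # classify to 1 past this
```

On well-separated data both achieve the same accuracy; the comparison in the paper concerns their behaviour on the real survey data, which this sketch does not reproduce.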

Publication Date
Wed Mar 30 2022
Journal Name
Journal Of Economics And Administrative Sciences
Using Quadratic Form Ratio Multiple Test to Estimate Linear Regression Model Parameters in Big Data with Application: Child Labor in Iraq

              The current paper proposes a new estimator for the linear regression model parameters under Big Data circumstances. The diversity of Big Data variables raises many challenges that interest researchers trying to find new and novel methods to estimate the parameters of the linear regression model. The data were collected by the Central Statistical Organization, Iraq, and child labor in Iraq was chosen as the application. Child labor is a vital phenomenon from which both society and education suffer, and it affects the future of the next generation. Two methods have been selected to estimate the parameter …
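The paper's quadratic-form-ratio estimator itself is not reproduced here; as a baseline, the closed-form ordinary least squares fit that any new estimator of the linear model's parameters would be compared against looks like this (toy data, not the child-labour survey):

```python
def ols_fit(xs, ys):
    """Closed-form OLS for the simple linear model y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))   # slope: cov(x, y) / var(x)
    a = my - b * mx                           # intercept through the means
    return a, b
```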

Publication Date
Thu Nov 30 2023
Journal Name
Iraqi Geological Journal
Multiple and Coherent Noise Removal from X-Profile 2D Seismic Data of Southern Iraq Using Normal Move Out-Frequency Wavenumber Technique

Multiple elimination (de-multiple) is one of the seismic processing steps used to remove the effects of multiples and delineate the correct primary reflectors. Applying normal move-out to flatten the primaries is the way to eliminate multiples after transforming the data to the frequency-wavenumber (f-k) domain: the flattened primaries align with the zero axis of the f-k domain, while other reflection types (multiples and random noise) are distributed elsewhere. A dip filter applied to pass the aligned data and reject the others then separates primaries from multiples once the data are transformed back from the f-k domain to the time-distance domain. For that reason, the suggested name for this technique is normal move out-frequency wavenumber …
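The flattening step relies on the hyperbolic move-out of a primary reflection; a minimal sketch of the travel-time relation (the velocity and offsets below are illustrative values in consistent units, not from the X-profile data):

```python
import math

def nmo_time(t0, offset, velocity):
    """Reflection travel time t(x) = sqrt(t0^2 + (x/v)^2); subtracting t0
    gives the normal move-out correction that flattens a primary event."""
    return math.sqrt(t0 ** 2 + (offset / velocity) ** 2)
```

After this correction a primary plots at constant time for all offsets, which is what aligns it with the zero-wavenumber axis in the f-k domain; multiples, corrected with the wrong velocity, retain residual move-out and land elsewhere.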

Publication Date
Fri Sep 17 2021
Journal Name
Journal Of Petroleum Exploration And Production Technology
Characterization of flow units, rock and pore types for Mishrif Reservoir in West Qurna oilfield, Southern Iraq by using lithofacies data
This study tested three different models to determine rock types, pore-throat radius, and flow units for the Mishrif Formation in the West Qurna oilfield, southern Iraq, based on full-diameter Mishrif cores from 20 wells. The three models used were the Lucia rock-type classification; the Winland plot, utilized to determine the pore-throat radius from the mercury injection test (r35); and the Flow Zone Indicator (FZI) concept to identify flow units, which made it possible to recognize the differences between the Mishrif units in these three categories. The study of pore characteristics is very significant in reservoir evaluation, since it controls the storage mechanism and reservoir fluid properties …
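The FZI concept the study applies follows the standard definitions of Amaefule et al.; a small sketch (porosity as a fraction, permeability in millidarcies, FZI in microns):

```python
import math

def flow_zone_indicator(phi, k_md):
    """FZI = RQI / phi_z, with reservoir quality index
    RQI = 0.0314 * sqrt(k/phi) (microns) and normalized
    porosity phi_z = phi / (1 - phi)."""
    rqi = 0.0314 * math.sqrt(k_md / phi)
    phi_z = phi / (1.0 - phi)
    return rqi / phi_z
```

Core samples with similar FZI values are grouped into one hydraulic flow unit, which is how the concept separates the Mishrif units described above.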
Publication Date
Sun Mar 02 2008
Journal Name
Baghdad Science Journal
Tamper Detection in Color Image

In this work a fragile watermarking scheme is presented. The scheme is applied to digital color images in the spatial domain. The image is divided into blocks, and each block has its authentication mark embedded in it, so we can determine which parts of the image are authentic and which parts have been modified. Authentication is carried out without needing the original image. The results show that the quality of the watermarked image remains very good and that the watermark survives some types of unintended modification, such as processing by familiar compression software like WinRAR and ZIP.
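A toy version of the block-based idea (a parity bit stands in for the paper's actual authentication mark, which the abstract does not specify): each block's mark is written into the least significant bit of its first pixel, and verification recomputes it per block.

```python
def embed(blocks):
    """Store a 1-bit parity of each block's 7 high bit-planes in the
    LSB of the block's first pixel (a stand-in authentication mark)."""
    marked = []
    for block in blocks:
        parity = sum(p >> 1 for p in block) % 2         # mark from high bits only
        marked.append([(block[0] & ~1) | parity] + block[1:])
    return marked

def verify(blocks):
    """Per-block check: True where the stored mark still matches."""
    return [(sum(p >> 1 for p in b) % 2) == (b[0] & 1) for b in blocks]
```

Because each block carries its own mark, a failed check localizes the tampering to that block, which is the per-region authentication property the abstract describes; a single parity bit is of course far weaker than a real authentication mark.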

Publication Date
Wed Jan 01 2020
Journal Name
Journal Of Southwest Jiaotong University
Image Segmentation for Skin Detection

Human skin detection, usually performed before further image processing, is the task of discovering skin-colored pixels and regions that may belong to human faces or limbs in videos or photos. Many computer vision approaches have been developed for skin detection. A skin detector usually transforms a given pixel into a suitable color space and then uses a skin classifier to mark the pixel as skin or non-skin. A skin classifier defines the decision boundary of the skin-color class in that color space, based on skin-colored pixels. The purpose of this research is to build a skin detection system that distinguishes between skin and non-skin pixels in colored still pictures. This is performed by introducing a metric that measures …
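A classic explicit example of such a decision boundary is the fixed RGB rule of Peer et al.; the sketch below shows that rule as an illustration of a color-space skin classifier, not the metric this paper introduces:

```python
def is_skin(r, g, b):
    """Explicit RGB skin rule (Peer et al.): fixed thresholds carve out
    a skin-colored region of RGB space, with no training needed."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15
            and abs(r - g) > 15 and r > g and r > b)
```

Rule-based classifiers like this are fast but brittle under lighting changes, which is why learned boundaries in other color spaces are a common alternative.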

Publication Date
Sun Jun 01 2008
Journal Name
Baghdad Science Journal
Tamper Detection in Text Document

Although authentication of text document images is difficult, owing to their binary nature and the clear separation between background and foreground, it is in growing demand for many applications. Most previous research in this field depends on inserting a watermark into the document; the drawback of these techniques lies in the fact that changing pixel values in a binary document can introduce irregularities that are very visually noticeable. In this paper, a new method is proposed for object-based text document authentication, in which a text document is signed by shifting individual words slightly left or right from their original positions so that the center of gravity of each line falls in with the m…
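A toy version of the word-shift idea (positions are in hypothetical pixel units; the paper's actual center-of-gravity rule is truncated above): nudge a line's words so that the line's center of gravity carries one bit of the signature.

```python
def embed_bit(positions, bit):
    """Shift every word in the line one unit right, if needed, so the
    integer part of the center of gravity has the desired parity."""
    cog = sum(positions) / len(positions)
    if int(cog) % 2 != bit:
        positions = [p + 1 for p in positions]   # subtle whole-line shift
    return positions

def read_bit(positions):
    """Recover the embedded bit from the center of gravity's parity."""
    return int(sum(positions) / len(positions)) % 2
```

Because only word positions change, no pixel values are altered inside the glyphs, which avoids the visible irregularities that pixel-flipping watermarks introduce in binary images.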

Publication Date
Wed Jan 01 2025
Journal Name
Journal Of Computer Sciences And Informatics
Edge Detection Methods: A Review

This article presents a comprehensive survey of edge detection methods and algorithms for digital images, a basic process in the field of image processing and analysis. The purpose of edge detection is to discover the borders that separate the different areas of an image, which contributes to a better understanding of image content and the extraction of structural information. The article starts by clarifying the idea of an edge and its importance in image analysis, then surveys the most prominent edge detection methods used in this field (e.g., the Sobel, Prewitt, and Canny filters), besides other schemes based on detecting abrupt changes in light intensity and color gradation. The research also discusses …
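Of the filters the survey names, Sobel is the simplest to show; a minimal gradient-magnitude sketch on a grayscale grid (lists of lists, interior pixels only):

```python
def sobel_magnitude(img):
    """Sobel gradient magnitude for a 2-D grayscale grid; border pixels
    are left at zero since the 3x3 kernels need all eight neighbours."""
    gx_k = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient kernel
    gy_k = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient kernel
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(gx_k[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(gy_k[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out
```

The magnitude peaks exactly where intensity changes abruptly, which is the "unexpected modification in light intensity" criterion the survey describes; Canny adds smoothing, non-maximum suppression, and hysteresis thresholding on top of a gradient like this.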

Publication Date
Thu Nov 17 2022
Journal Name
Journal Of Information And Optimization Sciences
Hybrid deep learning model for Arabic text classification based on mutual information
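No abstract is shown for this entry, but the mutual-information scoring named in the title can be sketched for a discrete feature and class label; a model could keep only the highest-scoring terms before classification (toy values, base-2 logarithm):

```python
import math

def mutual_information(xs, ys):
    """MI between a discrete feature and a class label: the feature-
    selection score a hybrid text classifier could rank terms by."""
    n = len(xs)
    mi = 0.0
    for x in set(xs):
        for y in set(ys):
            pxy = sum(1 for a, b in zip(xs, ys) if a == x and b == y) / n
            if pxy == 0:
                continue                       # 0 * log 0 contributes nothing
            px, py = xs.count(x) / n, ys.count(y) / n
            mi += pxy * math.log2(pxy / (px * py))
    return mi
```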
