Steganography and Cryptography Techniques Based Secure Data Transferring Through Public Network Channel

Attacks on data transferred over a network happen millions of times a day. To address this problem, a secure scheme is proposed for protecting data transferred over a network. The scheme uses two techniques to guarantee secure transfer of a message: the message is first encrypted and then hidden in a video cover. The RC4 stream cipher algorithm is used for encryption to increase the message's confidentiality, and the least significant bit (LSB) embedding algorithm is improved by adding an additional layer of security. The improvement replaces the usual sequential selection of frames and pixels with random selection driven by two secret random keys. The hidden message therefore remains protected even if the stego-object is intercepted, because an attacker cannot determine which frames and pixels hold each bit of the secret message and so cannot rebuild it. The results show that the proposed scheme performs well on the evaluation metrics used for this purpose when compared with a large number of related previous methods.
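A minimal sketch of the two layers described in this abstract, assuming Python with NumPy: the message is first encrypted with an RC4 keystream, and the ciphertext bits are then written into the least significant bits of pixels chosen by a keyed random permutation rather than sequentially. The single greyscale frame, key values, and seed are illustrative stand-ins, not the paper's actual parameters.

```python
# Sketch only: RC4 encryption followed by LSB embedding at key-driven random
# pixel positions (frame selection is simplified to one greyscale frame).
import numpy as np

def rc4_keystream(key: bytes, length: int) -> bytes:
    S = list(range(256))
    j = 0
    for i in range(256):                      # key-scheduling algorithm (KSA)
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    out, i, j = bytearray(), 0, 0
    for _ in range(length):                   # pseudo-random generation (PRGA)
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return bytes(out)

def embed(frame: np.ndarray, message: bytes, rc4_key: bytes, position_seed: int) -> np.ndarray:
    cipher = bytes(m ^ k for m, k in zip(message, rc4_keystream(rc4_key, len(message))))
    bits = np.unpackbits(np.frombuffer(cipher, dtype=np.uint8))
    flat = frame.flatten()
    rng = np.random.default_rng(position_seed)          # secret positional key
    positions = rng.permutation(flat.size)[:bits.size]  # random, non-sequential pixels
    flat[positions] = (flat[positions] & 0xFE) | bits   # overwrite the least significant bit
    return flat.reshape(frame.shape)

frame = np.random.randint(0, 256, (120, 160), dtype=np.uint8)  # stand-in video frame
stego = embed(frame, b"secret message", b"rc4-key", position_seed=2020)
```

Extraction would reverse the steps: regenerate the same permutation from the positional key, read the selected LSBs, and decrypt with the same RC4 key.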

Publication Date
Sat Aug 01 2020
Journal Name
Journal Of Engineering Science And Technology (JESTEC)
Influence of A River Water Quality on The Efficiency of Water Treatment Using Artificial Neural Network

The Tigris River is the lifeline that supplies a great part of Iraq with water from north to south. Throughout its entire length, the river is battered by various pollutants such as wastewater effluents from municipal, industrial, and agricultural activities, among others. Hence, water quality assessment of the Tigris River is crucial to ensure that appropriate and adequate measures are taken to save the river from as much pollution as possible. In this study, six water treatment plants (WTPs) situated on the two banks of the Tigris within Baghdad City, namely Al Karkh, Sharq Dijla, Al Wathba, Al Karama, Al Doura, and Al Wahda from northern Baghdad to its south, were selected to determine the removal efficiency of turbidity and …

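As a hedged illustration of the modelling approach named in the title, the sketch below fits a small feed-forward neural network (scikit-learn's MLPRegressor, assumed available) relating raw-water quality indicators to a plant's turbidity-removal efficiency. The feature set and the synthetic training data are assumptions for demonstration only, not the study's measurements.

```python
# Illustrative ANN regression of removal efficiency on raw-water quality.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform([5, 7.0, 200], [800, 8.5, 600], size=(300, 3))  # turbidity (NTU), pH, TDS (mg/L)
y = 99 - 0.01 * X[:, 0] + rng.normal(0, 0.5, 300)               # synthetic removal efficiency (%)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0))
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))
```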
Publication Date
Fri Jan 01 2021
Journal Name
Review Of International Geographical Education
Evaluating the performance of project management using network diagrams methods: A case study in the Ramadi Municipality

This study was motivated by the fact that some project administrations still do not follow the appropriate scientific methods that would enable them to perform their work in a way that achieves the goals for which those projects are established, in addition to exceeding planned times and costs. The study therefore aims to apply network diagram methods in planning, scheduling, and monitoring the construction of the Alzeuot intersection bridge in the city of Ramadi, taken as the research sample because it is one of the strategic projects being implemented in the city and one of the projects that faced several problems during its implementation. The project problem was studied according to scientific methods through the application of …

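The network-diagram calculation such a study relies on can be illustrated with a short critical path method (CPM) sketch: a forward pass gives earliest start and finish times, a backward pass gives latest times, and activities with zero total float form the critical path. The activities and durations below are hypothetical, not the bridge project's actual schedule.

```python
# Hypothetical CPM example: forward pass, backward pass, critical path.
activities = {            # activity: (duration in days, predecessors), listed in topological order
    "A": (10, []), "B": (15, ["A"]), "C": (12, ["A"]),
    "D": (8, ["B", "C"]), "E": (20, ["D"]),
}

es, ef = {}, {}
for act in activities:                                    # forward pass
    dur, preds = activities[act]
    es[act] = max((ef[p] for p in preds), default=0)
    ef[act] = es[act] + dur

project_duration = max(ef.values())
lf, ls = {}, {}
for act in reversed(list(activities)):                    # backward pass
    dur, _ = activities[act]
    successors = [a for a, (_, p) in activities.items() if act in p]
    lf[act] = min((ls[s] for s in successors), default=project_duration)
    ls[act] = lf[act] - dur

critical = [a for a in activities if ls[a] - es[a] == 0]  # zero total float
print("Project duration:", project_duration, "days; critical path:", critical)
```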
Publication Date
Sat Aug 01 2015
Journal Name
Journal Of Engineering
Analytical Approach for Load Capacity of Large Diameter Bored Piles Using Field Data

An analytical approach based on field data was used to determine the strength capacity of large-diameter bored piles. Deformations and settlements were also evaluated for both vertical and lateral loadings. The analytical predictions were compared with field data obtained from a prototype test pile used at the Tharthar-Tigris Canal Bridge and were found to be in acceptable agreement, with a deviation of 12%.

Following ASTM standard D1143M-07e1 (2010), a test schedule of five loading cycles was proposed for the vertical loads, together with a series of cyclic loads to simulate horizontal loading. The load test results and analytical data of 1.95 …

Publication Date
Sun Mar 01 2015
Journal Name
Journal Of Engineering
Multi-Sites Multi-Variables Forecasting Model for Hydrological Data using Genetic Algorithm Modeling

A two-time-step stochastic multi-variable, multi-site hydrological data forecasting model was developed and verified using a case study. The philosophy of this model is to use the cross-variable correlations, cross-site correlations, and two-step time-lag correlations simultaneously to estimate the model parameters, which are then modified using the mutation process of a genetic algorithm optimization model. The objective function to be minimized is the Akaike test value. The case study covers four variables and three sites. The variables are the monthly air temperature, humidity, precipitation, and evaporation; the sites are Sulaimania, Chwarta, and Penjwin, located in northern Iraq. The model performance was …

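A minimal sketch of the estimation idea under simplifying assumptions (one site, one variable, pure NumPy): a lag-2 autoregressive model whose coefficients are refined by a genetic-algorithm-style mutation loop that keeps the candidate with the lower Akaike Information Criterion (AIC). The synthetic monthly series and the mutation settings are illustrative only.

```python
# GA-style mutation search for two-step-lag AR coefficients, scored by AIC.
import numpy as np

rng = np.random.default_rng(1)
series = np.sin(np.arange(240) * 2 * np.pi / 12) + rng.normal(0, 0.2, 240)  # synthetic monthly data

def aic(params, y):
    a1, a2 = params
    pred = a1 * y[1:-1] + a2 * y[:-2]          # two-step time-lag structure
    resid = y[2:] - pred
    n, k = resid.size, len(params)
    return n * np.log(np.mean(resid ** 2)) + 2 * k

best = rng.normal(0, 0.5, 2)                   # initial parameter "chromosome"
best_score = aic(best, series)
for _ in range(2000):                          # mutation loop
    child = best + rng.normal(0, 0.05, 2)      # mutate the parameters
    score = aic(child, series)
    if score < best_score:                     # keep the fitter candidate
        best, best_score = child, score
print("AR coefficients:", best, "AIC:", best_score)
```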
Publication Date
Tue Dec 01 2015
Journal Name
Journal Of Engineering
Ten Years of OpenStreetMap Project: Have We Addressed Data Quality Appropriately? – Review Paper

It has increasingly been recognised that future developments in geospatial data handling will centre on geospatial data on the web: Volunteered Geographic Information (VGI). The evaluation of VGI data quality, including positional and shape similarity, has become a recurrent subject in the scientific literature over the last ten years. The OpenStreetMap (OSM) project is the most popular of the leading VGI platforms. It is an online geospatial database that produces and supplies free, editable geospatial datasets for worldwide use. The goal of this paper is to present a comprehensive overview of the quality assurance of OSM data. In addition, the credibility of open-source geospatial data is discussed, highlighting …

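One concrete example of the positional quality checks such reviews survey, sketched with hypothetical coordinates: positional accuracy measured as the RMSE of offsets between matched OSM features and an authoritative reference dataset.

```python
# Positional accuracy of OSM points against reference points (hypothetical values).
import numpy as np

osm_points = np.array([[445123.2, 3685210.4], [445201.7, 3685302.9], [445330.1, 3685398.6]])
ref_points = np.array([[445121.0, 3685212.1], [445203.5, 3685300.2], [445327.8, 3685401.0]])

offsets = np.linalg.norm(osm_points - ref_points, axis=1)   # per-feature positional error (m)
rmse = np.sqrt(np.mean(offsets ** 2))
print("Positional RMSE (m):", round(float(rmse), 2))
```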
Publication Date
Fri Mar 01 2019
Journal Name
Spatial Statistics
Efficient Bayesian modeling of large lattice data using spectral properties of Laplacian matrix

Spatial data observed on a group of areal units are common in scientific applications. The usual hierarchical approach for modeling this kind of dataset is to introduce a spatial random effect with an autoregressive prior. However, the usual Markov chain Monte Carlo scheme for this hierarchical framework requires the spatial effects to be sampled from their full conditional posteriors one by one, resulting in poor mixing. More importantly, it makes the model computationally inefficient for datasets with a large number of units. In this article, we propose a Bayesian approach that uses the spectral structure of the adjacency to construct a low-rank expansion for modeling spatial dependence. We propose a pair of computationally efficient estimation …

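A minimal sketch of the low-rank construction described above, assuming NumPy and a toy lattice: build the adjacency of the areal units, form the graph Laplacian, and use a handful of its smoothest eigenvectors as a reduced basis for the spatial effect. The grid size, rank, and the simple least-squares projection are illustrative choices, not the article's estimator.

```python
# Low-rank spatial basis from the eigenvectors of a lattice graph Laplacian.
import numpy as np

n_side = 10                                             # 10 x 10 lattice of areal units
n = n_side * n_side
A = np.zeros((n, n))
for i in range(n_side):                                 # rook-neighbour adjacency
    for j in range(n_side):
        k = i * n_side + j
        if j + 1 < n_side: A[k, k + 1] = A[k + 1, k] = 1
        if i + 1 < n_side: A[k, k + n_side] = A[k + n_side, k] = 1

L = np.diag(A.sum(axis=1)) - A                          # graph Laplacian
eigvals, eigvecs = np.linalg.eigh(L)
basis = eigvecs[:, 1:11]                                # ten smoothest non-constant modes

rng = np.random.default_rng(0)
y = basis @ rng.normal(0, 2, 10) + rng.normal(0, 0.5, n)  # synthetic spatial response
coef, *_ = np.linalg.lstsq(basis, y, rcond=None)          # project data onto the reduced basis
print("Reduced spatial effect dimension:", basis.shape[1])
```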
Publication Date
Sun May 11 2025
Journal Name
Iraqi Statisticians Journal
Estimating General Linear Regression Model of Big Data by Using Multiple Test Technique
Publication Date
Sat Dec 30 2023
Journal Name
Journal Of Economics And Administrative Sciences
The Cluster Analysis by Using Nonparametric Cubic B-Spline Modeling for Longitudinal Data

Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.

In this research, the focus was on grouping and analyzing these data, as cluster analysis plays an important role in identifying and grouping co-expressed sub-profiles over time. The profiles are modeled with a nonparametric smoothing cubic B-spline, which provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope; it is also more flexible and can capture more complex patterns and fluctuations in the data.

The balanced longitudinal data profiles were compiled into subgroup…

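A hedged sketch of the workflow, assuming SciPy and scikit-learn: each subject's balanced longitudinal profile is smoothed with a cubic spline, and subjects are then clustered on their smoothed trajectories. The synthetic profiles, smoothing level, and number of clusters are assumptions, not the paper's settings.

```python
# Smooth each longitudinal profile with a cubic spline, then cluster subjects.
import numpy as np
from scipy.interpolate import UnivariateSpline
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
times = np.arange(12)                                        # balanced design: 12 time points
profiles = np.vstack([                                       # 20 rising and 20 falling subjects
    np.array([0.3 * times + rng.normal(0, 0.4, 12) for _ in range(20)]),
    np.array([4 - 0.3 * times + rng.normal(0, 0.4, 12) for _ in range(20)]),
])

smoothed = np.array([
    UnivariateSpline(times, y, k=3, s=1.0)(times)            # cubic smoothing spline per subject
    for y in profiles
])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(smoothed)
print("Cluster sizes:", np.bincount(labels))
```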
Publication Date
Wed Feb 06 2013
Journal Name
Eng. & Tech. Journal
A proposal to detect computer worms (malicious codes) using data mining classification algorithms

Malicious software (malware) performs malicious functions that compromise a computer system's security. Many methods have been developed to improve the security of computer system resources, among them firewalls, encryption, and Intrusion Detection Systems (IDS). An IDS can detect new, previously unrecognized attack attempts and raise an early alarm to inform the system about the suspicious intrusion attempt. This paper proposes a hybrid IDS for intrusion detection, especially of malware, that considers both network-packet and host features. The hybrid IDS is designed using Data Mining (DM) classification methods, chosen for their ability to detect new, previously unseen intrusions accurately and automatically. It uses both anomaly and misuse dete…

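As an illustration of the data-mining classification step such a hybrid IDS relies on, the sketch below trains a decision tree (one common DM classifier, assumed here) on combined network-packet and host features to flag worm-like activity. The feature names, synthetic data, and labelling rule are assumptions for demonstration only.

```python
# Decision-tree classification of combined packet and host features.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
# columns: packet rate (pkt/s), mean payload size (bytes), distinct ports contacted, host CPU %
X = rng.uniform([1, 40, 1, 2], [2000, 1400, 60, 95], size=(500, 4))
y = ((X[:, 0] > 1200) & (X[:, 2] > 30)).astype(int)          # synthetic "worm-like" label rule

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)
clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```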
Publication Date
Sun Jan 01 2017
Journal Name
Iraqi Journal Of Science
Strong Triple Data Encryption Standard Algorithm using Nth Degree Truncated Polynomial Ring Unit

Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important element, of secret-key cryptography is the key itself: for a higher level of secure communication, the key plays an essential role. To increase the level of security in any communication, both parties must have a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong. A stronger encryption key enhances the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms to …

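A minimal sketch of the Triple DES half of such a combination, assuming the pycryptodome package: a 24-byte 3DES key is derived from a shared secret (a placeholder here for the value the second algorithm, e.g. the NTRU step, would provide) and used for CBC-mode encryption. The key-derivation choice and message are illustrative, not the paper's construction.

```python
# Derive a strengthened 3DES key from a shared secret, then encrypt/decrypt.
import hashlib
from Crypto.Cipher import DES3
from Crypto.Util.Padding import pad, unpad

shared_secret = b"stand-in for an NTRU-agreed secret"       # assumption: produced by the key-exchange step
key = DES3.adjust_key_parity(hashlib.sha256(shared_secret).digest()[:24])  # derived 24-byte 3DES key

cipher = DES3.new(key, DES3.MODE_CBC)                        # fresh random IV
ciphertext = cipher.encrypt(pad(b"confidential message", DES3.block_size))

decrypter = DES3.new(key, DES3.MODE_CBC, iv=cipher.iv)
plaintext = unpad(decrypter.decrypt(ciphertext), DES3.block_size)
print(plaintext)
```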