Advanced Intelligent Data Hiding Using Video Stego and Convolutional Neural Networks

Steganography is the technique of concealing secret data within other everyday files of the same or a different type. Hiding data has become essential to digital information security. This work aims to design a stego method that can effectively hide a message inside the frames of a video file. A video steganography model is proposed in which a convolutional neural network (CNN) is trained to hide a video (or images) within another video. Using a CNN in this approach achieves the two main goals of any steganographic method. The first is increased security (resistance to detection and breaking by steganalysis programs): because the weights and architecture are randomized, the exact way in which the network hides the information cannot be known to anyone who does not have the weights. The second is increased hiding capacity, achieved by using the CNN to decide which areas of the cover are redundant and can therefore carry more hidden data. In the proposed model, the hiding and revealing networks are trained concurrently and are designed to work as a pair. The model learns image patterns that help it decide which parts of the cover image are redundant, so that more pixels can be hidden there. The CNN was implemented with Keras on a TensorFlow backend, and about 45,000 random 256x256 RGB images from the ImageNet dataset were used for training; the trained model can therefore work on images taken from a wide range of sources. By removing redundant areas of an image, the quantity of hidden data can be raised (improved capacity).

Since the weights and model architecture are randomized, the actual method by which the network hides the data cannot be known to anyone who does not have the weights. Furthermore, additional block shuffling is incorporated as an encryption step to improve security, and image enhancement methods are used to improve output quality. The results show that the proposed method achieves a high security level and a high embedding capacity. They also show that the system performs well under visual inspection and attacks, successfully deceiving both human observers and steganalysis programs.
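As a rough illustration of the block-shuffling idea mentioned above (not the paper's actual implementation, whose details are not given here), a seeded permutation of pixel blocks can serve as a lightweight encryption step: without the seed, the original block order cannot be restored. The block size, seed, and flat pixel list are illustrative assumptions.

```python
import random

def shuffle_blocks(pixels, block_size, seed):
    """Split a flat pixel sequence into fixed-size blocks and permute them.

    The seed plays the role of a shared secret: without it, the permutation
    (and hence the original block order) cannot be recovered.
    """
    blocks = [pixels[i:i + block_size] for i in range(0, len(pixels), block_size)]
    order = list(range(len(blocks)))
    random.Random(seed).shuffle(order)          # keyed, repeatable permutation
    return [p for i in order for p in blocks[i]]

def unshuffle_blocks(pixels, block_size, seed):
    """Invert shuffle_blocks by rebuilding the same keyed permutation."""
    blocks = [pixels[i:i + block_size] for i in range(0, len(pixels), block_size)]
    order = list(range(len(blocks)))
    random.Random(seed).shuffle(order)
    restored = [None] * len(blocks)
    for dst, src in enumerate(order):           # block at position dst came from src
        restored[src] = blocks[dst]
    return [p for b in restored for p in b]
```

In a real pipeline the "pixels" would be the stego frame's channel values, and the seed would be exchanged alongside the network weights.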

Publication Date
Wed Feb 06 2013
Journal Name
Eng. & Tech. Journal
A proposal to detect computer worms (malicious codes) using data mining classification algorithms

Malicious software (malware) performs malicious functions that compromise a computer system's security. Many methods have been developed to improve the security of computer system resources, among them firewalls, encryption, and Intrusion Detection Systems (IDS). An IDS can detect newly unrecognized attack attempts and raise an early alarm to inform the system about the suspicious intrusion attempt. This paper proposes a hybrid IDS for intrusion detection, especially of malware, that considers both network packet and host features. The hybrid IDS is designed using Data Mining (DM) classification methods, chosen for their ability to detect new, previously unseen intrusions accurately and automatically. It uses both anomaly and misuse detection

... Show More
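To make the misuse/anomaly distinction concrete, here is a deliberately minimal sketch (not the paper's classifier; the feature names, signature set, and threshold are invented for illustration): misuse detection matches known signatures, while anomaly detection flags deviation from a learned baseline.

```python
def hybrid_ids(packet, signatures, baseline_mean, threshold):
    """Toy hybrid check: misuse detection by signature match,
    anomaly detection by deviation from a baseline packet size."""
    if packet["payload"] in signatures:                   # misuse: known bad pattern
        return "misuse"
    if abs(packet["size"] - baseline_mean) > threshold:   # anomaly: unusual behavior
        return "anomaly"
    return "normal"
```

A DM classifier replaces the hand-set threshold by rules learned from labeled traffic, which is what gives the hybrid design its ability to catch previously unseen intrusions.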
Publication Date
Sun Jan 01 2017
Journal Name
Iraqi Journal Of Science
Strong Triple Data Encryption Standard Algorithm using Nth Degree Truncated Polynomial Ring Unit

Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important component, of secret-key algorithms is the key itself: for a higher level of secure communication, the key plays an essential role, and both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weakened by its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong. An enhanced encryption key strengthens the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms to

... Show More
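For context, Triple DES applies an encrypt-decrypt-encrypt (EDE) chain with three keys, and its known weakness is that poor key choices (e.g. two equal keys) collapse the chain toward single encryption. The sketch below uses a toy XOR "cipher" purely to show the key ordering; it is not remotely secure, and the real DES round function is not modeled.

```python
def toy_encrypt(block, key):
    return block ^ key  # stand-in for a real block cipher; XOR only, for illustration

def toy_decrypt(block, key):
    return block ^ key  # XOR is its own inverse

def triple_ede_encrypt(block, k1, k2, k3):
    """Encrypt-Decrypt-Encrypt chain, the structure used by Triple DES."""
    return toy_encrypt(toy_decrypt(toy_encrypt(block, k1), k2), k3)

def triple_ede_decrypt(block, k1, k2, k3):
    """Inverse chain: decrypt-encrypt-decrypt with keys in reverse order."""
    return toy_decrypt(toy_encrypt(toy_decrypt(block, k3), k2), k1)
```

Note how setting k1 == k2 cancels the first two stages, leaving only a single encryption under k3: this is the kind of key-generation weakness that motivates strengthening the key schedule.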
Publication Date
Sun May 11 2025
Journal Name
Iraqi Statisticians Journal
Estimating General Linear Regression Model of Big Data by Using Multiple Test Technique

Publication Date
Sat Aug 01 2015
Journal Name
Journal Of Engineering
Analytical Approach for Load Capacity of Large Diameter Bored Piles Using Field Data

An analytical approach based on field data was used to determine the strength capacity of large-diameter bored piles. The deformations and settlements were also evaluated for both vertical and lateral loadings. The analytical predictions were compared to field data obtained from a prototype test pile used at the Tharthar-Tigris canal bridge and were found to be in acceptable agreement, with a deviation of 12%.

Following ASTM standard D1143M-07e1 (2010), a test schedule of five loading cycles was proposed for vertical loads, together with a series of cyclic loads to simulate horizontal loading. The load test results and analytical data of 1.95

... Show More
Publication Date
Mon May 15 2017
Journal Name
Journal Of Theoretical And Applied Information Technology
Anomaly detection in text data that represented as a graph using dbscan algorithm

Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data into a graph concept frame (CFG). As is well known, the DBSCAN method groups data points that belong to the same cluster, while points that fall outside the behavior of any cluster are considered noise or anomalies. DBSCAN can thus detect abnormal points that lie beyond a set distance threshold (extreme values). However, anomalies are not only those cases that are unusual or far from a specific group; there is also a type of data that does not occur repeatedly but is considered abnormal with respect to the known groups. The analysis showed DBSCAN using the

... Show More
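A minimal pure-Python DBSCAN sketch (illustrative only; the eps, min_pts, and sample points are invented) shows the behavior the abstract discusses: points that cannot be attached to any dense cluster are labeled -1, i.e. treated as noise/anomalies.

```python
def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns one label per point; -1 marks noise (anomalies)."""
    def neighbors(i):
        return [j for j, q in enumerate(points)
                if sum((a - b) ** 2 for a, b in zip(points[i], q)) <= eps ** 2]

    labels = [None] * len(points)       # None = not yet visited
    cluster = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1              # sparse neighborhood: candidate anomaly
            continue
        labels[i] = cluster             # core point starts a new cluster
        queue = list(seeds)
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster     # border point reclaimed from noise
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nb = neighbors(j)
            if len(nb) >= min_pts:      # expand only from core points
                queue.extend(nb)
        cluster += 1
    return labels
```

The abstract's point is that this distance-based notion of "anomaly" misses rare-but-near cases, which motivates the graph-based strengthening.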
Publication Date
Tue Jan 01 2019
Journal Name
Journal Of Southwest Jiaotong University
Recognizing Job Apathy Patterns of Iraqi Higher Education Employees Using Data Mining Techniques

Psychological research centers indirectly help connect professionals from the fields of human life, the job environment, family life, and the psychological infrastructure for psychiatric patients. This research aims to detect job apathy patterns from the behavior of employee groups at the University of Baghdad and the Iraqi Ministry of Higher Education and Scientific Research. This investigation presents an approach using data mining techniques to acquire new knowledge; it differs from statistical studies in supporting the researchers' evolving needs. These techniques handle redundant or irrelevant attributes to discover interesting patterns. The principal issue is identifying several important and affective questions taken from

... Show More
Publication Date
Sun Mar 01 2015
Journal Name
Journal Of Engineering
Multi-Sites Multi-Variables Forecasting Model for Hydrological Data using Genetic Algorithm Modeling

A two-time-step stochastic multi-variable, multi-site hydrological data forecasting model was developed and verified using a case study. The philosophy of this model is to use the cross-variable correlations, cross-site correlations, and two-step time-lag correlations simultaneously to estimate the parameters of the model, which are then modified using the mutation process of a genetic algorithm optimization model. The objective function to be minimized is the Akaike test value (AIC). The case study covers four variables and three sites. The variables are the monthly air temperature, humidity, precipitation, and evaporation; the sites are Sulaimania, Chwarta, and Penjwin, located in northern Iraq. The model performance was

... Show More
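The two ingredients named above, an AIC objective and genetic-algorithm mutation, can be sketched as follows. This is a schematic under assumed conventions (the least-squares form of AIC and a Gaussian perturbation mutation); the paper's exact formulation is not reproduced here.

```python
import math
import random

def aic(rss, n, k):
    """Akaike Information Criterion for a least-squares model with
    residual sum of squares rss, n observations, and k parameters:
    AIC = n * ln(RSS / n) + 2k."""
    return n * math.log(rss / n) + 2 * k

def mutate(params, rate, scale, rng):
    """GA-style mutation: perturb each parameter with probability `rate`
    by Gaussian noise of standard deviation `scale`."""
    return [p + rng.gauss(0, scale) if rng.random() < rate else p
            for p in params]
```

In the model described, candidate parameter vectors would be mutated and kept only when the resulting forecast lowers the AIC.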
Publication Date
Sun Apr 30 2023
Journal Name
Iraqi Geological Journal
Evaluating Machine Learning Techniques for Carbonate Formation Permeability Prediction Using Well Log Data

Machine learning offers significant advantages for many difficulties in the oil and gas industry, especially in resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. This study presents a clear workflow methodology alongside comprehensive models. Its purpose is to provide a more robust technique for predicting permeability; previous studies on the Bazirgan field have attempted to do so, but their estimates have been vague, and the methods they give are obsolete and do not adequately address the real complexity of the permeability computation. To

... Show More
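As a baseline against which ML methods are typically compared, a single-predictor least-squares fit (e.g. permeability versus porosity; the pairing is illustrative, and real permeability prediction from well logs is strongly nonlinear) can be written as:

```python
def fit_line(x, y):
    """Ordinary least squares for y = a + b*x (single predictor)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # slope = covariance(x, y) / variance(x)
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b
```

The point of the ML techniques evaluated in the study is precisely to capture the nonlinear log-to-permeability relationships that such a linear baseline cannot.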
Publication Date
Tue Jun 30 2020
Journal Name
Iraqi Journal Of Chemical And Petroleum Engineering
Using Artificial Neural Network to Predict Rate of Penetration from Dynamic Elastic Properties in Nasiriya Oil Field

The time spent drilling ahead is usually a significant portion of the total well cost. Drilling is an expensive operation, including the cost of the equipment and material used during the penetration of rock, plus the crew's efforts to finish the well without serious problems. Knowing the rate of penetration helps in estimating the cost and in optimizing drilling expenditure. Ten wells in the Nasiriya oil field were selected based on the availability of data. The dynamic elastic properties of the Mishrif formation in the selected wells were determined using Interactive Petrophysics (IP V3.5) software, based on the LAS files and log records provided. The average rate of penetration and average dynamic elastic properties

... Show More
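The kind of network described, mapping elastic-property inputs to a predicted rate of penetration, reduces at inference time to a forward pass like the sketch below. The weights, layer sizes, and sigmoid activation are illustrative assumptions, not the study's trained network.

```python
import math

def forward(inputs, w_hidden, b_hidden, w_out, b_out):
    """Forward pass of a one-hidden-layer network with sigmoid activations."""
    sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))
    # hidden layer: one weighted sum + activation per hidden neuron
    hidden = [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
              for ws, b in zip(w_hidden, b_hidden)]
    # output neuron: weighted sum of hidden activations
    return sigmoid(sum(w * h for w, h in zip(w_out, hidden)) + b_out)
```

Training would fit the weights to minimize prediction error against the measured ROP values; a real model would also rescale the sigmoid output back to physical ROP units.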
Publication Date
Thu Dec 01 2022
Journal Name
Baghdad Science Journal
Comparison between RSA and CAST-128 with Adaptive Key for Video Frames Encryption with Highest Average Entropy

Encryption of data is the translation of data into another form or symbol so that only people with access to the secret key or a password can read it. Encrypted data are generally referred to as cipher text, while unencrypted data are known as plain text. Entropy can be used as a measure of the number of bits needed to code the data of an image: as the pixel values within an image are distributed over more gray levels, the entropy increases. The aim of this research is to compare the CAST-128 method with a proposed adaptive key against the RSA encryption method for video frames, to determine the more accurate method with the highest entropy. The first method is achieved by applying "CAST-128" and

... Show More
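The entropy measure used for the comparison is Shannon entropy over the pixel-value histogram. A minimal sketch (frame pixels flattened to a list is an assumption of this example):

```python
import math
from collections import Counter

def shannon_entropy(pixels):
    """Average number of bits per symbol needed to code the pixel values."""
    counts = Counter(pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

A constant frame has entropy 0, while a well-encrypted frame approaches the maximum (8 bits per channel for 256 gray levels), which is why higher entropy indicates stronger encryption here.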