Advanced Intelligent Data Hiding Using Video Stego and Convolutional Neural Networks

Steganography is the technique of concealing secret data within other quotidian files of the same or a different type, and hiding data has become essential to digital information security. This work aims to design a stego method that can effectively hide a message inside the frames of a video file. A video steganography model is proposed in which a convolutional neural network (CNN) is trained to hide a video (or images) within another video. Using a CNN serves the two main goals of any steganographic method. The first is increased security (resistance to detection and breaking by steganalysis programs): because the weights and architecture of the network are randomized, the exact way in which the network hides the information cannot be known to anyone who does not have the weights. The second is increased hiding capacity: the CNN learns to identify the redundant areas of the cover image, and more pixels are hidden there, so removing redundancy gains extra room for the payload. In the proposed model, the hiding and revealing networks are trained concurrently and are designed to work as a pair. The CNN was implemented with Keras on a TensorFlow backend and trained on about 45,000 random RGB images of size 256x256 from the ImageNet dataset, so the model can work on images taken from a wide range of sources.
Furthermore, additional block shuffling is incorporated as an encryption step to improve security, and image enhancement methods are used to improve the output quality. The results show that the proposed method achieves a high security level and a high embedding capacity. They also confirm that the system performs well in terms of visibility and resistance to attacks: the proposed method successfully deceives both the observer and the steganalysis program.
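The hide/reveal pairing keyed by randomized weights can be illustrated with a toy linear analogue (not the paper's CNN): the secret is scrambled by a keyed permutation, blended faintly into the cover, and recovered only by a party holding the same key. Unlike the trained CNN decoder, this sketch also needs the cover at reveal time; the function names and the `alpha` blending factor are illustrative assumptions.

```python
import numpy as np

def hide(cover, secret, key_seed, alpha=0.1):
    """Scramble the secret with a keyed permutation and blend it into the cover.

    The permutation seed plays the role of the randomized network weights:
    without it, the embedding cannot be inverted.
    """
    perm = np.random.default_rng(key_seed).permutation(secret.size)
    scrambled = secret.ravel()[perm].reshape(secret.shape)
    return cover + alpha * scrambled

def reveal(stego, cover, key_seed, alpha=0.1):
    """Recover the secret; only works with the same key_seed used to hide."""
    perm = np.random.default_rng(key_seed).permutation(stego.size)
    inverse = np.argsort(perm)          # undo the keyed scrambling
    scrambled = (stego - cover) / alpha
    return scrambled.ravel()[inverse].reshape(stego.shape)
```

A holder of `key_seed` recovers the secret exactly (up to floating-point error); a wrong seed yields a meaningless permuted image, mirroring the claim that the hiding scheme is unknowable without the weights.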

Publication Date
Sat Dec 30 2023
Journal Name
Journal Of Economics And Administrative Sciences
The Cluster Analysis by Using Nonparametric Cubic B-Spline Modeling for Longitudinal Data

Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.

In this research, the focus was on grouping and analyzing such data. Cluster analysis plays an important role in identifying and grouping sub-profiles that are co-expressed over time, which are then fitted with a nonparametric smoothing cubic B-spline model. This model provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope; it is also more flexible and can pick up more complex patterns and fluctuations in the data.

The longitudinal balanced data profile was compiled into subgroup

... Show More
Publication Date
Thu Dec 15 2022
Journal Name
Journal Of Petroleum Research And Studies
Selection of an Optimum Drilling Fluid Model to Enhance Mud Hydraulic System Using Neural Networks in Iraqi Oil Field

In drilling processes, the rheological properties indicate the nature of the flow and the composition of the drilling mud. Drilling mud performance can be assessed for solving problems of hole cleaning, fluid management, and hydraulics control. The rheology factors are typically expressed through the following parameters: yield point (YP) and plastic viscosity (μp). The YP/μp ratio is used as a measure of flow levelling: high YP/μp ratios favor the transport of well cuttings through laminar flow, and adequate values of YP/μp lie between 0 and 1 for the rheological models used in drilling. This is what appeared in most of the models used in this study. The pressure loss

... Show More
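The YP/μp ratio can be computed from standard Fann viscometer dial readings under the Bingham plastic model; the readings below are illustrative values, not field data from the study.

```python
def plastic_viscosity(theta600, theta300):
    """Plastic viscosity (cP) from the 600- and 300-rpm dial readings."""
    return theta600 - theta300

def yield_point(theta600, theta300):
    """Yield point (lb/100 ft^2) under the Bingham plastic model."""
    return theta300 - plastic_viscosity(theta600, theta300)

def yp_pv_ratio(theta600, theta300):
    """YP/PV ratio; values between 0 and 1 are considered adequate here."""
    return yield_point(theta600, theta300) / plastic_viscosity(theta600, theta300)

# Illustrative readings: theta600 = 65, theta300 = 40
# -> PV = 25 cP, YP = 15 lb/100 ft^2, YP/PV = 0.6 (within the 0-1 range)
```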
Publication Date
Mon Nov 11 2019
Journal Name
Spe
Modeling Rate of Penetration using Artificial Intelligent System and Multiple Regression Analysis
Abstract<p>Over the years, the prediction of the rate of penetration (ROP) has played a key role for drilling engineers because of its effect on the optimization of various parameters related to substantial cost savings. Many researchers have continually worked to optimize the penetration rate. A major issue with most published studies is that no simple model is currently available to guarantee ROP prediction.</p><p>The main objective of this study is to further improve ROP prediction using two predictive methods, multiple regression analysis (MRA) and artificial neural networks (ANNs). A field case in SE Iraq was conducted to predict the ROP from a large number of parame</p> ... Show More
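The multiple regression analysis (MRA) side of such a study can be sketched as an ordinary least-squares fit; the three predictors and their coefficients below are synthetic stand-ins for real drilling parameters, not the field data from SE Iraq.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical predictors (e.g., weight on bit, rotary speed, flow rate),
# scaled to [0, 1] for illustration.
X = rng.uniform(0.0, 1.0, size=(200, 3))
true_coefs = np.array([2.0, -1.0, 0.5])
rop = 3.0 + X @ true_coefs + rng.normal(0.0, 0.01, size=200)  # synthetic ROP

A = np.column_stack([np.ones(len(X)), X])        # prepend an intercept column
coefs, *_ = np.linalg.lstsq(A, rop, rcond=None)  # [intercept, b1, b2, b3]
```

With low noise, the fitted coefficients recover the generating values closely, which is the baseline an ANN model would then try to beat.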
Publication Date
Mon Aug 01 2016
Journal Name
Journal Of Economics And Administrative Sciences
User (K-Means) for clustering in Data Mining with application


Great scientific progress has led to the widespread accumulation of information in large databases. It is important to revise and compile this vast amount of data in order to extract hidden information, or to classify the data according to the relations among them, so that they can be exploited for technical purposes.

Data mining (DM) is appropriate in this area; hence this research applies the K-Means algorithm for clustering data, where the effect on the variables can be observed by changing the sample size (n) and the number of clusters (K).

... Show More
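The K-Means procedure referred to above can be sketched in a few lines (Lloyd's algorithm); the two-dimensional toy data and the choice K=2 are illustrative.

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Plain Lloyd's algorithm: assign points to the nearest center,
    then move each center to the mean of its points, until stable."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        new_centers = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers
```

Rerunning with different sample sizes n and cluster counts K exposes the sensitivity the research describes.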
Publication Date
Wed Oct 17 2018
Journal Name
Journal Of Economics And Administrative Sciences
New Robust Estimation in Compound Exponential Weibull-Poisson Distribution for both contaminated and non-contaminated Data

Abstract

The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (non-contaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.


... Show More
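The comparison above pits maximum likelihood against the derivative-free Downhill Simplex (Nelder-Mead) search. As a sketch, the same idea is shown on a simple normal model rather than the four-parameter compound distribution: Nelder-Mead minimizes the negative log-likelihood and should land on the analytic MLE.

```python
import numpy as np
from scipy.optimize import minimize

data = np.random.default_rng(5).normal(loc=5.0, scale=2.0, size=500)

def neg_log_likelihood(params):
    """Negative log-likelihood of a normal sample; infinite for sigma <= 0."""
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    return (0.5 * data.size * np.log(2 * np.pi * sigma**2)
            + np.sum((data - mu) ** 2) / (2 * sigma**2))

# Downhill Simplex needs no derivatives, which is why it suits
# likelihoods that are awkward to differentiate.
result = minimize(neg_log_likelihood, x0=[1.0, 1.0], method="Nelder-Mead")
mu_hat, sigma_hat = result.x
```

For the normal model the MLE has a closed form (sample mean and standard deviation), so the simplex result can be checked directly; for the compound distribution only the numerical route is available.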
Publication Date
Sat Mar 26 2022
Journal Name
Journal Of Accounting And Financial Studies ( Jafs )
The Role of Big Data applications in forecasting corporate bankruptcy: Field analysis in the Saudi Business Environment

This study aimed to investigate the role of Big Data in forecasting corporate bankruptcy through a field analysis in the Saudi business environment to test that relationship. The study found that Big Data is a recently adopted variable in the business context with multiple accounting effects and benefits. Among these benefits is forecasting and disclosing corporate financial failures and bankruptcies, which rests on three main elements for reporting and disclosure: the firms' internal control system, external auditing, and financial analysts' forecasts. The study recommends: since the greatest risk of Big Data is the slow adaptation of accountants and auditors to these technologies, wh

... Show More
Publication Date
Wed Jan 01 2020
Journal Name
International Journal Of Computational Intelligence Systems
Evolutionary Feature Optimization for Plant Leaf Disease Detection by Deep Neural Networks

Publication Date
Mon Dec 20 2021
Journal Name
Baghdad Science Journal
Recurrent Stroke Prediction using Machine Learning Algorithms with Clinical Public Datasets: An Empirical Performance Evaluation

Recurrent strokes can be devastating, often resulting in severe disability or death. However, nearly 90% of the causes of recurrent stroke are modifiable, which means recurrent strokes can be averted by controlling risk factors, which are mainly behavioral and metabolic in nature. Previous works thus suggest that a recurrent stroke prediction model could help in minimizing the possibility of a recurrent stroke. Previous works have shown promising results in predicting first-time stroke cases with machine learning approaches. However, there are limited works on recurrent stroke prediction using machine learning methods. Hence, this work is proposed to perform an empirical analysis and to investigate machine learning al

... Show More
Publication Date
Thu Aug 01 2019
Journal Name
The Journal Of Solid Waste Technology And Management
Recycling of Waste Compact Discs in Concrete Mix: Lab Investigations and Artificial Neural Networks Modeling

This study aimed to investigate the incorporation of recycled waste compact discs (WCDs) powder in concrete mixes to replace the fine aggregate by 5%, 10%, 15% and 20%. Compared to the reference concrete mix, results revealed that using WCDs powder in concrete mixes improved the workability and the dry density. The results demonstrated that the compressive, flexural, and split tensile strength values for the WCDs-modified concrete mixes showed a tendency to increase above those of the reference mix. However, at 28 days of curing, the strength values for the WCDs-modified mixes were comparable to those of the reference mix. The leaching test revealed that none of the WCDs constituents was detected in the leachate after 180 days. The

... Show More
Publication Date
Sun Apr 01 2018
Journal Name
International Journal Of Electrical And Computer Engineering (ijece)
Information Hiding using LSB Technique based on Developed PSO Algorithm

<p>Generally, the process of sending secret information via a transmission channel or any carrier medium is not secure. For this reason, information hiding techniques are needed, and steganography must take place before transmission. In this paper, a developed particle swarm optimization algorithm (Dev.-PSO) is used to embed a secret message at optimal positions of the cover image in the spatial domain, based on Least Significant Bit (LSB) substitution. The main aim of the Dev.-PSO algorithm is to determine optimal paths to reach required goals in the specified search space; using the Dev.-PSO algorithm produces the paths of the required goals with most effi

... Show More
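LSB substitution itself is simple; what the paper's Dev.-PSO contributes is the choice of embedding positions. The sketch below hard-codes the positions as a stand-in for the optimizer's output; the function names and toy 8x8 cover are illustrative.

```python
import numpy as np

def embed_lsb(cover, bits, positions):
    """Write one message bit into the least significant bit of each chosen pixel."""
    stego = cover.copy()
    flat = stego.ravel()                          # view: writes reach `stego`
    flat[positions] = (flat[positions] & 0xFE) | bits
    return stego

def extract_lsb(stego, positions):
    """Read the message bits back from the same positions."""
    return stego.ravel()[positions] & 1

cover = np.arange(64, dtype=np.uint8).reshape(8, 8)  # toy grayscale cover
bits = np.array([1, 0, 1, 1, 0], dtype=np.uint8)     # secret message bits
positions = np.array([3, 10, 21, 40, 63])            # stand-in for Dev.-PSO output
stego = embed_lsb(cover, bits, positions)
```

Each embedded bit changes a pixel value by at most 1, which is why LSB embedding is visually imperceptible.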