Art Image Compression Based on Lossless LZW Hashing Ciphering Algorithm
Abstract: Color image compression is an effective way to encode digital images by decreasing the number of bits needed to represent the image. The main objectives are to reduce storage space and transmission cost while maintaining good quality. In this work, a simple and effective methodology is proposed for compressing color art digital images at a low bit rate: the matrix produced by scalar quantization (reducing the pixel depth from 24 to 8 bits) is compressed with displacement coding, and the remainder is then compressed with the Lempel-Ziv-Welch (LZW) algorithm. The proposed methodology preserves the quality of the reconstructed image. Visual and quantitative experimental results on art color images show that the proposed methodology yields reconstructed images with a high PSNR value compared with standard image compression techniques.
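The abstract gives no implementation details; for orientation, a minimal sketch of the classical Lempel-Ziv-Welch coding the method builds on (the paper's scalar-quantization and displacement-coding front end is not shown):

```python
def lzw_compress(data: bytes) -> list[int]:
    """Classical LZW: grow a dictionary of byte sequences, emit integer codes."""
    dictionary = {bytes([i]): i for i in range(256)}
    w = b""
    codes = []
    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:
            w = wc
        else:
            codes.append(dictionary[w])
            dictionary[wc] = len(dictionary)  # register the new sequence
            w = bytes([byte])
    if w:
        codes.append(dictionary[w])
    return codes

def lzw_decompress(codes: list[int]) -> bytes:
    dictionary = {i: bytes([i]) for i in range(256)}
    w = dictionary[codes[0]]
    out = [w]
    for code in codes[1:]:
        if code in dictionary:
            entry = dictionary[code]
        else:                     # special case: code refers to the entry being built
            entry = w + w[:1]
        out.append(entry)
        dictionary[len(dictionary)] = w + entry[:1]
        w = entry
    return b"".join(out)
```

Repeated byte patterns collapse into single codes, which is why LZW works well on the low-entropy residue left after quantization.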
Publication Date
Sun Jan 14 2018
Journal Name
Journal Of Engineering
Determination Optimum Inventory Level for Material Using Genetic Algorithm

Integrating decision-making leads to more robust decisions, and hence to determining the optimum inventory level of the materials required for production and to reducing total cost, through cooperation between the purchasing department, the inventory department, and the company's other departments. Two models are suggested to determine the Optimum Inventory Level (OIL): the first (OIL-model 1) assumes that the inventory level equals the quantity of materials required, while the second (OIL-model 2) assumes that the inventory level exceeds the materials required for the next period.

Publication Date
Fri Oct 30 2020
Journal Name
Journal Of Economics And Administrative Sciences
Estimate The Survival Function By Using The Genetic Algorithm

Survival analysis is the analysis of data in the form of times from a time origin until the occurrence of an end event. In medical research, the time origin is the date a patient is enrolled in a study, such as a clinical trial comparing two or more treatments, and the end event may be the death of the patient or the withdrawal of the individual. The data resulting from this process are called survival times; if the end event is not death, the resulting data are called time-to-event data. That is, survival analysis comprises the statistical procedures for analyzing data in which the variable of interest is the time to an event…
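The abstract does not describe the GA-based estimator itself; as background, a minimal sketch of the classical Kaplan-Meier product-limit estimate of the survival function S(t), the standard nonparametric baseline such methods are typically compared against (function name and data layout are illustrative):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimate of S(t).
    times: observed times; events: 1 = event occurred, 0 = censored.
    Returns a list of (t, S(t)) at each distinct event time."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    s = 1.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        n = at_risk              # subjects still at risk just before t
        deaths = 0
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            at_risk -= 1         # both events and censorings leave the risk set
            i += 1
        if deaths:
            s *= 1 - deaths / n  # multiply by conditional survival at t
            curve.append((t, s))
    return curve
```

Censored observations reduce the risk set without contributing a death factor, which is what distinguishes this estimator from a naive empirical fraction.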

Publication Date
Sat Mar 01 2008
Journal Name
Iraqi Journal Of Physics
Enforcing Wiener Filter in the Iterative Blind Restoration Algorithm

A new blind restoration algorithm is presented and shown to give high-quality restoration. This is done by enforcing a Wiener filtering approach in the Fourier domains of the image and the PSF.
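The Wiener filtering step the abstract names can be sketched in the Fourier domain as follows; this is the generic non-blind formulation assuming a known PSF and a scalar noise-to-signal ratio K, not the paper's iterative blind scheme:

```python
import numpy as np

def wiener_deconvolve(blurred, psf, K=0.01):
    """Wiener deconvolution in the Fourier domain:
    F_hat = H* / (|H|^2 + K) * G, where H is the PSF spectrum."""
    H = np.fft.fft2(psf, s=blurred.shape)   # zero-pad PSF to image size
    G = np.fft.fft2(blurred)
    F_hat = np.conj(H) / (np.abs(H) ** 2 + K) * G
    return np.real(np.fft.ifft2(F_hat))
```

The regularizer K keeps the division stable where the PSF spectrum is small; a blind scheme would alternate updates of the image and PSF estimates around this step.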

Publication Date
Sat Jul 31 2021
Journal Name
Iraqi Journal Of Science
A Decision Tree-Aware Genetic Algorithm for Botnet Detection

In this paper, the botnet detection problem is formulated as a feature selection problem, and the genetic algorithm (GA) is used to search the entire feature space for the most significant combination of features. Furthermore, a Decision Tree (DT) classifier is used as the objective function to direct the proposed GA toward combinations of features that correctly classify activities as normal traffic or botnet attacks. Two datasets, namely UNSW-NB15 and the Canadian Institute for Cybersecurity Intrusion Detection System 2017 (CICIDS2017) dataset, are used for evaluation. The results reveal that the proposed DT-aware GA can effectively find the relevant features…
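The GA-with-classifier-fitness loop the abstract describes can be sketched as follows; a toy fitness function stands in here for the paper's Decision Tree accuracy, and all parameter values are illustrative:

```python
import random

def ga_feature_select(n_features, fitness, pop_size=30, gens=40,
                      p_mut=0.05, seed=1):
    """Simple GA over feature-mask bitstrings; `fitness` scores a mask.
    In the paper's setting, fitness would be a Decision Tree's
    classification accuracy on the selected features."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)]
           for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[:pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_features)      # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if rng.random() < p_mut else g for g in child]
            children.append(child)
        pop = parents + children                    # elitist: parents survive
    return max(pop, key=fitness)

# Toy stand-in fitness: reward masks close to a known informative set,
# penalize extra features (mimicking a wrapper's accuracy/size trade-off).
informative = {0, 3, 7}
def toy_fitness(mask):
    chosen = {i for i, g in enumerate(mask) if g}
    return len(chosen & informative) - 0.1 * len(chosen - informative)
```

Because unmodified parents are carried into the next generation, the best fitness in the population is monotone non-decreasing across generations.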

Publication Date
Sun Sep 01 2019
Journal Name
Baghdad Science Journal
PWRR Algorithm for Video Streaming Process Using Fog Computing

Video streaming is the most popular medium used by people on the internet nowadays; nevertheless, it consumes much of the internet's traffic, accounting for nearly 70% of usage. Interactive media suffers from constraints such as increased bandwidth usage and latency. The need for real-time transmission of live video streams leads to the use of fog computing, an intermediary layer between the cloud and the end user, introduced to alleviate these problems by providing fast real-time response and computational resources close to the…
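The abstract does not expand the PWRR acronym or define the algorithm; for orientation, a sketch of smooth weighted round-robin dispatch, the general scheduler family such schemes belong to (node names and weights are illustrative, not from the paper):

```python
def smooth_wrr(weights, n_requests):
    """Smooth weighted round-robin: each step, every node's credit grows
    by its weight; the node with the most credit serves the request and
    pays back the total weight. Output is proportional and interleaved."""
    current = {n: 0 for n in weights}
    total = sum(weights.values())
    order = []
    for _ in range(n_requests):
        for n in current:
            current[n] += weights[n]
        best = max(current, key=current.get)  # first max in insertion order
        current[best] -= total
        order.append(best)
    return order
```

Compared with plain round-robin, the weighted variant lets a fog scheduler send proportionally more streams to nodes with more capacity while avoiding long bursts to any single node.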

Publication Date
Mon Dec 05 2022
Journal Name
Baghdad Science Journal
Cloud Data Security through BB84 Protocol and Genetic Algorithm

In the current digitalized world, cloud computing has become a feasible solution for virtualizing computing resources. Though cloud computing has many advantages for outsourcing an organization's information, strong security is its main concern. Identity authentication theft has become a vital part of the protection of cloud computing data: intruders violate the security protocols and attack organizations' or users' data, and the prospect of cloud data disclosure leaves cloud users feeling insecure while using the platform. Traditional cryptographic techniques are not able to stop such attacks. The BB84 protocol is the first quantum cryptography protocol…
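The BB84 key-exchange step can be illustrated with a toy simulation of its sifting phase, assuming no eavesdropper and classical random numbers standing in for real qubit preparation and measurement:

```python
import random

def bb84_sift(n_bits, seed=0):
    """Toy BB84 without an eavesdropper: Alice sends random bits in random
    bases (0 = rectilinear, 1 = diagonal); Bob measures in random bases.
    A wrong-basis measurement yields a random bit; after the public basis
    comparison, both keep only the positions where bases matched."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_bits)]
    bob_bases   = [rng.randint(0, 1) for _ in range(n_bits)]
    bob_bits = [b if ab == bb else rng.randint(0, 1)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    alice_key = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)
                 if ab == bb]
    bob_key = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases)
               if ab == bb]
    return alice_key, bob_key
```

On average half the positions survive sifting; with an eavesdropper present, measurement disturbance would show up as mismatches between the two sifted keys, which is the security check the real protocol relies on.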

Publication Date
Wed Nov 20 2019
Journal Name
Proceedings Of The 2019 3rd International Conference On Big Data Research
Pressure Vessel Design Simulation Using Hybrid Harmony Search Algorithm

Publication Date
Tue Oct 01 2013
Journal Name
2013 Ieee International Conference On Systems, Man, And Cybernetics
AWSS: An Algorithm for Measuring Arabic Word Semantic Similarity

Publication Date
Sun Jan 01 2012
Journal Name
International Journal Of Cyber-security And Digital Forensics (ijcsdf)
Genetic Algorithm Approach for Risk Reduction of Information Security

Nowadays, information systems constitute a crucial part of organizations; by losing security, these organizations also lose plenty of competitive advantages. The core of information security (InfoSecu) is risk management. There are many research works and standards on information security risk management (ISRM), including NIST 800-30 and ISO/IEC 27005. However, only a few research works focus on InfoSecu risk reduction, while the standards explain general principles and guidelines without providing implementation details for ISRM; as such, reducing InfoSecu risks in uncertain environments is painstaking. Thus, this paper applies a genetic algorithm (GA) for InfoSecu risk reduction under uncertainty. Finally, the ef…

Publication Date
Thu Feb 25 2016
Journal Name
Research Journal Of Applied Sciences, Engineering And Technology
Block Matching Algorithm Using Mean and Low Order Moments

In this study, a fast block matching search algorithm based on block descriptors and multilevel block filtering is introduced. The descriptors used are the mean and a set of centralized low-order moments. Hierarchical filtering and the MAE similarity measure are adopted to nominate the best similar blocks lying within the pool of neighboring blocks. As a next step after block nomination, the similarity of the mean and moments is used to classify the nominated blocks into one of three sub-pools, each representing a certain nomination priority level (i.e., most, less, and least). The main reason for introducing the nomination and classification steps is a significant reduction in the number of matching instances of the pixels belonging to the c…
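The descriptor-based filtering and MAE ranking described above can be sketched as follows; the single filtering level and tolerance value are simplifications of the paper's multilevel, priority-classified scheme:

```python
def mae(a, b):
    """Mean absolute error between two equal-length pixel blocks."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def block_descriptors(block):
    """Mean plus low-order central moments, used to pre-filter candidates
    cheaply before any full pixel-wise comparison."""
    m = sum(block) / len(block)
    mu2 = sum((x - m) ** 2 for x in block) / len(block)
    mu3 = sum((x - m) ** 3 for x in block) / len(block)
    return m, mu2, mu3

def best_match(target, candidates, tol=5.0):
    """Keep only candidates whose mean is within `tol` of the target's,
    then rank the survivors by MAE; the paper adds further moment-based
    filtering levels and a three-way priority classification."""
    t_mean = block_descriptors(target)[0]
    pool = [c for c in candidates
            if abs(block_descriptors(c)[0] - t_mean) <= tol]
    if not pool:                 # fall back if the filter rejects everything
        pool = candidates
    return min(pool, key=lambda c: mae(target, c))
```

The payoff is that full MAE matching, which touches every pixel, runs only on the small pool that survives the cheap descriptor comparison.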

... Show More