Iris Data Compression Based on Hexa-Data Coding

Iris research is focused on developing techniques for identifying and locating relevant biometric features, accurate segmentation, and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system for real-time use. This paper introduces a novel parameterized technique for iris segmentation. The method proceeds in several steps: converting the grayscale eye image to a bit-plane representation, selecting the most significant bit planes, and then parameterizing the iris location, resulting in an accurate segmentation of the iris from the original image. A lossless Hexadata encoding method is then applied to the data, based on reducing each set of six data items to a single encoded value. In tests, the method achieved acceptable byte savings for the 21 square iris images of size 256x256 pixels, about 22.4 KB on average with an average decompression time of 0.79 s, and high byte savings for 2 non-square iris images of sizes 640x480 and 2048x1536, reaching 76 KB / 2.2 s and 1630 KB / 4.71 s respectively. Finally, the proposed technique outperformed the standard lossless JPEG2000 compression technique, with about 1.2 or more additional KB saved, implicitly demonstrating the power and efficiency of the suggested lossless biometric technique.
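The bit-plane step can be sketched as follows. This is a minimal illustration, not the authors' implementation, and the six-item grouping shown (packing six bits into one 6-bit symbol) is only one plausible reading of the "six data items to a single encoded value" idea:

```python
# Sketch: bit-plane decomposition of a grayscale image plus a toy
# "six items -> one value" grouping. Hypothetical, for illustration only.

def bit_planes(img, width=8):
    """Split a 2-D list of 8-bit pixels into `width` binary planes (plane 7 = MSB)."""
    return [[[(px >> b) & 1 for px in row] for row in img] for b in range(width)]

def hexa_encode(bits):
    """Pack each group of six bits into a single 6-bit symbol (0..63)."""
    symbols = []
    for i in range(0, len(bits), 6):
        group = bits[i:i + 6]
        group += [0] * (6 - len(group))   # zero-pad the final group
        value = 0
        for b in group:
            value = (value << 1) | b
        symbols.append(value)
    return symbols

planes = bit_planes([[128, 1]])           # MSB plane flags the 128 pixel
codes = hexa_encode([1, 1, 1, 1, 1, 1])   # one full group -> one symbol
```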

Publication Date
Fri Aug 05 2016
Journal Name
Wireless Communications And Mobile Computing
A comparison study on node clustering techniques used in target tracking WSNs for efficient data aggregation

Wireless sensor applications are susceptible to energy constraints. Most of the energy is consumed in communication between wireless nodes. Clustering and data aggregation are the two most widely used strategies for reducing energy usage and increasing the lifetime of wireless sensor networks. In target tracking applications, a large amount of redundant data is produced regularly. Hence, deployment of effective data aggregation schemes is vital to eliminate this redundancy. This work conducts a comparative study of research approaches that employ clustering techniques for efficiently aggregating data in target tracking applications, as the selection of an appropriate clustering algorithm may yield positive results in the data aggregation…

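As a minimal sketch of why clustering cuts radio traffic in tracking WSNs (hypothetical node layout and averaging policy, not any specific surveyed scheme): a cluster head fuses its members' redundant readings into one packet before forwarding to the sink.

```python
# Toy cluster-based aggregation: each cluster head averages its members'
# readings so only one value per cluster travels toward the sink.

def aggregate(clusters):
    """clusters: dict mapping cluster-head id -> list of member readings."""
    return {head: sum(vals) / len(vals) for head, vals in clusters.items()}

readings = {"CH1": [20.1, 19.9, 20.0], "CH2": [35.0, 35.2]}
fused = aggregate(readings)   # 2 packets to the sink instead of 5
```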
Scopus (31)
Crossref (24)
Publication Date
Wed Aug 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
A Study on Transportation Models in Their Minimum and Maximum Values with Applications of Real Data

The purpose of this paper is to apply different transportation models in their minimum and maximum values by finding a starting basic feasible solution and then the optimal solution. The requirements of transportation models are presented together with one of their applications in the case of minimizing the objective function, using real data collected by the researcher over one month in 2015 at an egg-production poultry farm.

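A starting basic feasible solution of the kind the paper builds on can be produced with, for example, the north-west corner rule; a small sketch with made-up supply and demand figures (not the paper's data):

```python
def northwest_corner(supply, demand):
    """North-west corner rule: a starting basic feasible solution
    for a balanced transportation problem (sum supply == sum demand)."""
    s, d = supply[:], demand[:]          # work on copies
    alloc = [[0] * len(d) for _ in s]
    i = j = 0
    while i < len(s) and j < len(d):
        qty = min(s[i], d[j])            # ship as much as possible
        alloc[i][j] = qty
        s[i] -= qty
        d[j] -= qty
        if s[i] == 0:
            i += 1                       # this source is exhausted
        else:
            j += 1                       # this destination is satisfied
    return alloc

plan = northwest_corner([20, 30, 25], [10, 25, 40])
```

From this starting plan, an optimality method (e.g. MODI/stepping stone) would then iterate toward the minimum- or maximum-cost solution.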
Publication Date
Fri Apr 14 2023
Journal Name
Journal Of Big Data
A survey on deep learning tools dealing with data scarcity: definitions, challenges, solutions, tips, and applications
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance. Unfortunately, many applications have only small or inadequate data to train DL frameworks. Usually, manual labeling is needed to provide labeled data, which typically involves human annotators with a vast background of knowledge. This annotation process is costly, time-consuming, and error-prone. Usually, every DL framework is fed a significant amount of labeled data to learn representations automatically. Ultimately, a larger amount of data would generate a better DL model, and performance is also application-dependent. This issue is the main barrier for…
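One family of solutions such surveys cover is data augmentation, which stretches a small labeled set; a minimal illustration with hypothetical "image-as-list" data, not drawn from the survey:

```python
# Toy augmentation: horizontal flips double a tiny labeled image set.

def hflip(img):
    """Mirror each row of a 2-D image (list of lists)."""
    return [row[::-1] for row in img]

def augment(dataset):
    """dataset: list of (image, label); returns originals plus flipped copies."""
    return dataset + [(hflip(img), label) for img, label in dataset]

tiny = [([[1, 0], [0, 0]], "iris"), ([[0, 1], [1, 1]], "not-iris")]
bigger = augment(tiny)    # 4 labeled examples from 2
```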
Scopus (754)
Crossref (742)
Publication Date
Wed Mar 10 2021
Journal Name
Baghdad Science Journal
A Comparison Between the Theoretical Cross Section Based on the Partial Level Density Formulae Calculated by the Exciton Model with the Experimental Data for the ¹⁹⁷₇₉Au Nucleus

In this paper, the theoretical cross section of a pre-equilibrium nuclear reaction has been studied at an energy of 22.4 MeV. Ericson's formula of partial level density (PLD) and its corrections (Williams' correction and the spin correction) have been substituted into the theoretical cross section and compared with the experimental data for the nucleus. It has been found that the theoretical cross section with one-component PLD from Ericson's formula does not agree with the experimental value, with limited agreement only at the high end of the energy range. The theoretical cross section that depends on the one-component Williams formula and the one-component formula corrected for spin…

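For reference, Ericson's one-component partial level density (without the Pauli/Williams or spin corrections the paper also examines) can be evaluated directly; the single-particle level density g below is an arbitrary illustrative value:

```python
from math import factorial

def ericson_pld(p, h, E, g):
    """Ericson one-component partial level density:
    rho(p, h, E) = g * (g*E)**(n-1) / (p! * h! * (n-1)!),  n = p + h.
    No Pauli (Williams) or spin correction applied."""
    n = p + h
    return g * (g * E) ** (n - 1) / (factorial(p) * factorial(h) * factorial(n - 1))

rho = ericson_pld(p=1, h=1, E=10.0, g=1.0)   # 1p-1h density at 10 MeV
```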
Scopus (9)
Crossref (5)
Publication Date
Sat Jun 01 2024
Journal Name
International Journal Of Advanced And Applied Sciences
High-accuracy models for iris recognition with merging features

Due to advancements in computer science and technology, impersonation has become more common. Today, biometrics technology is widely used in various aspects of people's lives. Iris recognition, known for its high accuracy and speed, is a significant and challenging field of study. As a result, iris recognition technology and biometric systems are used for security in numerous applications, including human-computer interaction and surveillance systems. It is crucial to develop advanced models to combat impersonation crimes. This study proposes artificial intelligence models with high accuracy and speed to eliminate these crimes. The models use linear discriminant analysis (LDA) for feature extraction and mutual information…

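The mutual-information step mentioned above scores how much a feature tells us about the class label; a self-contained sketch for discrete features (illustrative toy data, not the study's iris features):

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """I(X;Y) in bits for two equal-length discrete sequences."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum((c / n) * log2((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

# A feature identical to a balanced binary label carries exactly 1 bit.
mi = mutual_information([0, 0, 1, 1], [0, 0, 1, 1])
```

Features can then be ranked by this score and the most informative ones kept.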
Scopus (4)
Crossref (4)
Publication Date
Mon May 01 2023
Journal Name
Indonesian Journal Of Electrical Engineering And Computer Science
Comparison hybrid techniques-based mixed transform using compression and quality metrics

Image quality plays a vital role in improving and assessing image compression performance. Image compression represents large image data with a new, smaller image suitable for storage and transmission. This paper aims to evaluate the implementation of hybrid techniques based on the tensor-product mixed transform. Compression and quality metrics such as compression ratio (CR), rate distortion (RD), peak signal-to-noise ratio (PSNR), and structural content (SC) are used to evaluate the hybrid techniques, which are then compared according to these metrics to identify the best one. The main contribution is improving the hybrid techniques. The proposed hybrid techniques consist of discrete wavelet…

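Two of the metrics named above are easy to state precisely; a minimal sketch over flat pixel lists (toy values, not the paper's images):

```python
import math

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB between two equal-length pixel lists."""
    mse = sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / len(original)
    return math.inf if mse == 0 else 10.0 * math.log10(peak ** 2 / mse)

def compression_ratio(original_bytes, compressed_bytes):
    """CR = original size / compressed size (higher means stronger compression)."""
    return original_bytes / compressed_bytes
```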
Scopus (4)
Publication Date
Fri Mar 01 2019
Journal Name
Al-khwarizmi Engineering Journal
Multiwavelet and Estimation by Interpolation Analysis Based Hybrid Color Image Compression

Nowadays, still images are used everywhere in the digital world. Shortages of storage capacity and transmission bandwidth make efficient compression solutions essential. The wavelet transform, a revolutionary mathematical tool, has already shown its power in image processing. The major topic of this paper is to improve still-image compression with multiwavelets by estimating the multiwavelet coefficients of the high-frequency sub-bands through interpolation instead of sending all multiwavelet coefficients. Comparing the proposed approach with other compression methods yields good results.
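The "estimate instead of send" idea can be sketched in one dimension: transmit every k-th high-band coefficient and let the decoder fill the gaps by linear interpolation (a toy stand-in for the paper's multiwavelet setting):

```python
def reconstruct_by_interpolation(kept, factor):
    """Rebuild a coefficient band from samples taken every `factor` positions,
    estimating the dropped coefficients by linear interpolation."""
    out = []
    for i in range(len(kept) - 1):
        a, b = kept[i], kept[i + 1]
        for k in range(factor):
            out.append(a + (b - a) * k / factor)   # a at k=0, then estimates
    out.append(kept[-1])
    return out

band = reconstruct_by_interpolation([0.0, 2.0, 4.0], factor=2)
```

Only the kept samples travel over the channel; the estimated values cost no bits, at the price of some reconstruction error on non-smooth bands.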

Publication Date
Wed Jan 01 2014
Journal Name
Proceedings Of The Aintec 2014 On Asian Internet Engineering Conference - Aintec '14
LTE Peak Data Rate Estimation Using Modified alpha-Shannon Capacity Formula
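The formula named in the title is presumably a Shannon capacity scaled by a correction factor α; a hedged sketch only (the α value and bandwidth below are arbitrary, and the paper's exact modification is not reproduced here):

```python
from math import log2

def alpha_shannon_rate(bandwidth_hz, snr_linear, alpha):
    """Scaled Shannon capacity: R = alpha * B * log2(1 + SNR).
    alpha < 1 models implementation losses relative to the ideal bound."""
    return alpha * bandwidth_hz * log2(1 + snr_linear)

rate = alpha_shannon_rate(20e6, 100.0, alpha=0.75)   # bits per second
```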

Scopus (4)
Crossref (3)
Publication Date
Tue Mar 01 2022
Journal Name
International Journal Of Nonlinear Analysis And Applications
The suggested threshold to reduce data noise for a factorial experiment

In this research, a (4x4) factorial experiment applied in a completely randomized block design was studied, with a given size of observations, where the design of experiments is used to study the effect of treatments on experimental units and thus obtain data representing the experiment's observations. Applying these treatments under different environmental and experimental conditions causes noise that affects the observed values and thus increases the mean square error of the experiment. To reduce this noise, multiple wavelet shrinkage was used as a filter for the observations, by suggesting an improved threshold that takes into account the different transformation levels based on the logarithm of the…

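Wavelet shrinkage of the kind described can be illustrated with the classic soft-threshold rule (an illustrative rule and threshold value, not the paper's improved level-dependent threshold):

```python
def soft_threshold(coeffs, t):
    """Soft thresholding: coefficients within [-t, t] (mostly noise) become 0;
    the rest are shrunk toward zero by t."""
    return [0.0 if abs(c) <= t else (c - t if c > 0 else c + t) for c in coeffs]

denoised = soft_threshold([3.0, -2.0, 0.4, -0.1], t=1.0)
```

In practice the threshold t is chosen per decomposition level; the paper's contribution is precisely such a level-aware choice.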
Publication Date
Sun Jan 06 2013
Journal Name
Chemical And Process Engineering Research
Milled Iraqi phoenix dactylifera data palm pruning woods lignin qualitative determination