Articles
Iris Data Compression Based on Hexa-Data Coding

Iris research focuses on developing techniques for identifying and locating relevant biometric features, accurate segmentation, and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method proceeds in a number of steps: converting the grayscale eye image to a bit-plane representation, selecting the most significant bit planes, and parameterizing the iris location, resulting in an accurate segmentation of the iris from the original image. A lossless Hexadata encoding method, based on reducing each set of six data items to a single encoded value, is then applied to the data. The tested results achieved acceptable byte-saving performance for the 21 square iris images of size 256x256 pixels, about 22.4 KB saved on average with an average decompression time of 0.79 sec, and high byte-saving performance for the 2 non-square iris images of sizes 640x480 and 2048x1536, which reached 76 KB / 2.2 sec and 1630 KB / 4.71 sec respectively. Finally, the proposed technique improved on the standard lossless JPEG2000 compression technique by a factor of about 1.2 or more in KB saved, implicitly demonstrating the power and efficiency of the suggested lossless biometric technique.
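The Hexadata step above reduces each set of six data items to one encoded value. For binary bit-plane data, one plausible reading is packing six bits into a single 0-63 symbol; the sketch below is an illustration under that assumption, not the authors' exact coder:

```python
# Sketch: pack groups of six bits (e.g. from a bit plane) into single
# 0..63 symbols, and unpack losslessly. Illustrative only -- the paper's
# Hexadata coder is not fully specified here.

def hexa_encode(bits):
    """Pack a flat list of 0/1 bits into one symbol per six bits."""
    pad = (-len(bits)) % 6          # pad so every group is complete
    padded = bits + [0] * pad
    symbols = []
    for i in range(0, len(padded), 6):
        value = 0
        for b in padded[i:i + 6]:
            value = (value << 1) | b   # MSB-first packing
        symbols.append(value)
    return symbols, pad

def hexa_decode(symbols, pad):
    """Invert hexa_encode, dropping the padding bits."""
    bits = []
    for value in symbols:
        for shift in range(5, -1, -1):
            bits.append((value >> shift) & 1)
    return bits[:len(bits) - pad] if pad else bits

plane = [1, 0, 1, 1, 0, 0, 1, 1]          # e.g. part of one bit-plane row
codes, pad = hexa_encode(plane)
assert hexa_decode(codes, pad) == plane   # round-trip is lossless
```

Because every six input items collapse into one symbol, the best-case size reduction of this packing is a factor of six before any entropy coding.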

Crossref
Publication Date
Sat Oct 01 2022
Journal Name
Baghdad Science Journal
A Crime Data Analysis of Prediction Based on Classification Approaches

Crime is an unlawful activity of any kind and is punished by law. Crimes affect a society's quality of life and economic development. With the large rise in crime globally, there is a need to analyze crime data to bring down the crime rate. This helps the police and the public take the required measures and restrict crimes more effectively. The purpose of this research is to develop predictive models that can aid crime pattern analysis and thus support the Boston department's crime prevention efforts. The geographical location factor has been adopted in our model because it is an influential factor in several situations, whether traveling to a specific area or living …
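As a minimal illustration of the location-aware classification the abstract describes, the sketch below fits a nearest-centroid model on made-up (latitude, longitude) points; the actual study evaluates fuller classification models on Boston crime data:

```python
# Minimal sketch of location-aware crime-type prediction using a
# nearest-centroid classifier. Coordinates and labels are invented
# for illustration only.
from math import dist

def fit_centroids(records):
    """records: list of ((lat, lon), crime_type) training pairs."""
    sums = {}
    for (lat, lon), label in records:
        s = sums.setdefault(label, [0.0, 0.0, 0])
        s[0] += lat; s[1] += lon; s[2] += 1
    return {label: (s[0] / s[2], s[1] / s[2]) for label, s in sums.items()}

def predict(centroids, point):
    """Assign the crime type whose centroid is closest to the point."""
    return min(centroids, key=lambda label: dist(centroids[label], point))

train = [((42.35, -71.06), "theft"), ((42.36, -71.05), "theft"),
         ((42.30, -71.11), "assault"), ((42.31, -71.12), "assault")]
model = fit_centroids(train)
print(predict(model, (42.355, -71.055)))  # query near the theft cluster
```

The design choice mirrors the abstract's point: once location is a feature, even a trivially simple model separates spatially clustered crime types.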
Scopus (10)
Crossref (6)
Scopus Clarivate Crossref
Publication Date
Wed Jan 01 2020
Journal Name
International Journal of Computing
Twitter Location-Based Data: Evaluating the Methods of Data Collection Provided by Twitter API

Twitter data analysis is an emerging field of research that uses data collected from Twitter to address many issues such as disaster response, sentiment analysis, and demographic studies. The success of data analysis relies on collecting accurate data representative of the studied group or phenomenon to get the best results. Various Twitter analysis applications rely on collecting the locations of the users sending the tweets, but this information is not always available. There have been several attempts at estimating location-based aspects of a tweet; however, there is a lack of attempts at investigating data collection methods that focus on location. In this paper, we investigate the two methods for obtaining location-based dat…
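The location-focused collection question above can be illustrated on the processing side: the hypothetical sketch below keeps only tweets whose GeoJSON-style coordinates fall inside a bounding box. The field names and sample records are assumptions for illustration, not the exact Twitter API schema:

```python
# Hypothetical sketch of filtering collected tweets by location: keep
# only tweets whose geo coordinates fall inside a bounding box. The
# "coordinates" field shape follows common GeoJSON usage ([lon, lat])
# and is an assumption, not the exact Twitter API payload.

def in_bbox(lon, lat, bbox):
    """bbox = (west, south, east, north) in degrees."""
    west, south, east, north = bbox
    return west <= lon <= east and south <= lat <= north

def filter_geotagged(tweets, bbox):
    """Yield tweets carrying GeoJSON-style [lon, lat] coordinates in bbox."""
    for tweet in tweets:
        coords = (tweet.get("coordinates") or {}).get("coordinates")
        if coords and in_bbox(coords[0], coords[1], bbox):
            yield tweet

baghdad = (44.2, 33.2, 44.6, 33.5)        # west, south, east, north
sample = [{"id": 1, "coordinates": {"coordinates": [44.4, 33.3]}},
          {"id": 2, "coordinates": None}]
print([t["id"] for t in filter_geotagged(sample, baghdad)])  # → [1]
```

Note how tweet 2 is silently dropped: this is exactly the availability problem the abstract raises, since most tweets carry no precise coordinates at all.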
Scopus (4)
Crossref (1)
Scopus Crossref
Publication Date
Sat Nov 05 2016
Journal Name
Research Journal of Applied Sciences, Engineering and Technology
Image Compression Based on Cubic Bezier Interpolation, Wavelet Transform, Polynomial Approximation, Quadtree Coding and High Order Shift Encoding

In this study, an efficient compression system is introduced. It is based on a wavelet transform and two types of 3-dimensional (3D) surface representation, Cubic Bezier Interpolation (CBI) and 1st-order polynomial approximation, each applied at a different scale of the image. CBI is applied over wide areas of the image in order to prune the image components that show large-scale variation, while the 1st-order polynomial is applied over small areas of the residue component (i.e., after subtracting the cubic Bezier surface from the image) in order to prune the locally smooth components and obtain better compression gain. The produced cubic Bezier surface is subtracted from the image signal to get the residue component. Then, t…
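The cubic Bezier building block used for the large-scale surface can be sketched in one dimension; the control values below are illustrative, not taken from the paper:

```python
# Sketch of the cubic Bezier primitive behind the paper's large-scale
# surface fit: evaluate a cubic Bezier from four control values via
# the Bernstein basis. Control values here are made up.

def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate the cubic Bezier curve at parameter t in [0, 1]."""
    u = 1.0 - t
    return (u**3 * p0 + 3 * u**2 * t * p1
            + 3 * u * t**2 * p2 + t**3 * p3)

# At the endpoints the curve reproduces p0 and p3 exactly, so
# subtracting the fitted surface from the image leaves only a small
# residue over smoothly varying regions.
assert cubic_bezier(10, 40, 80, 120, 0.0) == 10
assert cubic_bezier(10, 40, 80, 120, 1.0) == 120
mid = cubic_bezier(10, 40, 80, 120, 0.5)  # smooth blend of the controls
```

A full image codec would evaluate a tensor-product patch (this curve applied along rows and then columns), but the 1-D form already shows why the residue after subtraction is cheap to encode.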
Crossref (2)
Crossref
Publication Date
Wed Jul 25 2018
Journal Name
International Journal of Engineering Trends and Technology
Fixed Predictor Polynomial Coding for Image Compression

Crossref (1)
Crossref
Publication Date
Thu Jan 01 2015
Journal Name
International Journal of Computer Science and Mobile Computing
Image Compression using Hierarchal Linear Polynomial Coding

Publication Date
Sun Sep 03 2023
Journal Name
Iraqi Journal of Computers, Communications, Control & Systems Engineering (IJCCCE)
Efficient Iris Image Recognition System Based on Machine Learning Approach

H. M. Al-Dabbas, R. A. Azeez, A. E. Ali, Iraqi Journal of Computers, Communications, Control and Systems Engineering, 2023

Publication Date
Wed Jun 01 2022
Journal Name
V. International Scientific Congress of Pure, Applied and Technological Sciences
Lightweight Image Compression Using Polynomial and Transform Coding

Publication Date
Wed Jul 06 2022
Journal Name
Journal of Al-Qadisiyah for Computer Science and Mathematics
Image Compression using Polynomial Coding Techniques: A review

Publication Date
Tue Sep 27 2022
Journal Name
Journal of Engineering Research and Sciences
Images Compression using Combined Scheme of Transform Coding

Several problems must be solved in image compression to make the process workable and more efficient. Much work has been done in the field of lossy image compression based on wavelets and the Discrete Cosine Transform (DCT). In this paper, an efficient image compression scheme is proposed, based on a combined encoding transform scheme consisting of the following steps: 1) a bi-orthogonal (tap 9/7) wavelet transform to split the image data into sub-bands, 2) a DCT to de-correlate the data, 3) scalar quantization of the combined transform stage's output, followed by mapping to positive values, and 4) LZW encoding to produce the compressed data. The peak signal-to-noise ratio (PSNR), compression ratio (CR), and compression gain (CG) measures were used t…
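Steps 1 and 3 of the pipeline above can be sketched with a one-level Haar wavelet standing in for the bi-orthogonal 9/7 tap; the DCT and LZW stages are omitted for brevity, and the 4x4 array is a toy stand-in for an image:

```python
# Sketch of the combined-transform front end: a one-level 2-D Haar
# wavelet (a simple stand-in for the paper's bi-orthogonal 9/7 tap),
# then scalar quantization and a shift to non-negative symbols ready
# for entropy coding (the paper uses LZW, not shown here).
import numpy as np

def haar2d(block):
    """One analysis level: average/detail along rows, then columns."""
    lo = (block[:, 0::2] + block[:, 1::2]) / 2.0
    hi = (block[:, 0::2] - block[:, 1::2]) / 2.0
    rows = np.hstack([lo, hi])
    lo = (rows[0::2, :] + rows[1::2, :]) / 2.0
    hi = (rows[0::2, :] - rows[1::2, :]) / 2.0
    return np.vstack([lo, hi])                 # [LL LH; HL HH] layout

def quantize_and_shift(coeffs, step=2.0):
    """Scalar-quantize, then map every symbol to a non-negative integer."""
    q = np.round(coeffs / step).astype(int)
    return q - q.min()

img = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 "image"
symbols = quantize_and_shift(haar2d(img))
assert symbols.min() == 0   # non-negative symbols, as step 3 requires
```

Mapping to non-negative values matters because dictionary coders such as LZW expect a symbol alphabet, not signed coefficients; the quantization step size is the lossy knob trading PSNR against CR.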
Crossref
Publication Date
Fri Jul 01 2016
Journal Name
International Journal of Computer Science and Mobile Computing
Interpolative Absolute Block Truncation Coding for Image Compression