Iris Data Compression Based on Hexa-Data Coding

Iris research focuses on developing techniques for identifying and locating relevant biometric features, accurate segmentation, and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method is based on a number of steps, starting from converting the grayscale eye image to a bit-plane representation, selecting the most significant bit planes, and then parameterizing the iris location, resulting in an accurate segmentation of the iris from the original image. A lossless Hexadata encoding method is then applied to the data, which is based on reducing each set of six data items to a single encoded value. The tested method achieved acceptable byte-saving performance for the 21 square iris images of size 256x256 pixels, about 22.4 KB on average with an average decompression time of 0.79 sec, and high byte-saving performance for two non-square iris images of sizes 640x480 and 2048x1536, reaching 76 KB in 2.2 sec and 1630 KB in 4.71 sec respectively. Finally, the proposed technique was compared with the standard lossless JPEG2000 compression technique, achieving a reduction of about 1.2 or more in KB saved, which implicitly demonstrates the power and efficiency of the suggested lossless biometric technique.
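The abstract outlines the bit-plane preprocessing step but not the internal details of the Hexadata encoder, so the following is only a minimal sketch of the first stage. It assumes NumPy; the function names, the `keep` parameter, and the packing of six binary items into one 6-bit symbol are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

def bit_planes(gray_img: np.ndarray) -> np.ndarray:
    """Split an 8-bit grayscale image into its 8 bit planes (index 7 = MSB)."""
    return np.stack([(gray_img >> b) & 1 for b in range(8)], axis=0)

def select_significant_planes(planes: np.ndarray, keep: int = 2) -> np.ndarray:
    """Keep only the most significant bit planes (the `keep` count is assumed)."""
    return planes[-keep:]

def pack_six(bits: np.ndarray) -> np.ndarray:
    """One plausible reading of 'six data items to a single encoded value':
    pack each group of six bits into one 6-bit symbol (0-63)."""
    flat = bits.ravel()
    flat = flat[: len(flat) - len(flat) % 6].reshape(-1, 6)
    weights = 1 << np.arange(5, -1, -1)
    return flat @ weights

# Usage on a synthetic 256x256 "eye" image
img = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
msb = select_significant_planes(bit_planes(img), keep=2)
symbols = pack_six(msb)
print(symbols.shape, symbols.max())
```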

Publication Date
Sun Oct 01 2023
Journal Name
Int. J. Nonlinear Anal. Appl
Adaptive 1-D polynomial coding to compress color image with C421

Publication Date
Wed Jan 01 2025
Journal Name
Current Neuropharmacology
Ischemic Stroke and Autophagy: The Roles of Long Non-Coding RNAs

Ischemic stroke is a significant cause of morbidity and mortality worldwide. Autophagy, a process of intracellular degradation, has been shown to play a crucial role in the pathogenesis of ischemic stroke. Long non-coding RNAs (lncRNAs) have emerged as essential regulators of autophagy in various diseases, including ischemic stroke. Recent studies have identified several lncRNAs that modulate autophagy in ischemic stroke, including MALAT1, MIAT, SNHG12, H19, AC136007.2, C2dat2, MEG3, KCNQ1OT1, SNHG3, and RMRP. These lncRNAs regulate autophagy by interacting with key proteins involved in the autophagic process, such as Beclin-1, ATG7, and LC3. Understanding the role of lncRNAs in regulating auto

Publication Date
Sun Dec 01 2019
Journal Name
Journal Of Accounting And Financial Studies ( Jafs )
Use of information and communications technology to archive data: A suggested form in the Tax Audit and Examination Department of the General Tax Authority

The current world is observing huge developments that present the opportunity for organizations and administrative units to use information and communication technology and to adopt it in administrative work, due to its importance in accomplishing work with higher efficiency and speed and in facilitating communication with all individuals and companies using various means of communication that depend on Internet networks. Therefore, the research dealt with the study of electronic systems designed and adopted for the creation or construction of a database for archiving data, which is the main method in organizations and administrative units in developed countries, where this system works to convert documents and manual processes and t

Publication Date
Fri Feb 01 2019
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of estimations methods of the entropy function to the random coefficients for two models: the general regression and swamy of the panel data

In this study, we focused on the random coefficient estimation of the general regression and Swamy models of panel data. Using this type of data gives a better chance of obtaining a better method and better indicators. Entropy methods have been used to estimate the random coefficients for the general regression and Swamy models of the panel data, presented in two ways: the first represents the maximum dual entropy and the second is the general maximum entropy, and a comparison between them has been made using simulation to choose the optimal method.

The results have been compared using the mean squared error and the mean absolute percentage error for different cases in terms of correlation valu
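The abstract names the two comparison criteria but not their exact computation on the simulated coefficients, so the following is a minimal sketch of how such a comparison is usually scored. It assumes NumPy; the estimator outputs and coefficient values are placeholders, not results from the paper.

```python
import numpy as np

def mse(true_beta: np.ndarray, est_beta: np.ndarray) -> float:
    """Mean squared error between true and estimated coefficients."""
    return float(np.mean((true_beta - est_beta) ** 2))

def mape(true_beta: np.ndarray, est_beta: np.ndarray) -> float:
    """Mean absolute percentage error (true coefficients must be non-zero)."""
    return float(np.mean(np.abs((true_beta - est_beta) / true_beta)) * 100)

# Hypothetical simulation-style comparison of two estimators
true_beta = np.array([1.5, -0.8, 2.0])
beta_dual_entropy = np.array([1.45, -0.75, 2.10])   # placeholder estimates
beta_gme = np.array([1.60, -0.90, 1.95])            # placeholder estimates

for name, est in [("maximum dual entropy", beta_dual_entropy),
                  ("general maximum entropy", beta_gme)]:
    print(f"{name}: MSE={mse(true_beta, est):.4f}, MAPE={mape(true_beta, est):.2f}%")
```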

Publication Date
Fri Oct 19 2018
Journal Name
Journal Of Economics And Administrative Sciences
Big Data Approach to Enhance Organizational Ambidexterity: An Exploratory Study of a Sample of Managers at ASIA Cell For Mobile Telecommunication Company in Iraq

The research aimed at measuring the compatibility of Big Data with the organizational ambidexterity dimensions of the Asia Cell mobile telecommunications company in Iraq, in order to determine the possibility of adopting the Big Data triple as an approach to achieving organizational ambidexterity.

The study adopted the descriptive analytical approach to collect and analyze the data gathered by the questionnaire tool, developed on the Likert scale after a comprehensive review of the literature related to the two basic study dimensions. The data has been subjected to many statistical treatments in accordance with res

Publication Date
Sat May 30 2020
Journal Name
Neuroquantology Journal
The Effect of Re-Use of Lossy JPEG Compression Algorithm on the Quality of Satellite Image

In this study, an analysis of re-using the JPEG lossy algorithm on the quality of satellite imagery is presented. The standard JPEG compression algorithm is adopted and applied using the IrfanView program; the range of JPEG quality used is 50-100. Depending on the calculated satellite image quality variation, the maximum number of re-uses of the JPEG lossy algorithm adopted in this study is 50 times. The image quality degradation with respect to the JPEG quality factor and the number of re-uses of the JPEG algorithm used to store the satellite image is analyzed.
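The study performed the recompression with IrfanView; the snippet below is only a rough sketch of the same measurement loop using Pillow and NumPy, with the file name, quality setting, and number of rounds chosen arbitrarily rather than taken from the paper.

```python
import io
import numpy as np
from PIL import Image

def psnr(a: np.ndarray, b: np.ndarray) -> float:
    """Peak signal-to-noise ratio between two 8-bit images."""
    mse = np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

def recompress(img: Image.Image, quality: int, rounds: int) -> None:
    """Re-apply JPEG compression `rounds` times and report PSNR vs. the original."""
    original = np.asarray(img.convert("L"))
    current = img.convert("L")
    for i in range(1, rounds + 1):
        buf = io.BytesIO()
        current.save(buf, format="JPEG", quality=quality)
        buf.seek(0)
        current = Image.open(buf)
        current.load()
        print(f"round {i}: PSNR = {psnr(original, np.asarray(current)):.2f} dB")

# Hypothetical usage: any grayscale test image stands in for the satellite scene
recompress(Image.open("satellite.png"), quality=75, rounds=10)
```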

Publication Date
Mon Sep 29 2025
Journal Name
Journal Of Engineering
Image Compression Using 3-D Two-Level Techniques

In this paper three techniques for image compression are implemented. The proposed techniques consist of a three-dimensional (3-D) two-level discrete wavelet transform (DWT), a 3-D two-level discrete multi-wavelet transform (DMWT), and a 3-D two-level hybrid (wavelet-multiwavelet) transform technique. Daubechies and Haar wavelets are used in the discrete wavelet transform, and critically sampled preprocessing is used in the discrete multi-wavelet transform. The aim is to increase the compression ratio (CR) as the level of the transformation increases in the case of the 3-D transformation, so the compression ratio is measured for each level. To obtain good compression, the image data properties were measured, such as image entropy (He), percent root-
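The abstract covers DWT, DMWT and hybrid variants; only the plain two-level 3-D DWT part is easy to illustrate with an off-the-shelf library. The sketch below assumes PyWavelets, uses a synthetic volume, and estimates the compression ratio by simple coefficient thresholding, which is an assumption rather than the paper's measurement procedure.

```python
import numpy as np
import pywt  # PyWavelets, used here as a stand-in implementation

# A small synthetic 3-D volume (e.g., a stack of image frames)
volume = np.random.rand(16, 64, 64)

# Two-level 3-D DWT with the Haar wavelet, one of the wavelets named in the paper
coeffs = pywt.wavedecn(volume, wavelet="haar", level=2)

# A crude compression-ratio estimate: count coefficients above a threshold
arr, slices = pywt.coeffs_to_array(coeffs)
threshold = 0.1 * np.max(np.abs(arr))
kept = np.count_nonzero(np.abs(arr) >= threshold)
print(f"approximate CR = {arr.size / kept:.2f}")

# Reconstruction from the thresholded coefficients
arr[np.abs(arr) < threshold] = 0.0
reconstructed = pywt.waverecn(
    pywt.array_to_coeffs(arr, slices, output_format="wavedecn"), wavelet="haar")
```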

Publication Date
Thu Feb 07 2019
Journal Name
Journal Of The College Of Education For Women
EFFICIENCY SPIHT IN COMPRESSION AND QUALITY OF IMAGE

Image compression is an important tool to reduce the bandwidth and storage requirements of practical image systems. To meet the increasing demand on storage space and transmission time, compression techniques are the need of the day. A discrete-time wavelet transform based image codec using Set Partitioning In Hierarchical Trees (SPIHT) is implemented in this paper. Mean Square Error (MSE), Peak Signal to Noise Ratio (PSNR) and Maximum Difference (MD) are used to measure the picture quality of the reconstructed image. MSE and PSNR are the most common picture quality measures. Different kinds of test images are assessed in this work with different compression ratios. The results show the high efficiency of the SPIHT algori
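The three quality measures named in the abstract are standard and easy to reproduce; the following is a minimal sketch of them, assuming NumPy and 8-bit images, with the test data below being synthetic placeholders rather than images from the paper.

```python
import numpy as np

def quality_metrics(original: np.ndarray, reconstructed: np.ndarray):
    """MSE, PSNR and Maximum Difference (MD) between an original 8-bit image
    and its reconstruction."""
    o = original.astype(np.float64)
    r = reconstructed.astype(np.float64)
    mse = np.mean((o - r) ** 2)
    psnr = float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)
    md = np.max(np.abs(o - r))
    return mse, psnr, md

# Hypothetical usage with a synthetic image and a noisy "reconstruction"
img = np.random.randint(0, 256, (128, 128), dtype=np.uint8)
rec = np.clip(img + np.random.randint(-3, 4, img.shape), 0, 255).astype(np.uint8)
mse, psnr, md = quality_metrics(img, rec)
print(f"MSE={mse:.2f}, PSNR={psnr:.2f} dB, MD={md:.0f}")
```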

Publication Date
Sun Feb 24 2019
Journal Name
Iraqi Journal Of Physics
Adaptive inter frame compression using image segmented technique

The computer vision branch of the artificial intelligence field is concerned with developing algorithms for analyzing video image content. Extracting edge information is the essential process in most pictorial pattern recognition problems. A new edge detection technique has been introduced in this research for detecting boundaries.

The selection of typical lossy techniques for encoding edge video images is also discussed in this research. The concentration is devoted to discussing the Block-Truncation coding technique and the Discrete Cosine Transform (DCT) coding technique. In order to reduce the volume of pictorial data which one may need to store or transmit,
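As a concrete illustration of one of the two techniques named above, here is a minimal sketch of classic Block Truncation Coding on a single block, assuming NumPy; the block size and sample data are arbitrary, and the DCT variant is not shown.

```python
import numpy as np

def btc_block(block: np.ndarray):
    """Classic Block Truncation Coding of one square block: keep the block's
    mean and standard deviation plus a 1-bit-per-pixel bitmap."""
    n = block.size
    mean, std = block.mean(), block.std()
    bitmap = block >= mean
    q = int(bitmap.sum())
    if q in (0, n):                      # flat block: both levels equal the mean
        return bitmap, mean, mean
    low = mean - std * np.sqrt(q / (n - q))
    high = mean + std * np.sqrt((n - q) / q)
    return bitmap, low, high

def btc_decode(bitmap: np.ndarray, low: float, high: float) -> np.ndarray:
    """Reconstruct a block from its bitmap and two quantization levels."""
    return np.where(bitmap, high, low)

# Hypothetical usage on one 4x4 block of an 8-bit image
block = np.random.randint(0, 256, (4, 4)).astype(np.float64)
bitmap, low, high = btc_block(block)
reconstructed = btc_decode(bitmap, low, high)
```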

Publication Date
Sat Jan 01 2011
Journal Name
Trends In Network And Communications
Header Compression Scheme over Hybrid Satellite-WiMAX Network
