Compression-based Data Reduction Technique for IoT Sensor Networks

Energy saving is a central concern in IoT sensor networks because IoT sensor nodes operate on their own limited batteries. Data transmission in IoT sensor nodes is very costly and consumes much of the energy, while the energy used for data processing is considerably lower. Several energy-saving strategies and principles exist, mainly dedicated to reducing data transmission. Therefore, minimizing data transfers in IoT sensor networks can conserve a considerable amount of energy. In this research, a Compression-Based Data Reduction (CBDR) technique is suggested which works at the level of the IoT sensor nodes. CBDR includes two stages of compression: a lossy SAX quantization stage, which reduces the dynamic range of the sensor data readings, followed by a lossless LZW compression stage, which compresses the quantized output. Quantizing the sensor node readings down to the SAX alphabet size lowers their dynamic range, which in turn allows the LZW stage to achieve better compression sizes. A further improvement to CBDR is also suggested, Dynamic Transmission CBDR (DT-CBDR), which decreases both the total amount of data sent to the gateway and the processing required. The OMNeT++ simulator, together with real sensory data gathered at the Intel Lab, is used to evaluate the performance of the proposed technique. The simulation experiments illustrate that the proposed CBDR technique provides better performance than the other techniques in the literature.
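
A minimal sketch of the two stages described above, assuming an illustrative SAX alphabet of four symbols and a classic LZW coder (the actual alphabet size, windowing, and framing used by CBDR are not specified here):

import string
import statistics

# Illustrative SAX breakpoints for an alphabet of size 4 (standard normal quantiles).
SAX_BREAKPOINTS = [-0.6745, 0.0, 0.6745]
SAX_ALPHABET = string.ascii_lowercase

def sax_quantize(readings, breakpoints=SAX_BREAKPOINTS):
    """Lossy stage: z-normalise the readings and map each one to a SAX symbol."""
    mean = statistics.fmean(readings)
    std = statistics.pstdev(readings) or 1.0
    symbols = []
    for r in readings:
        z = (r - mean) / std
        idx = sum(z > b for b in breakpoints)  # index of the quantization bin
        symbols.append(SAX_ALPHABET[idx])
    return "".join(symbols)

def lzw_compress(text):
    """Lossless stage: classic LZW over the quantized symbol string."""
    dictionary = {ch: i for i, ch in enumerate(sorted(set(text)))}
    current, codes = "", []
    for ch in text:
        candidate = current + ch
        if candidate in dictionary:
            current = candidate
        else:
            codes.append(dictionary[current])
            dictionary[candidate] = len(dictionary)
            current = ch
    if current:
        codes.append(dictionary[current])
    return codes

readings = [22.1, 22.3, 22.2, 25.7, 25.9, 22.0, 22.1, 25.8]
codes = lzw_compress(sax_quantize(readings))  # short code list sent to the gateway
print(codes)

Because the quantized stream uses only a handful of symbols, repeated patterns are frequent, which is exactly what lets the LZW stage shrink the data further.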

Publication Date
Wed Jul 29 2020
Journal Name
Iraqi Journal Of Science
Fractal Image Compression Using Block Indexing Technique: A Review

Fractal image compression represents an image using affine transformations. The main concern for researchers in the discipline of fractal image compression (FIC) is to decrease the encoding time needed to compress image data. The basic premise is that each portion of the image is similar to other portions of the same image. Many models have been developed around this idea. Fractals were initially observed and handled using Iterated Function Systems (IFS), which are used for encoding images. In this paper, fractal image compression and its variants are reviewed along with other techniques. A summarized review of contributions is provided to determine the fulfillment of fractal ima…
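
As a toy illustration of the core FIC step the review refers to (not any specific algorithm surveyed in the paper), the sketch below matches one range block against candidate domain blocks under a contrast/brightness affine map; rotations, flips, and domain downsampling are omitted:

import numpy as np

def best_affine_match(range_block, domain_blocks):
    """For one range block, find the domain block and affine map s*D + o
    that minimise the squared error (the heart of fractal encoding)."""
    best = None
    r = range_block.ravel()
    for idx, dom in enumerate(domain_blocks):
        d = dom.ravel()
        # Least-squares contrast (s) and brightness (o) so that r ~ s*d + o.
        A = np.vstack([d, np.ones_like(d)]).T
        (s, o), *_ = np.linalg.lstsq(A, r, rcond=None)
        err = float(np.sum((s * d + o - r) ** 2))
        if best is None or err < best[0]:
            best = (err, idx, s, o)
    return best  # (error, domain index, contrast, brightness)

rng = np.random.default_rng(0)
range_block = rng.random((4, 4))
domain_blocks = [rng.random((4, 4)) for _ in range(2)]
print(best_affine_match(range_block, domain_blocks))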

Publication Date
Wed Mar 10 2021
Journal Name
Baghdad Science Journal
Merge Operation Effect On Image Compression Using Fractal Technique

Fractal image compression offers some desirable properties, such as fast image decoding and very good rate-distortion curves, but it suffers from a high encoding time. Fractal image compression requires partitioning the image into ranges. In this work, we introduce an improved partitioning process by means of a merge approach, since some ranges are connected to others. This paper presents a method to reduce the encoding time of this technique by reducing the number of range blocks, based on computing statistical measures between them. Experimental results on standard images show that the proposed method reduces the encoding time while keeping the visual quality acceptable.
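
The excerpt does not state which statistical measures are compared, so the sketch below is a hypothetical version of the merge idea: consecutive range blocks are fused when their means and variances are close, leaving fewer ranges to search during encoding.

import numpy as np

def merge_ranges(blocks, mean_tol=2.0, var_tol=4.0):
    """Greedy merge of consecutive range blocks with similar statistics
    (mean_tol and var_tol are illustrative thresholds)."""
    merged = [blocks[0]]
    for block in blocks[1:]:
        prev = merged[-1]
        if (abs(block.mean() - prev.mean()) < mean_tol and
                abs(block.var() - prev.var()) < var_tol):
            merged[-1] = np.concatenate([prev, block])  # treat as one larger range
        else:
            merged.append(block)
    return merged

rng = np.random.default_rng(1)
blocks = [rng.normal(loc=100, scale=3, size=16) for _ in range(6)]
print(len(blocks), "range blocks merged into", len(merge_ranges(blocks)))

Fewer range blocks means fewer domain searches, which is where the encoding-time saving comes from.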

Publication Date
Sat Dec 17 2022
Journal Name
Iraqi Journal Of Laser
Biosensing technique for detection of H. pylori bacteria: Israa M.L. SaQari*, Layla M.H. Al-ameri

Abstract: H. pylori is an important cause of gastroduodenal disease, including gastric ulcers, mucosa-associated lymphoid tissue (MALT) lymphoma, and gastric carcinoma. Biosensors are becoming an extensively studied discipline because easy, rapid, low-cost, highly sensitive, and highly selective biosensors contribute to advances in next-generation medicine, such as individualized medicine and ultrasensitive point-of-care detection of disease markers. Five of ten patients diagnosed with H. pylori, ranging in age from 15 to 85 and presenting with gastritis, duodenitis, duodenal ulcer (DU), or peptic ulcer (PU), participated in this research. Suspected H. pylori colonies w…

Publication Date
Thu Jun 20 2019
Journal Name
Baghdad Science Journal
An Optimised Method for Fetching and Transforming Survey Data based on SQL and R Programming Language

The development of information systems in recent years has contributed various methods of gathering information to evaluate IS performance. The most common approach used to collect information is the survey. This method, however, suffers from one major drawback: decision makers spend considerable time transferring data from survey sheets into analytical programs. As such, this paper proposes a method called the 'survey algorithm based on R programming language', or SABR, for transforming data from survey sheets inside R environments by treating the arrangement of data as a relational format. R and the relational data format provide an excellent opportunity to manage and analyse the accumulated data. Moreover, a survey syste…
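
SABR itself is written in R; as a language-neutral illustration of the underlying idea, treating survey sheets as relational (long-format) data so they can be queried and aggregated directly, the hypothetical sketch below uses pandas:

import pandas as pd

# Hypothetical survey sheet: one row per respondent, one column per question.
sheet = pd.DataFrame({
    "respondent": [1, 2, 3],
    "Q1": [5, 3, 4],
    "Q2": [2, 4, 5],
})

# Reshape into a relational (long) format: one row per (respondent, question, answer).
relational = sheet.melt(id_vars="respondent", var_name="question", value_name="answer")

# Once relational, analysis is a plain group-by (the analogue of SQL's GROUP BY).
print(relational.groupby("question")["answer"].mean())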

Publication Date
Wed Oct 17 2018
Journal Name
Journal Of Economics And Administrative Sciences
New Robust Estimation in Compound Exponential Weibull-Poisson Distribution for both contaminated and non-contaminated Data

Abstract

This research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
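
The four-parameter density of the compound exponential Weibull-Poisson distribution is not reproduced in this excerpt, so the sketch below only illustrates how the Downhill Simplex (Nelder-Mead) algorithm fits a parametric model by minimising a negative log-likelihood; a stand-in two-parameter Weibull model is used, and the paper's actual density would replace neg_log_likelihood:

import numpy as np
from scipy import stats
from scipy.optimize import minimize

data = stats.weibull_min.rvs(c=1.5, scale=2.0, size=200, random_state=42)

def neg_log_likelihood(params, x):
    # Stand-in objective; the compound exponential Weibull-Poisson pdf would go here.
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf  # keep the simplex inside the valid parameter region
    return -np.sum(stats.weibull_min.logpdf(x, c=shape, scale=scale))

# Nelder-Mead needs only function evaluations, no derivatives, which is why it
# copes well with awkward or contaminated likelihood surfaces.
result = minimize(neg_log_likelihood, x0=[1.0, 1.0], args=(data,), method="Nelder-Mead")
print(result.x)  # estimated (shape, scale)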

 

Publication Date
Fri Aug 01 2014
Journal Name
Journal Of Economics And Administrative Sciences
Efficiency Measurement Model for Postgraduate Programs and Undergraduate Programs by Using Data Envelopment Analysis

Measuring the efficiency of postgraduate and undergraduate programs is one of the essential elements in the educational process. In this study, the colleges of Baghdad University and data for the academic year 2011-2012 were chosen to measure the relative efficiencies of postgraduate and undergraduate programs in terms of their inputs and outputs. A relevant method for analysing these data is Data Envelopment Analysis (DEA). The effect of academic staff on the numbers of enrolled and graduated students in the postgraduate and undergraduate programs is the main focus of the study.
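
A minimal sketch of an input-oriented CCR DEA model solved as a linear program; the staff and student figures below are made-up placeholders, not the study's data, and the study's actual input/output choices may differ:

import numpy as np
from scipy.optimize import linprog

# Hypothetical data: input = academic staff, outputs = (enrolled, graduated) per college.
X = np.array([[30.0], [45.0], [25.0], [50.0]])                                # inputs, one row per DMU
Y = np.array([[300.0, 80.0], [400.0, 90.0], [260.0, 70.0], [350.0, 60.0]])    # outputs

def ccr_efficiency(k, X, Y):
    """Input-oriented CCR score of DMU k:
    minimise theta subject to X'lam <= theta*x_k, Y'lam >= y_k, lam >= 0."""
    n, m, s = X.shape[0], X.shape[1], Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                    # decision variables: [theta, lam_1..lam_n]
    A_in = np.hstack([-X[k].reshape(m, 1), X.T])   # X.T @ lam - theta*x_k <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])    # -Y.T @ lam <= -y_k
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[np.zeros(m), -Y[k]],
                  bounds=[(0, None)] * (1 + n))
    return res.fun                                 # efficiency score in (0, 1]

for k in range(len(X)):
    print(f"College {k}: efficiency = {ccr_efficiency(k, X, Y):.3f}")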

 

Publication Date
Tue May 01 2018
Journal Name
Sci.int.(lahore)
The Effect of Wavelet Coefficient Reduction on Image Compression Using DWT and Daubechies Wavelet Transform

F. G. Mohammed, H. M. Al-Dabbas, Science International, 2018.

Publication Date
Mon Dec 20 2021
Journal Name
Baghdad Science Journal
An Experimental Study of the Server-based Unfairness Solutions for the Cross-Protocol Scenario of Adaptive Streaming over HTTP/3 and HTTP/2

Since the introduction of HTTP/3, research has focused on evaluating its influence on existing adaptive streaming over HTTP (HAS). Within this research, owing to the different underlying transport protocols, the cross-protocol unfairness between HAS over HTTP/3 (HAS/3) and HAS over HTTP/2 (HAS/2) has drawn considerable attention. It has been found that HAS/3 clients tend to request higher bitrates than HAS/2 clients because the QUIC transport obtains more bandwidth for its HAS/3 clients than TCP does for its HAS/2 clients. As the problem originates at the transport layer, it is likely that server-based unfairness solutions can help the clients overcome it. Therefore, in this paper, an experimental study of the se…
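
A toy illustration of why an uneven bandwidth split turns into a bitrate gap (the bitrate ladder and the share values are illustrative assumptions, not measurements from the paper): a throughput-based adaptation rule simply picks the highest representation below the estimated throughput, so the client whose transport grabs more of the bottleneck requests higher quality.

# Bitrate ladder in kbps (illustrative, not the paper's test content).
LADDER = [500, 1200, 2500, 5000, 8000]

def pick_bitrate(throughput_kbps, ladder=LADDER, safety=0.9):
    """Throughput-based ABR: highest rung below a safety margin of the estimate."""
    affordable = [b for b in ladder if b <= safety * throughput_kbps]
    return affordable[-1] if affordable else ladder[0]

bottleneck = 10_000                    # kbps shared by one HAS/3 and one HAS/2 client
quic_share, tcp_share = 0.62, 0.38     # hypothetical uneven QUIC-vs-TCP split

print("HAS/3 requests", pick_bitrate(bottleneck * quic_share), "kbps")
print("HAS/2 requests", pick_bitrate(bottleneck * tcp_share), "kbps")

A server-based remedy would act on exactly this imbalance, for example by shaping each connection's share so both clients see comparable throughput estimates.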

Publication Date
Sun Dec 01 2013
Journal Name
Journal Of Accounting And Financial Studies ( Jafs )
Multi-pages structure for Six Sigma performance matrix based on the technical characteristics of balanced performance and methodology Al-Sigma for measuring corporate performance: Suggestion vision

Operating managements in contemporary business facilities are accelerating toward redefining the processes and strategies they need to perform tasks that guarantee their continuity in a performance environment dominated by economic globalization and conditions of uncertainty. They attempt to create a new structure, through multiple pages, that seeks to improve profitability and sustainable growth of performance in a climate focused on developing institutional processes, reducing costs, and achieving customer satisfaction by meeting constantly changing demands and expectations. The research presents a structural performance matrix that combines the Six Sigma methodology in order to improve customer satisfaction significantly bet…

Publication Date
Wed Aug 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of conditional logistic regression models with fixed and mixed effects for longitudinal data

Mixed-effects conditional logistic regression is evidently more effective for studying qualitative differences in longitudinal pollution data, as well as their implications for heterogeneous subgroups. This study shows that conditional logistic regression is a robust evaluation method for environmental studies, through the analysis of environmental pollution as a function of oil production and environmental factors. Consequently, it has been established theoretically that the primary objective of model selection in this research is to identify the candidate model that is optimal for the conditional design. The candidate model should achieve generalizability, goodness of fit, and parsimony, and establish an equilibrium between bias and variab…
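
As a minimal, hedged sketch of the fixed-effects side of that comparison, the snippet below fits a conditional logistic regression on made-up longitudinal data with statsmodels' ConditionalLogit, stratified by monitoring site; the mixed-effects variant discussed in the paper adds random effects that this snippet does not include.

import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

# Made-up longitudinal pollution data: repeated observations per monitoring site.
rng = np.random.default_rng(7)
n = 200
df = pd.DataFrame({
    "site": rng.integers(0, 10, n),              # grouping variable (strata)
    "oil_production": rng.normal(size=n),
    "temperature": rng.normal(size=n),
})
logit = 1.2 * df["oil_production"] - 0.5 * df["temperature"]
df["exceeds_limit"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Conditional (fixed-effects) logistic regression, stratified by site.
model = ConditionalLogit(df["exceeds_limit"],
                         df[["oil_production", "temperature"]],
                         groups=df["site"])
print(model.fit().summary())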
