In data transmission, a change in a single bit of the received data may lead to misunderstanding or even disaster. Every bit of the transmitted information matters, especially in fields such as the receiver's address, so detecting every single-bit change is a key issue in the data transmission field.
The ordinary single-parity method detects an odd number of bit errors efficiently but fails when the number of errors is even. Other detection methods, such as two-dimensional parity and checksum, give better results but still fail to cope with an increasing number of errors.
Two novel methods were proposed to detect bit-change errors when transmitting data over a noisy medium: the 2D-Checksum method and the Modified 2D-Checksum method. In the 2D-Checksum method, a summing process is applied to 7×7 patterns first in the row direction and then in the column direction, producing 8×8 patterns. In the modified method, an additional diagonal parity vector is appended, extending the pattern to 8×9. Combining the benefit of single parity (detecting an odd number of error bits) with the benefit of the checksum (reducing the effect of 4-bit errors) in a 2D arrangement improves the detection process. When samples of data were contaminated with up to 33% noise (flipping 0 to 1 and vice versa), the first method improved detection by approximately 50% compared with the traditional two-dimensional parity method, and the second novel method gave the best detection results.
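As context for the comparison above, the traditional two-dimensional parity baseline can be sketched in a few lines of Python. This is a minimal illustration only (the function names and the even-parity convention are assumptions, not taken from the paper): a 7×7 bit block is extended to 8×8 by appending a parity bit to each row and then a parity row over the columns, and a received block is checked by verifying that every row and column still has even parity.

```python
def add_2d_parity(block):
    """Extend a 7x7 bit block to 8x8: append an even-parity bit to each
    row, then append an even-parity row computed over the columns."""
    rows = [row + [sum(row) % 2] for row in block]      # row parity bits
    parity_row = [sum(col) % 2 for col in zip(*rows)]   # column parity bits
    return rows + [parity_row]

def check_2d_parity(block8):
    """Return True if every row and every column has even parity."""
    rows_ok = all(sum(row) % 2 == 0 for row in block8)
    cols_ok = all(sum(col) % 2 == 0 for col in zip(*block8))
    return rows_ok and cols_ok

# Example: a clean block passes; a single flipped bit is detected.
data = [[0] * 7 for _ in range(7)]
coded = add_2d_parity(data)
assert check_2d_parity(coded)       # clean block passes
coded[2][3] ^= 1                    # single-bit error
assert not check_2d_parity(coded)   # detected
```

An even number of errors arranged on the corners of a rectangle defeats this scheme, which is exactly the weakness the 2D-Checksum variants described above aim to reduce.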
Abstract
The non-homogeneous Poisson process is one of the statistical subjects of importance to other sciences, with wide application in different areas such as queueing systems, repairable systems, computer and communication systems, reliability theory, and many others. It is also used to model phenomena that occur in a non-constant way over time (events whose rate changes with time).
This research deals with some of the basic concepts related to the non-homogeneous Poisson process. It studies two models of the non-homogeneous Poisson process, the power-law model and the Musa–Okumoto model, to estimate th…
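A non-homogeneous Poisson process with a given intensity function can be simulated by the standard thinning (acceptance–rejection) method. The sketch below is not the paper's procedure; it assumes the common power-law intensity λ(t) = (β/θ)(t/θ)^(β−1), and the parameter values β = 2, θ = 10 are purely illustrative:

```python
import random

def simulate_nhpp(intensity, t_max, lam_max, rng=random.Random(42)):
    """Simulate event times of a non-homogeneous Poisson process on
    (0, t_max] by thinning a homogeneous process of rate lam_max,
    where lam_max bounds intensity(t) on the interval."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(lam_max)           # candidate inter-arrival
        if t > t_max:
            return times
        if rng.random() < intensity(t) / lam_max:
            times.append(t)                     # keep with prob intensity(t)/lam_max

# Power-law intensity (illustrative parameters beta=2, theta=10).
beta, theta = 2.0, 10.0
lam = lambda t: (beta / theta) * (t / theta) ** (beta - 1)
events = simulate_nhpp(lam, t_max=20.0, lam_max=lam(20.0))
# E[N(20)] = (20/theta)**beta = 4; any single run varies around this.
```

Because β > 1 here, the intensity is increasing, so λ(t_max) is a valid upper bound for the thinning step.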
This study investigated the feasibility of using crushed-glass solid waste in water filtration using a pilot plant constructed in the Al-Wathba water treatment plant in Baghdad. Different depths and different grain sizes of crushed glass were used as mono and dual media with sand and porcelanite in the filtration process. The mathematical model of Tufenkji and Elimelech was used to evaluate the initial collection efficiency η of these filters. The results indicated that the collection efficiency varied inversely with the filtration rate. For the mono-media filters, the theoretical values ηth were greater than the practical values ηprac calculated from the experimental work. In the glass filter, ηprac was obtained by multiplying ηth by a factor…
Today, large amounts of geospatial data are available on the web, such as Google Map (GM), OpenStreetMap (OSM), the Flickr service, Wikimapia, and others; all of these services are called open-source geospatial data. Geospatial data from different sources often have variable accuracy due to different data-collection methods, so the data accuracy may not meet the user requirements of every organization. This paper aims to develop a tool to assess the quality of GM data by comparing it with formal data, such as spatial data from the Mayoralty of Baghdad (MB). The tool was developed in the Visual Basic language and validated on two different study areas in Baghdad, Iraq (Al-Karada and Al-Kadhumiyah). The positional accuracy was assessed…
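The paper's tool is written in Visual Basic; as a language-neutral sketch, one common positional-accuracy measure for this kind of comparison is the root-mean-square error of the horizontal offsets between matched point pairs. The coordinates below are hypothetical, and the use of RMSE here is an assumption, not a statement of the authors' exact metric:

```python
import math

def positional_rmse(test_pts, ref_pts):
    """Root-mean-square error of horizontal offsets between matched
    point pairs (test vs. reference coordinates, same projected units)."""
    assert len(test_pts) == len(ref_pts) and test_pts
    sq = [(tx - rx) ** 2 + (ty - ry) ** 2
          for (tx, ty), (rx, ry) in zip(test_pts, ref_pts)]
    return math.sqrt(sum(sq) / len(sq))

# Hypothetical matched points (metres, projected coordinates):
gm  = [(100.0, 200.0), (150.0, 250.0), (300.0, 120.0)]
ref = [(101.0, 200.0), (150.0, 252.0), (303.0, 124.0)]
print(round(positional_rmse(gm, ref), 3))   # prints 3.162
```

The matched pairs would come from identifying the same well-defined features (e.g., road intersections) in both the GM data and the MB reference data.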
The investigation of machine-learning techniques for addressing missing well-log data has garnered considerable interest recently, especially as the oil and gas sector pursues novel approaches to improve data interpretation and reservoir characterization. For wells that have been in operation for several years, conventional measurement techniques frequently encounter challenges related to availability, including missing well-log data, cost considerations, and precision issues. This study's objective is to enhance reservoir characterization by automating well-log creation using machine-learning techniques, among them multi-resolution graph-based clustering and the similarity-threshold method. By using cutti…
In high-dimensional semiparametric regression, balancing accuracy and interpretability often requires combining dimension reduction with variable selection. This study introduces two novel methods for dimension reduction in additive partial linear models: (i) minimum average variance estimation (MAVE) combined with the adaptive least absolute shrinkage and selection operator (MAVE-ALASSO) and (ii) MAVE with smoothly clipped absolute deviation (MAVE-SCAD). These methods leverage the flexibility of MAVE for sufficient dimension reduction while incorporating adaptive penalties to ensure sparse and interpretable models. The performance of both methods is evaluated through simulations using the mean squared error and variable-selection cri…
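For readers unfamiliar with the SCAD penalty named above, the standard form from Fan and Li (2001) can be written down directly; this is the textbook penalty, not the authors' code, and the conventional choice a = 3.7 is used as the default:

```python
def scad_penalty(theta, lam, a=3.7):
    """SCAD penalty (Fan and Li, 2001) applied to a coefficient magnitude.
    Linear like the lasso near zero, then tapering, then constant, so large
    coefficients are not over-shrunk."""
    t = abs(theta)
    if t <= lam:
        return lam * t                                        # lasso-like region
    if t <= a * lam:
        return (2 * a * lam * t - t ** 2 - lam ** 2) / (2 * (a - 1))
    return lam ** 2 * (a + 1) / 2                             # constant region

# The penalty is continuous at both knots t = lam and t = a*lam:
lam = 0.5
assert abs(scad_penalty(lam, lam) - lam ** 2) < 1e-12
assert abs(scad_penalty(3.7 * lam, lam) - lam ** 2 * 4.7 / 2) < 1e-12
```

The adaptive lasso penalty used by MAVE-ALASSO is simpler still: a weighted L1 term λ·w_j·|θ_j|, with weights w_j typically taken from an initial consistent estimate.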
... Show MoreSome degree of noise is always present in any electronic device that
transmits or receives a signal . For televisions, this signal i has been to s the
broadcast data transmitted over cable-or received at the antenna; for digital
cameras, the signal is the light which hits the camera sensor. At any case, noise
is unavoidable. In this paper, an electronic noise has been generate on
TV-satellite images by using variable resistors connected to the transmitting cable
. The contrast of edges has been determined. This method has been applied by
capturing images from TV-satellite images (Al-arabiya channel) channel with
different resistors. The results show that when increasing resistance always
produced higher noise f
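The paper injects noise in hardware via variable resistors; the relationship between noise level and a simple edge-contrast measure can nevertheless be sketched in software. Everything below is a hypothetical simulation (Gaussian noise standing in for the resistor-induced noise, and mean absolute horizontal gradient standing in for the paper's contrast measure):

```python
import random

def add_noise(img, sigma, rng=random.Random(0)):
    """Add zero-mean Gaussian noise (std sigma) to a grayscale image stored
    as a list of rows of floats in [0, 255], clipping to the valid range."""
    return [[min(255.0, max(0.0, p + rng.gauss(0.0, sigma))) for p in row]
            for row in img]

def edge_contrast(img):
    """Mean absolute horizontal gradient: a crude proxy for edge contrast."""
    diffs = [abs(row[i + 1] - row[i])
             for row in img for i in range(len(row) - 1)]
    return sum(diffs) / len(diffs)

# Synthetic test image: left half dark, right half bright (one vertical edge).
img = [[0.0] * 8 + [200.0] * 8 for _ in range(16)]
clean = edge_contrast(img)
noisy = edge_contrast(add_noise(img, sigma=25.0))
# Noise adds spurious gradients everywhere, so the measure rises: noisy > clean.
```

On a real broadcast image the effect is the same in kind: stronger noise inflates gradient-based measures away from true edges, which is why edge contrast is a usable indicator of the injected noise level.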
Blockchain technology relies on cryptographic techniques that provide various advantages, such as trustworthiness, collaboration, organization, identification, integrity, and transparency. Meanwhile, data analytics refers to the process of utilizing techniques to analyze big data and comprehend the relationships between data points to draw meaningful conclusions. The field of data analytics in Blockchain is relatively new, and few studies have been conducted to examine the challenges involved in Blockchain data analytics. This article presents a systematic analysis of how data analytics affects Blockchain performance, with the aim of investigating the current state of Blockchain-based data-analytics techniques in research fields and…