In data transmission, a change in a single bit of the received data may lead to misunderstanding or even disaster. Every bit in the transmitted information is important, especially bits that carry information such as the receiver's address. Detecting every single-bit change is therefore a key issue in the data transmission field.
The ordinary single-parity detection method detects an odd number of bit errors efficiently but fails when the number of errors is even. Other detection methods, such as two-dimensional parity and checksum, gave better results but still failed to cope with an increasing number of errors.
Two novel methods are suggested for detecting bit-change errors when transmitting data over a noisy medium: the 2D-Checksum method and the Modified 2D-Checksum method. In the 2D-Checksum method, a summing process is applied to 7×7 patterns in the row direction and then in the column direction, producing 8×8 patterns. In the modified method, an additional diagonal parity vector is appended to the pattern, producing an 8×9 pattern. By combining the benefit of single parity (detecting an odd number of error bits) with the benefit of the checksum (reducing the effect of 4-bit errors) in a two-dimensional arrangement, the detection process is improved. When samples of data were contaminated with up to 33% noise (bits flipped from 0 to 1 and vice versa), detection with the first method improved by approximately 50% compared with the ordinary two-dimensional parity method, and the second novel method gave the best detection results.
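A minimal sketch of the two constructions described above, written in Python and assuming modulo-2 (parity) sums; the exact summing arithmetic and the definition of the diagonal vector are not given in the abstract, so the helper names and the wrap-around diagonal below are illustrative assumptions.

def encode_2d(block):
    # block: 7x7 list of rows of 0/1 bits; append a parity bit to each row and
    # a column-parity row, giving the 8x8 pattern of the first method.
    rows = [row + [sum(row) % 2] for row in block]
    col_check = [sum(r[j] for r in rows) % 2 for j in range(len(rows[0]))]
    return rows + [col_check]

def encode_2d_diag(block):
    # Modified method (assumed form): append one extra column of wrap-around
    # diagonal parities to the 8x8 pattern, giving an 8x9 pattern.
    enc = encode_2d(block)
    n = len(enc)
    diag = [sum(enc[i][(i + k) % n] for i in range(n)) % 2 for k in range(n)]
    return [enc[k] + [diag[k]] for k in range(n)]

def error_detected(received, encoder):
    # Re-encode the received 7x7 data part and flag any mismatch in the check bits.
    data = [row[:7] for row in received[:7]]
    return encoder(data) != received

A received block passes only if every row, column (and, in the modified method, diagonal) check bit recomputes to the same value, which is what lets the scheme catch error patterns that defeat plain single parity.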
The ground-state charge, neutron, proton, and matter densities, the associated nuclear radii, and the binding energy per nucleon of the 8B, 17Ne, 23Al and 27P halo nuclei have been investigated using the Skyrme–Hartree–Fock (SHF) model with the new SKxs25 parameters. According to the calculated results, the SHF model with these Skyrme parameters provides a good description of the nuclear structure of the above proton-rich halo nuclei. The elastic charge form factors of the 8B and 17Ne halo nuclei and of their stable isotopes 10B and 20Ne are calculated in the plane-wave Born approximation, using the charge density distributions obtained from the SHF model, to investigate the effect of the extended charge distributions of these proton-rich nuclei.
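For context, the plane-wave Born approximation relates the elastic charge form factor to the Fourier–Bessel transform of the ground-state charge density; the standard spherically symmetric expressions (not taken from the paper, and normalized so that the charge density integrates to Z) are

\[
F_{\mathrm{ch}}(q) = \frac{4\pi}{Z}\int_0^{\infty} \rho_{\mathrm{ch}}(r)\, j_0(qr)\, r^{2}\, dr ,
\qquad j_0(qr) = \frac{\sin(qr)}{qr} ,
\qquad \langle r^{2} \rangle_{\mathrm{ch}} = \frac{4\pi}{Z}\int_0^{\infty} \rho_{\mathrm{ch}}(r)\, r^{4}\, dr ,
\]

where q is the momentum transfer and the last relation gives the mean-square charge radius.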
The present study aimed to assess four soil moisture sensors, two capacitive (WH51 and SKU:SEN0193) and two resistive (YL69 and IC Station), ranging from affordable to medium-priced, for their accuracy in six common soil types of the central region of Iraq. The readings of the soil moisture sensors were calibrated using two gravimetric methods: the first relied on the protocols' database, while the second was the traditional calibration method. The second method recorded the lower analysis error compared with the first. The moderate-cost sensor WH51 showed the lowest standard error (SE), MAD, and RMSE and the highest R² in both methods. The performance accuracy of WH51 was close to the readings reported by the manufacturer.
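The error statistics named above can be computed directly from paired sensor and gravimetric readings; a minimal sketch in Python (the array names and the residual-based definition of SE are assumptions, not taken from the study):

import numpy as np

def calibration_metrics(sensor, reference):
    # sensor: sensor-derived volumetric moisture; reference: gravimetric moisture.
    sensor = np.asarray(sensor, dtype=float)
    reference = np.asarray(reference, dtype=float)
    resid = sensor - reference
    se = resid.std(ddof=1) / np.sqrt(resid.size)       # standard error of the mean residual
    mad = np.mean(np.abs(resid))                       # mean absolute deviation
    rmse = np.sqrt(np.mean(resid ** 2))                # root mean square error
    r2 = 1.0 - np.sum(resid ** 2) / np.sum((reference - reference.mean()) ** 2)
    return se, mad, rmse, r2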
The research problem lies in the lack of accurate scientific perceptions of the reality of the communicator and the factors influencing his work. The research aims to introduce the communicator in the university press, clarify the obstacles facing him, and uncover his level of job satisfaction and his visions for developing his work. The researcher adopted the survey method in collecting, analysing, and interpreting the data by means of a questionnaire. A set of results and conclusions was reached, the most important of which are:
* The communicator performs multiple tasks, including writing, editing, and collecting information.
The research aimed to test the relationship between the size of investment allocations in the agricultural sector in Iraq and their determinants using the Ordinary Least Squares (OLS) method compared with the Error Correction Model (ECM) approach. Time series data for the period 1990 to 2021 were utilized. The analysis showed that the estimates obtained using the ECM were more accurate and more significant than those obtained using the OLS method. Johansen's test indicated the presence of a long-term equilibrium relationship between the size of investment allocations and their determinants. The results of th…
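A minimal sketch of the comparison described above, assuming Python with statsmodels; the variable set, lag order, and the single-equation (Engle–Granger style) form of the ECM are illustrative assumptions rather than the paper's exact specification:

import pandas as pd
import statsmodels.api as sm
from statsmodels.tsa.vector_ar.vecm import coint_johansen

def johansen_trace(df, cols, lags=1):
    # Trace statistics and critical values for the cointegration rank test.
    res = coint_johansen(df[cols], det_order=0, k_ar_diff=lags)
    return res.lr1, res.cvt

def ecm_two_step(df, y_col, x_cols):
    # Step 1: long-run (levels) OLS regression; its residuals measure disequilibrium.
    long_run = sm.OLS(df[y_col], sm.add_constant(df[x_cols])).fit()
    ect = long_run.resid.shift(1).rename("ect")        # lagged error-correction term
    # Step 2: short-run regression in first differences plus the lagged ECT.
    rhs = sm.add_constant(pd.concat([df[x_cols].diff(), ect], axis=1)).dropna()
    return sm.OLS(df[y_col].diff().loc[rhs.index], rhs).fit()

The plain OLS benchmark corresponds to fitting only the levels regression in step 1, while the ECM adds the short-run dynamics and the error-correction term whose coefficient reflects the long-run equilibrium relationship.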
Business organizations have faced many challenges in recent times, the most important of which is information technology, because it is widespread and easy to use. Its use has led to an increase in the amount of data that business organizations deal with, to an unprecedented degree. The amount of data available through the internet, and the fact that it accumulates in such huge and unstructured quantities, is a problem for which many parties seek solutions. Forecasts suggested that by 2017 the number of devices connected to the internet would be about three times the population of the Earth, and in 2015 more than one and a half billion gigabytes of data were transferred every minute globally. Thus, so-called data mining emerged as a…
Permeability data are of major importance in all reservoir simulation studies. Their importance increases in mature oil and gas fields because some specific improved recovery requirements are highly sensitive to permeability. However, the industry has a huge stock of air permeability measurements against only a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation to convert the air permeability data that are conventionally measured during laboratory core analysis into liquid permeability. This correlation provides a feasible estimate in cases of data loss and poorly consolidated formations, or in cases…
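The abstract does not reproduce the correlation itself; as an illustration only, one common way to build such a conversion is a power-law fit, k_liq = a * k_air**b, estimated by linear regression in log-log space (the function names and functional form below are assumptions, not the study's correlation):

import numpy as np

def fit_air_to_liquid(k_air, k_liq):
    # Fit log10(k_liq) = log10(a) + b*log10(k_air) to paired core measurements.
    b, log_a = np.polyfit(np.log10(k_air), np.log10(k_liq), 1)
    return 10.0 ** log_a, b

def predict_liquid(k_air, a, b):
    # Apply the fitted correlation to routine air permeability data.
    return a * np.asarray(k_air, dtype=float) ** b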
The aim of this study is to estimate the parameters and the reliability function of the Kumaraswamy distribution with two positive parameters (a, b > 0), a continuous probability distribution that shares many characteristics with the beta distribution while offering extra advantages.
The shape of the distribution's density function and its most important characteristics are described, and the two parameters (a, b) and the reliability function are estimated using the maximum likelihood (MLE) and Bayes methods. Simulation experiments are conducted to examine the behaviour of the estimation methods for different sample sizes, based on the mean squared error criterion; the results show that the Bayes method performs better.
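A minimal sketch of the maximum-likelihood step, assuming Python with SciPy; the Kumaraswamy density is f(x; a, b) = a*b*x**(a-1)*(1 - x**a)**(b-1) on (0, 1) and the reliability function is R(t) = (1 - t**a)**b, while the starting values below are illustrative:

import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, x):
    # Negative log-likelihood of the Kumaraswamy distribution.
    a, b = params
    if a <= 0 or b <= 0:
        return np.inf
    return -np.sum(np.log(a) + np.log(b) + (a - 1) * np.log(x)
                   + (b - 1) * np.log1p(-x ** a))

def kumaraswamy_mle(x, start=(1.0, 1.0)):
    # Minimize the negative log-likelihood over (a, b) for data x in (0, 1).
    res = minimize(neg_loglik, start, args=(np.asarray(x, dtype=float),),
                   method="Nelder-Mead")
    return res.x

def reliability(t, a, b):
    # Kumaraswamy reliability (survival) function.
    return (1.0 - t ** a) ** b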
The expansion of water-project implementation in Turkey and Syria has become of great concern to those working in the field of water resources management in Iraq. Such expansion, in the absence of an agreement among the three riparian countries of the Tigris and Euphrates Rivers (Turkey, Syria, and Iraq), is expected to lead to a substantial reduction of the water inflow into the territory of Iraq. Accordingly, this study consists of two parts. The first part studies the changes in the water inflow into the territory of Iraq, at the Turkish and Syrian borders, from 1953 to 2009; the results indicated that the annual mean inflow of the Tigris River decreased from 677 m3/sec to 526 m3/sec after the operation of the Turkish reservoirs…
Since the beginning of the last century, competition for water resources has intensified dramatically, especially between countries that have no agreements in place for the water resources they share. Such is the situation with the Euphrates River, which flows through three countries (Turkey, Syria, and Iraq) and represents the main water resource for them. The comprehensive hydrologic investigation needed to derive optimal operations therefore requires reliable forecasts. This study aims to analyse the recorded inflow data of the Ataturk reservoir for the period Oct. 1961 to Sep. 2009 and to create a forecasting model for data generation from the Turkish perspective. Based on 49 years of real inflow data…
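The excerpt does not name the forecasting technique, so the following is only an illustrative sketch of one standard choice for monthly inflow records, a seasonal ARIMA model fitted with statsmodels; the model orders are assumptions:

from statsmodels.tsa.statespace.sarimax import SARIMAX

def fit_inflow_model(inflow, order=(1, 0, 1), seasonal_order=(1, 0, 1, 12)):
    # inflow: a monthly inflow series (e.g. a pandas Series with a datetime index).
    return SARIMAX(inflow, order=order, seasonal_order=seasonal_order).fit(disp=False)

def forecast_inflow(fitted, steps=12):
    # Forecast (or generate) the next `steps` months of inflow from the fitted model.
    return fitted.get_forecast(steps=steps).predicted_mean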
In recent years, data centre (DC) networks have improved their ability to exchange data rapidly. Software-defined networking (SDN) has been introduced to change the conventional view of networks by separating the control plane from the data plane. SDN overcomes the limitations of traditional DC networks caused by the rapidly increasing numbers of applications, websites, and data storage needs. Software-defined networking data centres (SDN-DC) based on the OpenFlow (OF) protocol are used to achieve superior behaviour when executing traffic load-balancing (LB) jobs. The LB function divides the traffic-flow demands between the end devices to avoid link congestion. In short, SDN is proposed to manage more operative configurations…
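The excerpt describes the load-balancing function only in general terms; as an illustration (not the paper's algorithm), a controller could assign each new flow to the candidate path whose most loaded link has the lowest current utilisation:

def pick_path(candidate_paths, link_load):
    # candidate_paths: list of paths, each a list of link identifiers.
    # link_load: dict mapping link identifier -> current utilisation in [0, 1].
    def bottleneck(path):
        return max(link_load.get(link, 0.0) for link in path)
    return min(candidate_paths, key=bottleneck)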