A Modified 2D-Checksum Error Detecting Method for Data Transmission in Noisy Media

In data transmission, a change in a single bit of the received data may lead to a misunderstanding or a disaster. Every bit of the transmitted information carries high priority, especially information such as the receiver's address. Detecting an error on every single bit change is therefore a key issue in the data transmission field.
The ordinary single-parity method can detect an odd number of errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, showed better results but still failed to cope with an increasing number of errors.
Two novel methods were suggested to detect binary bit-change errors when transmitting data in a noisy medium: the 2D-Checksum method and the Modified 2D-Checksum method. In the 2D-Checksum method, a summing process is applied to 7×7 patterns in the row direction and then in the column direction, producing 8×8 patterns. In the modified method, an additional diagonal parity vector is appended, extending the pattern to 8×9. By combining the benefit of single parity (detecting an odd number of error bits) with the benefit of the checksum (reducing the effect of 4-bit errors) in a 2D arrangement, the detection process was improved. When samples of data were contaminated with up to 33% noise (bits flipped from 0 to 1 and vice versa), the first method improved detection by approximately 50% compared with the traditional two-dimensional parity method, and the second novel method gave the best detection results.
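To make the row/column summing idea concrete, here is a minimal sketch of a 2D parity check over a 7×7 bit block, assuming one parity bit per row and a final parity row (the paper's exact summing rule for the 2D-Checksum, and the diagonal vector of the modified method, may differ):

```python
# Minimal 2D parity sketch: a 7x7 bit block is extended to 8x8 by a parity
# bit per row and a final parity row; any single flipped bit breaks the
# parity of its row and its column and is therefore detected.

def encode_2d(block):
    """Extend a 7x7 block of bits to an 8x8 pattern with even parity."""
    rows = [row + [sum(row) % 2] for row in block]       # row parity bits
    parity_row = [sum(col) % 2 for col in zip(*rows)]    # column parity row
    return rows + [parity_row]

def check_2d(pattern):
    """True iff every row and every column has even parity."""
    rows_ok = all(sum(row) % 2 == 0 for row in pattern)
    cols_ok = all(sum(col) % 2 == 0 for col in zip(*pattern))
    return rows_ok and cols_ok

block = [[(i * j) % 2 for j in range(7)] for i in range(7)]
sent = encode_2d(block)
assert check_2d(sent)            # clean pattern passes
sent[2][4] ^= 1                  # simulate a single-bit channel error
print(check_2d(sent))            # False: the error is detected
```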

Publication Date
Fri Jun 01 2007
Journal Name
Al-khwarizmi Engineering Journal
Reduction of the error in the hardware neural network

Specialized hardware implementations of Artificial Neural Networks (ANNs) can offer faster execution than general-purpose microprocessors by taking advantage of reusable modules, parallel processes, and specialized computational components. Modern high-density Field Programmable Gate Arrays (FPGAs) offer the required flexibility and fast design-to-implementation time, with the possibility of exploiting highly parallel computations like those required by ANNs in hardware. However, the bounded width of the data in FPGA ANNs adds an additional error to the output. This paper derives equations for the additional error generated by the bounded data width and proposes a method to reduce the effect of this error…

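As a toy illustration of where such an error comes from, the following sketch (not the paper's derivation) rounds weights to a fixed-point format with a limited number of fractional bits and compares the neuron output against full precision:

```python
# Hypothetical fixed-point illustration of the bounded-width error; the
# format (4 fractional bits) and the values are placeholders, not the paper's.

def quantize(x, frac_bits):
    """Round x to the nearest value representable with frac_bits fraction bits."""
    scale = 1 << frac_bits
    return round(x * scale) / scale

weights = [0.3141, -0.2718, 0.5772]
inputs  = [1.0, 0.5, -0.25]

exact   = sum(w * v for w, v in zip(weights, inputs))
bounded = sum(quantize(w, 4) * v for w, v in zip(weights, inputs))
print(abs(exact - bounded))      # the additional error at the neuron output
```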
Publication Date
Mon Jun 19 2023
Journal Name
Journal Of Engineering
A Multi-variables Multi-sites Model for Forecasting Hydrological Data Series

A multivariate multisite hydrological data forecasting model was derived and checked using a case study. The philosophy is to use the cross-variable correlations, the cross-site correlations, and the time-lag correlations simultaneously. The case study involves two variables at three sites: the variables are monthly rainfall and evaporation; the sites are Sulaimania, Dokan, and Darbandikhan. The model form is similar to the first-order autoregressive model, but in matrix form. A matrix of the different relative correlations mentioned above and another of their relative residuals were derived and used as the model parameters. A mathematical filter was applied to both matrices to obtain their elements. The application of this model indicates…

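A minimal sketch of such a matrix-form first-order autoregressive recurrence, x_t = A·x_{t-1} + B·e_t, where the state stacks both variables at the three sites (the matrices below are placeholders, not the correlation and residual matrices the paper derives):

```python
import numpy as np

# Matrix AR(1) sketch: x_t = A @ x_(t-1) + B @ e_t, with x stacking
# 2 variables x 3 sites = 6 components. A carries cross-variable,
# cross-site, and lag correlations; B scales the residuals. Values
# below are placeholders for illustration only.

rng = np.random.default_rng(0)
n = 6
A = 0.4 * np.eye(n) + 0.05 * rng.random((n, n))
B = 0.1 * np.eye(n)

x = np.zeros(n)
forecast = []
for _ in range(12):                       # generate 12 monthly steps
    x = A @ x + B @ rng.standard_normal(n)
    forecast.append(x.copy())
```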
Publication Date
Sun Mar 15 2020
Journal Name
Journal Of The College Of Education For Women
Data-Driven Approach for Teaching Arabic as a Foreign Language: Egypt

Corpus linguistics is a methodology for studying language through corpus-based research. It differs from the traditional approach to studying a language (the prescriptive approach) in its insistence on the systematic study of authentic examples of language in use (the descriptive approach). A "corpus" is a large body of machine-readable, structurally collected, naturally occurring linguistic data, either written texts or transcriptions of recorded speech, which can be used as a starting point for linguistic description or as a means of verifying hypotheses about a language. In the past decade, interest has grown tremendously in the use of language corpora for language education. The ways in which corpora have been employed in language pedagogy…

Publication Date
Tue Mar 01 2022
Journal Name
International Journal Of Nonlinear Analysis And Applications
The suggested threshold to reduce data noise for a factorial experiment

In this research, a 4×4 factorial experiment, applied in a completely randomized block design with a given number of observations, was studied. The design of experiments is used to study the effect of treatments on experimental units and thus to obtain data representing the experiment's observations. Applying these treatments under different environmental and experimental conditions causes noise that affects the observation values and thus increases the mean square error of the experiment. To reduce this noise, multiple wavelet shrinkage was used as a filter for the observations, by suggesting an improved threshold that takes into account the different transformation levels based on the logarithm of the…

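As a point of reference, here is a generic wavelet-shrinkage sketch using the universal threshold; the paper's improved, level-dependent threshold (based on the logarithm mentioned above) is not reproduced:

```python
import numpy as np
import pywt

# Generic wavelet denoising: decompose, soft-threshold the detail
# coefficients, reconstruct. The universal threshold below is a
# stand-in for the paper's level-dependent proposal.

rng = np.random.default_rng(1)
clean = np.sin(np.linspace(0, 4 * np.pi, 256))
noisy = clean + 0.3 * rng.standard_normal(256)

coeffs = pywt.wavedec(noisy, 'db4', level=3)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise estimate
thr = sigma * np.sqrt(2 * np.log(len(noisy)))        # universal threshold
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode='soft')
                        for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, 'db4')
```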
Publication Date
Sun Jun 12 2011
Journal Name
Baghdad Science Journal
An algorithm for binary codebook design based on the average bitmap replacement error (ABPRE)

In this paper, an algorithm for binary codebook design is used in a vector quantization technique to improve the acceptability of the absolute moment block truncation coding (AMBTC) method. Vector quantization (VQ) is used to compress the bitmap (the output of the first method, AMBTC). The binary codebook can be generated for many images by randomly choosing the code vectors from a set of binary image vectors, and this codebook is then used to compress all the bitmaps of these images. The bitmap of an image is matched for compression with this codebook based on the criterion of the average bitmap replacement error (ABPRE). This approach is suitable for reducing bit rates…

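A minimal sketch of the replacement step, assuming each AMBTC bitmap is a flattened 4×4 binary block and each bitmap is replaced by the codeword with the fewest differing bits (a Hamming-distance reading of the replacement-error idea; the paper's exact codebook training and ABPRE computation are not reproduced):

```python
import numpy as np

# Binary VQ for AMBTC bitmaps: each bitmap is replaced by the nearest
# codeword in Hamming distance; the printed figure is the average
# fraction of replaced (mismatched) bits per bitmap.

rng = np.random.default_rng(2)
bitmaps  = rng.integers(0, 2, size=(100, 16))               # 100 4x4 bitmaps
codebook = bitmaps[rng.choice(100, size=8, replace=False)]  # random codewords

def nearest(bitmap, codebook):
    """Index of the codeword with minimum Hamming distance to bitmap."""
    return int(np.argmin((codebook != bitmap).sum(axis=1)))

errors = [(codebook[nearest(b, codebook)] != b).mean() for b in bitmaps]
print(np.mean(errors))   # average replacement error (fraction of bits)
```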
Publication Date
Wed Apr 01 2020
Journal Name
Plant Archives
Land cover change detection using satellite images based on modified spectral angle mapper method

This research relies on the relationship between the reflected spectrum, the nature of each target, its area, and the percentage of its presence alongside other targets within the target area. Changes in land cover were detected across different years using satellite images processed with the Modified Spectral Angle Mapper (MSAM), where Landsat satellite images were handled with two software packages (MATLAB 7.11 and ERDAS Imagine 2014). The proposed supervised classification method (MSAM), implemented as a MATLAB program, together with a supervised classification method (maximum likelihood classifier) in ERDAS Imagine, was used to obtain the most precise results and detect environmental changes over the periods. Despite using two classification…

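For orientation, the core rule of a Spectral Angle Mapper classifier assigns each pixel to the reference spectrum with the smallest spectral angle; the sketch below shows only this baseline rule (the paper's modifications to SAM, and its band set, are not reproduced, and the spectra are hypothetical):

```python
import numpy as np

# Baseline SAM rule: angle between the pixel spectrum and each class
# reference spectrum; the smallest angle wins. Spectra are hypothetical
# 4-band reflectance vectors.

def spectral_angle(pixel, reference):
    cos_t = np.dot(pixel, reference) / (
        np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos_t, -1.0, 1.0))

references = {
    "water":      np.array([0.06, 0.05, 0.03, 0.01]),
    "vegetation": np.array([0.04, 0.08, 0.05, 0.45]),
    "bare soil":  np.array([0.12, 0.16, 0.20, 0.28]),
}
pixel = np.array([0.05, 0.09, 0.06, 0.40])

label = min(references, key=lambda k: spectral_angle(pixel, references[k]))
print(label)   # -> "vegetation"
```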
Publication Date
Thu Sep 25 2025
Journal Name
Journal Of Al-qadisiyah For Computer Science And Mathematics
Modified LASS Method Suggestion as an Additional Penalty on Principal Components Estimation - with Application

This research deals with a shrinkage method for principal components similar to the one used in multiple regression, "Least Absolute Shrinkage and Selection: LASS". The goal is to form uncorrelated linear combinations from only a subset of the explanatory variables, which may suffer from a multicollinearity problem, instead of taking all (K) of them. This shrinkage forces some coefficients to equal zero after restricting them with a tuning parameter (t), which balances the amounts of bias and variance on one side and does not exceed the acceptable percentage of explained variance of these components. This was shown by the MSE criterion in the regression case and the percentage of explained variance…

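As a rough analogue of the idea, the sketch below regresses on principal-component scores with an L1 (LASSO-type) penalty so that some component coefficients shrink exactly to zero; the paper's modified LASS penalty and its tuning rule for (t) are not reproduced, and alpha stands in for the tuning parameter:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Lasso

# L1-penalized regression on principal-component scores: the PCA step
# yields uncorrelated combinations; the Lasso penalty zeroes out some
# component coefficients. Data are synthetic, with induced collinearity.

rng = np.random.default_rng(3)
X = rng.standard_normal((80, 6))
X[:, 3] = X[:, 0] + 0.01 * rng.standard_normal(80)   # multicollinearity
y = X @ np.array([1.0, 0.5, 0.0, 1.0, 0.0, 0.0]) + rng.standard_normal(80)

scores = PCA(n_components=6).fit_transform(X)        # uncorrelated scores
fit = Lasso(alpha=0.1).fit(scores, y)
print(fit.coef_)    # several coefficients forced exactly to zero
```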
Publication Date
Sat Jan 01 2022
Journal Name
1st Samarra International Conference For Pure And Applied Sciences (sicps2021): Sicps2021
Analysis of the average lattice strain in the crystal direction (hkl) in MgO nanoparticles by using the modified Williamson-Hall method

In this work, the modified Williamson-Hall method was used to analyze the X-ray diffraction lines of magnesium oxide (MgO) nanoparticle powder for the diffraction lines (111), (200), (220), (311), and (222). Special programs such as Origin Pro Lab and Get Data Graph were used to calculate the full width at half maximum (FWHM) and the integral breadth (B), in order to obtain the area under the curve for each diffraction line. The modified Williamson-Hall equations were then used to determine the values of crystallite size (D), lattice strain (ε), stress (σ), and energy (U); the results were D = 17.639 nm, ε = 0.002205, σ = 0.517, and U = 0.000678, respectively. Then, using the Scherrer method, the crystal…

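For context, a classical (unmodified) Williamson-Hall analysis fits β·cosθ = Kλ/D + 4ε·sinθ, taking the crystallite size from the intercept and the strain from the slope; the sketch below runs this baseline fit on placeholder peak data (the paper's modified equations, which also yield stress and energy, are not reproduced):

```python
import numpy as np

# Classical Williamson-Hall fit: beta*cos(theta) = K*lambda/D + 4*eps*sin(theta).
# Peak positions and widths below are placeholders, not the paper's data.

K, lam = 0.9, 0.15406                                   # Cu K-alpha (nm)
two_theta = np.radians([36.9, 42.9, 62.3, 74.7, 78.6])  # MgO peak angles
beta = np.radians([0.50, 0.52, 0.60, 0.66, 0.70])       # FWHM in radians

theta = two_theta / 2
x = 4 * np.sin(theta)
y = beta * np.cos(theta)
slope, intercept = np.polyfit(x, y, 1)   # linear W-H plot

eps = slope                 # lattice strain from the slope
D = K * lam / intercept     # crystallite size (nm) from the intercept
print(D, eps)
```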
Publication Date
Mon Aug 01 2016
Journal Name
Journal Of Economics And Administrative Sciences
The effects of human error on banking risks - an empirical study in a number of Iraqi private banks

This research aims to study the effects of human error on banking risks in private banks by measuring and testing the effect of human error on each type of banking risk and identifying the risks most closely associated with it, in order to focus on those risks, apply appropriate treatments, and increase the availability of the skills and expertise required to carry out banking operations in an error-free manner.

The research dealt with human error in terms of its meaning and concept, classifications and types, causes and consequences, and its approaches and theories. It also addressed banking risks in terms of meaning and concept, types and…

Publication Date
Sun Mar 01 2020
Journal Name
Computer Networks
An improved multi-objective evolutionary algorithm for detecting communities in complex networks with graphlet measure
