Ensuring reliable data transmission in a Network on Chip (NoC) is one of the most challenging tasks, especially in noisy environments: crosstalk, interference, and radiation effects have all grown as manufacturers shrink areas, raise frequencies, and lower voltages. Many Error Control Codes (ECC) have therefore been proposed, with different error detection and correction capacities and various degrees of complexity. The Code with Crosstalk Avoidance and Error Correction (CCAEC) for network-on-chip interconnects uses simple parity check bits as its main technique to achieve high error correction capacity; as reported in this work, it corrects up to 12 random errors, a high correction capacity compared with many other coding schemes, though at the cost of a large codeword. Here, the CCAEC code is compared with another well-known scheme, the Horizontal-Vertical-Diagonal (HVD) error detecting and correcting code, through a reliability analysis: a new, accurate mathematical model of the probability of residual error, P_res, is derived for both schemes and confirmed by simulation results. The results show that the HVD code corrects all single, double, and triple errors and fails in only 3.3% of quadruple-error cases, whereas the CCAEC code corrects any single error but fails in 1.5%, 7.2%, and 16.4% of double-, triple-, and quadruple-error cases, respectively. As a result, HVD offers better reliability than CCAEC along with lower overhead, making it a promising coding scheme for handling the reliability issues of NoC.
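To make the HVD idea concrete, the sketch below computes row (horizontal), column (vertical), and anti-diagonal parity bits over a small data block and corrects a single flipped bit by intersecting the failing row and column parities. It is a minimal illustration under assumed simplifications, not the exact construction evaluated in the paper: the 4x4 block size and the use of a single diagonal set are illustrative choices.

```python
import numpy as np

def hvd_encode(data):
    """Row, column, and anti-diagonal parity bits for a 2-D bit block
    -- a simplified HVD-style encoder (illustrative block layout)."""
    rows, cols = data.shape
    h = data.sum(axis=1) % 2                    # one parity bit per row
    v = data.sum(axis=0) % 2                    # one parity bit per column
    d = np.zeros(rows + cols - 1, dtype=int)    # one per anti-diagonal
    for i in range(rows):
        for j in range(cols):
            d[i + j] ^= data[i, j]
    return h, v, d

def hvd_correct_single(received, h, v, d):
    """Locate a single-bit error from the row/column parity syndromes
    and flip it back; the diagonal parity acts as a consistency check."""
    bad_rows = np.flatnonzero(received.sum(axis=1) % 2 != h)
    bad_cols = np.flatnonzero(received.sum(axis=0) % 2 != v)
    if len(bad_rows) == 1 and len(bad_cols) == 1:
        fixed = received.copy()
        fixed[bad_rows[0], bad_cols[0]] ^= 1
        assert np.array_equal(hvd_encode(fixed)[2], d), "not a single error"
        return fixed
    return received

# usage: flip one bit of a random 4x4 block and recover it
rng = np.random.default_rng(0)
block = rng.integers(0, 2, size=(4, 4))
h, v, d = hvd_encode(block)
corrupted = block.copy()
corrupted[2, 1] ^= 1
assert np.array_equal(hvd_correct_single(corrupted, h, v, d), block)
```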
In information security, fingerprint verification is one of the most common recent approaches for verifying human identity through a distinctive pattern. The verification process works by comparing a pair of fingerprint templates and measuring the similarity between them. Several research studies have applied different techniques to the matching process, such as fuzzy vault and image filtering approaches, yet these approaches still suffer from imprecise articulation of the biometrics' interesting patterns. Deep learning architectures such as the Convolutional Neural Network (CNN) have been used extensively for image processing and object detection tasks and have shown outstanding performance compared …
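The abstract is cut off before naming a specific network, so the sketch below only illustrates the general idea: a small Siamese-style CNN that embeds two fingerprint images and scores their similarity. The layer sizes, input resolution, and use of cosine similarity are illustrative assumptions, not the architecture from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FingerprintEmbedder(nn.Module):
    """Tiny CNN mapping a grayscale fingerprint image to a unit-length
    embedding vector; layer sizes are illustrative stand-ins."""
    def __init__(self, embed_dim: int = 64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(4),
        )
        self.fc = nn.Linear(32 * 4 * 4, embed_dim)

    def forward(self, x):
        z = self.features(x).flatten(1)
        return F.normalize(self.fc(z), dim=1)

def match_score(model, img_a, img_b):
    """Cosine similarity between two template embeddings; thresholding
    this score gives the verification accept/reject decision."""
    with torch.no_grad():
        za, zb = model(img_a), model(img_b)
    return (za * zb).sum(dim=1)          # value in [-1, 1]

# usage with random stand-in images (batch x channel x height x width)
model = FingerprintEmbedder()
a, b = torch.randn(1, 1, 96, 96), torch.randn(1, 1, 96, 96)
print(match_score(model, a, b).item())
```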
The research objective focuses on spraying plant leaves from the top and bottom using a spraying machine designed and built from aluminum, with a movable arm fitted with a nozzle holder divided into three parts, each carrying a Flat Fan 120-C3 nozzle. The machine was tested in a greenhouse, studying the effect of changing the positions of the upper and lower sections of the nozzle-carrying tube over four levels (A1, A2, A3, A4) and of changing the pressure over two levels (2 and 4 bar) on several characteristics of the study, including spray quality on the adaxial surface of the leaf and spray quality on the abaxial surface …
To select the better tracker for the fast time variation of a multipath Rayleigh fading channel, this paper focuses on the recursive least-squares (RLS) and extended recursive least-squares (E-RLS) algorithms. Based on simulated comparisons of tracking performance and mean square error over five fast time-varying Rayleigh fading channels, with up to 100 send/receive repetitions to confirm the efficiency of the algorithms, it concludes that E-RLS is the more feasible of the two.
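As a reference point for the comparison, a minimal standard RLS tracker is sketched below on a static stand-in channel (the paper's channels are fast time-varying Rayleigh). The tap count, forgetting factor, and noise level are illustrative assumptions, and the E-RLS variant is not reproduced here.

```python
import numpy as np

def rls_track(x, d, taps=4, lam=0.98, delta=100.0):
    """Exponentially weighted recursive least-squares channel tracker.
    x: transmitted symbols, d: received samples, lam: forgetting factor."""
    w = np.zeros(taps)                  # current channel estimate
    P = np.eye(taps) * delta            # inverse input-correlation matrix
    err = np.zeros(len(d))
    for n in range(taps - 1, len(d)):
        u = x[n - taps + 1:n + 1][::-1]         # newest sample first
        k = P @ u / (lam + u @ P @ u)           # gain vector
        err[n] = d[n] - w @ u                   # a-priori error
        w = w + k * err[n]
        P = (P - np.outer(k, u @ P)) / lam
    return w, err

# usage: BPSK symbols through a 4-tap stand-in channel plus noise
rng = np.random.default_rng(1)
N = 2000
x = rng.choice([-1.0, 1.0], size=N)
h = np.array([0.9, -0.4, 0.2, 0.1])
d = np.convolve(x, h)[:N] + 0.05 * rng.standard_normal(N)
w, _ = rls_track(x, d)
print("estimated taps:", np.round(w, 2))        # close to h
```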
This study concerns the estimation of a simultaneous equations system for the Tobit model, in which the dependent variables are limited (censored), a feature that affects the choice of a good estimator. Classical methods used in such a case produce biased and inconsistent estimators, so we instead use newer estimation methods, the Nelson-Olson method and the two-stage limited dependent variables (2SLDV) method, to obtain estimators that hold the properties of a good estimator. That is, the parameters will be estimated …
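Both the Nelson-Olson and 2SLDV procedures build on the censored (Tobit) likelihood for a single equation. A minimal sketch of that likelihood, maximized numerically, is shown below; the simultaneous-equations stages are omitted and the simulated data are purely illustrative.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def tobit_negloglik(params, X, y):
    """Negative log-likelihood of a Tobit model censored at zero:
    observations with y > 0 contribute a density term, y == 0 a
    probability mass term P(y* <= 0)."""
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)               # keep sigma positive
    xb = X @ beta
    obs, cen = y > 0, y == 0
    ll = norm.logpdf(y[obs], loc=xb[obs], scale=sigma).sum()
    ll += norm.logcdf(-xb[cen] / sigma).sum()
    return -ll

# usage: simulate a censored outcome and recover the coefficients
rng = np.random.default_rng(2)
n = 2000
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
y_star = X @ np.array([0.5, 1.0]) + rng.standard_normal(n)
y = np.maximum(y_star, 0.0)                 # censoring at zero
res = minimize(tobit_negloglik, x0=np.zeros(3), args=(X, y))
print("beta_hat:", np.round(res.x[:2], 2),
      "sigma_hat:", round(float(np.exp(res.x[-1])), 2))
```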
Time series models often suffer from outliers that accompany the data-collection process for many reasons, and their presence may significantly affect the estimation of the parameters of the studied model. Obtaining highly efficient estimators is one of the most important stages of statistical analysis, so it is important to choose appropriate estimation methods. The aim of this research is to compare ordinary estimators with robust estimators for estimating the parameters of …
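The abstract is cut off before naming the model, so the sketch below only illustrates the general point on a simple AR(1) series: a few additive outliers pull the ordinary least-squares estimate of the autoregressive parameter away from the truth, while a Huber-type robust fit resists them. The model, outlier pattern, and choice of HuberRegressor are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, HuberRegressor

# simulate an AR(1) series y_t = 0.7 * y_{t-1} + e_t with outliers
rng = np.random.default_rng(3)
n, phi = 500, 0.7
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.standard_normal()
y[rng.choice(n, size=10, replace=False)] += 15.0   # contaminate 2%

X, target = y[:-1].reshape(-1, 1), y[1:]           # lagged regression form
ols = LinearRegression().fit(X, target)
rob = HuberRegressor().fit(X, target)
print(f"true phi = {phi}, OLS = {ols.coef_[0]:.2f}, "
      f"Huber = {rob.coef_[0]:.2f}")
```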
Compressional-wave (Vp) data are useful for reservoir exploration, drilling operations, stimulation, hydraulic fracturing, and development plans for a specific reservoir. Because of the different nature and behavior of the influencing parameters, considerable nonlinearity is involved in Vp modeling. In this study, a statistical relationship between compressional wave velocity and petrophysical parameters was developed from wireline log data for the Jeribe formation in the Fauqi oil field, southeast Iraq, using single and multiple linear regressions. The model concentrates on predicting compressional wave velocity from petrophysical parameters: shear wave velocity, porosity, density, and …
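A minimal sketch of the multiple-linear-regression step is shown below. The synthetic well-log values and the generating relation are stand-ins, not the Jeribe-formation data or the fitted model from the paper.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# synthetic stand-ins for wireline-log predictors of Vp (km/s)
rng = np.random.default_rng(4)
n = 300
vs = rng.uniform(1.5, 3.0, n)        # shear wave velocity, km/s
phi = rng.uniform(0.05, 0.30, n)     # porosity, fraction
rho = rng.uniform(2.2, 2.7, n)       # bulk density, g/cc
# illustrative generating relation plus noise (not the paper's model)
vp = 1.16 * vs + 1.36 - 2.0 * phi + 0.5 * (rho - 2.4) \
     + 0.05 * rng.standard_normal(n)

X = np.column_stack([vs, phi, rho])
model = LinearRegression().fit(X, vp)
print("coefficients:", np.round(model.coef_, 2))
print("R^2:", round(r2_score(vp, model.predict(X)), 3))
```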
The paper presents a highly accurate power flow solution, with reduced risk of ending at a local minimum, using a Real-Coded Genetic Algorithm (RCGA) combined with system reduction and restoration. The proposed method cuts the total computing time by reducing the system to the generator buses, which in any realistic system are comparatively few, eliminating the load buses. The power flow problem is then solved for the generator buses only: the real-coded GA calculates the voltage phase angles, while the voltage magnitudes are already specified, further shortening the solution time. Finally, the system is restored by calculating the voltages of the load buses in terms of …
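A minimal real-coded GA skeleton is sketched below on a stand-in objective (the sum of squared power mismatches over the phase angles would take its place). Population size, blend crossover, and Gaussian mutation are illustrative choices, not the exact operators used in the paper.

```python
import numpy as np

def rcga_minimize(f, bounds, pop=40, gens=200, seed=0):
    """Minimal real-coded GA: tournament selection, blend crossover,
    Gaussian mutation, and one elite survivor per generation."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    P = rng.uniform(lo, hi, size=(pop, len(lo)))
    for _ in range(gens):
        fit = np.array([f(x) for x in P])
        children = [P[np.argmin(fit)].copy()]          # keep the elite
        while len(children) < pop:
            i, j = rng.integers(pop, size=2)           # tournament of two
            p1 = P[i] if fit[i] < fit[j] else P[j]
            i, j = rng.integers(pop, size=2)
            p2 = P[i] if fit[i] < fit[j] else P[j]
            w = rng.uniform(-0.5, 1.5, size=len(lo))   # blend crossover
            child = w * p1 + (1 - w) * p2
            child += rng.normal(0, 0.01, size=len(lo)) # small mutation
            children.append(np.clip(child, lo, hi))
        P = np.array(children)
    return P[np.argmin([f(x) for x in P])]

# usage: stand-in for minimizing squared mismatch over 3 phase angles
mismatch = lambda theta: np.sum((np.sin(theta) - 0.3) ** 2)
best = rcga_minimize(mismatch, np.array([[-np.pi, np.pi]] * 3))
print(np.round(best, 3))   # each near arcsin(0.3) ~ 0.305 or pi - 0.305
```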
The present study deals with the optimum design of self-supporting steel communication towers. A special technique represents the tower as an equivalent hollow tapered beam with a variable cross section; this method is then employed to find the best layout of the tower among prespecified configurations. The formulation is applied to four tower layouts: K and X bracing, each with equal and unequal panels. The objective function is the total weight of the tower, and the variables are the base and top dimensions, the number of panels, and the members' cross-sectional areas. The design constraints are based on the requirements of the EIA and ANSI codes for allowable stresses in the members.
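A minimal sketch of the weight-minimization setup, using a gradient-based solver with a stress-style inequality constraint, is shown below. The two-variable toy tower, loads, and allowable stress are illustrative stand-ins for the EIA/ANSI-constrained formulation in the paper.

```python
import numpy as np
from scipy.optimize import minimize

# toy stand-in: choose base width b (m) and member area A (cm^2) to
# minimize tower weight subject to a member-stress limit
H, load, sigma_allow = 30.0, 50.0, 14.0   # height (m), kN, kN/cm^2
unit_weight = 0.785                        # weight per area*length, illustrative

def weight(x):
    b, A = x
    leg_len = np.hypot(H, b)               # schematic leg length
    return 4 * unit_weight * A * leg_len   # four legs

def stress_margin(x):                      # must stay >= 0
    b, A = x
    axial = load * np.hypot(H, b) / max(b, 1e-6)   # schematic leg force
    return sigma_allow - axial / A

res = minimize(weight, x0=[3.0, 20.0],
               bounds=[(1.0, 6.0), (5.0, 100.0)],
               constraints=[{"type": "ineq", "fun": stress_margin}],
               method="SLSQP")
print("b, A =", np.round(res.x, 2), "weight =", round(res.fun, 1))
```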
Binary logistic regression and the linear discriminant function are among the most important statistical methods for classification and prediction when the data are binary (0,1), where ordinary regression cannot be used; with two groups, one resorts to binary logistic regression or the linear discriminant function. When a multicollinearity problem exists among the data (predictors with high correlation), however, neither binary logistic regression nor the linear discriminant function can be used, and to solve this problem we resort to partial least squares (PLS) regression. In this search, the …
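A minimal sketch of the PLS step on collinear predictors is shown below. The simulated data, the two-component choice, and the use of scikit-learn's PLSRegression for a 0/1 response are illustrative assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# simulate predictors that are highly correlated in two blocks
rng = np.random.default_rng(5)
n = 400
z1, z2 = rng.standard_normal((2, n))
X = np.column_stack([z1, z1, z2, z2, z1 + z2]) \
    + 0.05 * rng.standard_normal((n, 5))
y = (z1 + 0.5 * rng.standard_normal(n) > 0).astype(float)

# PLS compresses the collinear columns into a few latent components,
# avoiding the unstable matrix inverse that breaks logistic and
# discriminant fits under multicollinearity
pls = PLSRegression(n_components=2).fit(X, y)
pred = (pls.predict(X).ravel() > 0.5).astype(float)
print("training accuracy:", round(float((pred == y).mean()), 3))
```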