Ensuring reliable data transmission in a Network on Chip (NoC) is one of the most challenging tasks, especially in noisy environments: crosstalk, interference, and radiation effects have grown as manufacturers shrink feature sizes, raise operating frequencies, and lower supply voltages. Consequently, many Error Control Codes (ECCs) have been proposed, with different error detection and correction capacities and various degrees of complexity. The Code with Crosstalk Avoidance and Error Correction (CCAEC) for network-on-chip interconnects uses simple parity check bits as its main technique to achieve high error correction capacity. As reported in this work, this coding scheme corrects up to 12 random errors, a high correction capacity compared with many other coding schemes, but at the cost of a large codeword size. In this work, the CCAEC code is compared with another well-known scheme, the Horizontal-Vertical-Diagonal (HVD) error detecting and correcting code, through a reliability analysis: a new, accurate mathematical model for the probability of residual error P_res is derived for both code schemes and confirmed by simulation results. The results show that the HVD code corrects all single, double, and triple errors and fails to correct only 3.3% of quadruple-error patterns. In comparison, the CCAEC code corrects any single error but fails in 1.5%, 7.2%, and 16.4% of double-, triple-, and quadruple-error cases, respectively. As a result, HVD offers better reliability than CCAEC along with lower overhead, making it a promising coding scheme for handling reliability issues in NoCs.
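As a hedged sketch of how such a residual-error model is typically assembled (the paper's exact derivation may differ): with codeword length \(n\), independent bit-error probability \(p\), and \(f_k\) the fraction of \(k\)-error patterns the decoder fails to correct,

\[
P_{\mathrm{res}} \;=\; \sum_{k=1}^{n} f_k \binom{n}{k} p^{k} (1-p)^{\,n-k},
\]

where, for the figures reported above, \(f_1=f_2=f_3=0\) and \(f_4\approx 0.033\) for HVD, while \(f_1=0\), \(f_2\approx 0.015\), \(f_3\approx 0.072\), and \(f_4\approx 0.164\) for CCAEC.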
Bayesian estimation of reliability in the stress (Y) - strength (X) model, which describes the life of a component with strength X subjected to stress Y (the component fails if and only if the applied stress exceeds its strength at any time), has been studied; the reliability R = P(Y < X) can then be considered a measure of the component's performance. In this paper, a Bayesian analysis of R is carried out when the two variables X and Y are independent Weibull random variables with common shape parameter α, in order to study the effect of each of the two different scale parameters β and λ, respectively, using three different loss functions [weighted, quadratic, and entropy] under two different prior functions [Gamma and an extension of Jeffreys' prior].
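For orientation, under a common Weibull parameterization with density \(f(x)=\alpha\beta x^{\alpha-1}e^{-\beta x^{\alpha}}\) for the strength \(X\), and \(\lambda\) in place of \(\beta\) for the stress \(Y\) (a notational assumption; the paper's convention may differ), independence gives a closed form for the reliability:

\[
R \;=\; P(Y<X) \;=\; \int_{0}^{\infty}\bigl(1-e^{-\lambda x^{\alpha}}\bigr)\,\alpha\beta x^{\alpha-1}e^{-\beta x^{\alpha}}\,dx \;=\; \frac{\lambda}{\beta+\lambda},
\]

so R depends on the scale parameters only through their ratio, which is what makes comparing estimators of β and λ under different loss and prior choices natural.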
A study of the characteristics and physical properties of lubricant oils is essential to assess their quality. The parameters used to classify oils have been studied in this research. Three types of multi-grade lubricant oils were tested at temperatures from 25 °C to 78 °C to estimate their physical properties and mixture compositions. Kinematic viscosity, the viscosity gravity constant, and paraffin (P), naphthene (N), and aromatic (A) (PNA) analysis are used to predict the composition of the lubricant oils. Kinematic viscosity characterizes the flow behavior and oxidation stability of each lubricant oil. PNA analysis predicted the fractions of paraffins (XP), naphthenes (XN), and aromatics (XA).
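As an illustration of how viscosity-temperature behavior over such a range can be characterized, here is a minimal Python sketch using the Walther (ASTM D341-style) relation; the relation itself is standard, but the sample viscosity points below are hypothetical and do not come from the paper.

```python
import numpy as np

def walther_fit(T_C, nu_cSt):
    """Fit the Walther (ASTM D341-style) relation
    log10(log10(nu + 0.7)) = A + B*log10(T_K) to measured points."""
    T_K = np.asarray(T_C) + 273.15
    y = np.log10(np.log10(np.asarray(nu_cSt) + 0.7))
    B, A = np.polyfit(np.log10(T_K), y, 1)  # slope B (negative), intercept A
    return A, B

def walther_predict(T_C, A, B):
    T_K = np.asarray(T_C) + 273.15
    return 10 ** (10 ** (A + B * np.log10(T_K))) - 0.7

# Hypothetical viscosities (cSt) for one oil at the study's temperature limits.
A, B = walther_fit([25.0, 78.0], [140.0, 20.0])
print(walther_predict(40.0, A, B))  # interpolated kinematic viscosity at 40 C
```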
In this paper, an estimator of the reliability function for the Pareto distribution of the first kind has been derived, and a simulation study using the Monte Carlo method was carried out to compare the Bayes estimator of the reliability function with the maximum likelihood estimator. It was found that the Bayes estimator was better than the maximum likelihood estimator for all sample sizes, using the integral mean square error (IMSE).
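A minimal sketch of such a Monte Carlo comparison, assuming a known scale k and a conjugate Gamma prior on the shape parameter (the paper's actual prior, loss function, and settings are not given here, so all constants below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
k, theta = 1.0, 2.0            # Pareto(I) scale and shape (hypothetical values)
a0, b0 = 1.0, 1.0              # Gamma(shape, rate) prior on theta: an assumption
n, reps = 30, 2000

t_grid = np.linspace(k, 5.0, 200)      # grid over which the IMSE is integrated
dt = t_grid[1] - t_grid[0]
R_true = (k / t_grid) ** theta         # true reliability R(t) = (k/t)^theta

ise_mle = ise_bayes = 0.0
for _ in range(reps):
    x = k * (1 - rng.random(n)) ** (-1 / theta)   # inverse-CDF Pareto sampling
    s = np.log(x / k).sum()
    R_mle = (k / t_grid) ** (n / s)               # MLE plug-in: theta_hat = n/s
    # Posterior theta | x ~ Gamma(a0 + n, b0 + s); under squared-error loss the
    # Bayes estimate of R(t) is the posterior mean of (k/t)^theta:
    R_bayes = (1 + np.log(t_grid / k) / (b0 + s)) ** (-(a0 + n))
    ise_mle += np.sum((R_mle - R_true) ** 2) * dt
    ise_bayes += np.sum((R_bayes - R_true) ** 2) * dt

print("IMSE (MLE):  ", ise_mle / reps)
print("IMSE (Bayes):", ise_bayes / reps)
```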
In this research, the Iraqi flagpole at Baghdad University, the tallest in Baghdad at a height of 75 m, was monitored. Given the importance of this structure, its displacement (vertical deviation) was monitored using a Total Station device; several observations were taken at different times over two years, with the monitoring running from November 2016 until May 2017 at a rate of four observations per year. The observations were processed using the least squares method with circle fitting, and the deviation was then calculated using a Matlab program to compute the correction values.
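The circle-fitting step lends itself to a short illustration. Below is a generic least-squares (Kasa) circle fit in Python, a sketch rather than the authors' Matlab routine; the observation coordinates are hypothetical.

```python
import numpy as np

def fit_circle(x, y):
    """Algebraic (Kasa) least-squares circle fit.

    Solves x^2 + y^2 = 2*a*x + 2*b*y + c for (a, b, c); the center is
    (a, b) and the radius is sqrt(c + a^2 + b^2).
    """
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x**2 + y**2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return (a, b), np.sqrt(c + a**2 + b**2)

# Hypothetical points observed around the pole's circular cross-section,
# with a little measurement noise.
rng = np.random.default_rng(1)
t = np.linspace(0, 2 * np.pi, 12, endpoint=False)
x = 10.0 + 0.25 * np.cos(t) + rng.normal(0, 0.002, t.size)
y = -4.0 + 0.25 * np.sin(t) + rng.normal(0, 0.002, t.size)

center, radius = fit_circle(x, y)
print(center, radius)  # tracking fitted centers over time reveals the deviation
```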
Many approaches of varying complexity already exist for edge detection in color images. Nevertheless, the question remains how different the results are when computationally costly techniques are employed instead of simple ones. This paper presents a comparative study of two approaches to color edge detection and their behavior under image noise. The approaches are based on the Sobel operator and the Laplace operator. Furthermore, an efficient algorithm for implementing the two operators is presented. The operators have been applied to real images, and the results are presented in this paper. It is shown that the quality of the results improves when the second-derivative (Laplace) operator is used, and noise is reduced in a good manner.
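A minimal sketch of the two operators applied per channel in Python (combining channels by taking the per-channel maximum is an assumption; the paper's combination rule is not specified here):

```python
import numpy as np
from scipy import ndimage

# 3x3 first-derivative (Sobel) and second-derivative (Laplace) kernels.
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T
LAPLACE = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]], dtype=float)

def sobel_edges(channel):
    gx = ndimage.convolve(channel, SOBEL_X)
    gy = ndimage.convolve(channel, SOBEL_Y)
    return np.hypot(gx, gy)              # gradient magnitude

def laplace_edges(channel):
    return np.abs(ndimage.convolve(channel, LAPLACE))

def color_edges(img, op):
    """Apply an operator per RGB channel and combine by taking the maximum."""
    return np.max([op(img[..., c].astype(float)) for c in range(3)], axis=0)

# Usage with any HxWx3 array (a random image stands in for a real one here).
img = np.random.rand(64, 64, 3)
edges_sobel = color_edges(img, sobel_edges)
edges_laplace = color_edges(img, laplace_edges)
```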
Binary relations, or interactions among bio-entities such as proteins, make up an essential part of any living biological system. Protein-protein interactions are usually structured in a graph data structure called a "protein-protein interaction network" (PPIN). Analyzing PPINs to detect complexes attempts to lay out the significant knowledge needed to answer many unresolved questions, including how cells are organized and how proteins work. However, complex detection problems fall into the category of non-deterministic polynomial-time hard (NP-hard) problems due to their computational complexity. To cope with such combinatorial explosions, evolutionary algorithms (EAs) have proven to be effective alternatives to heuristics in solving such problems.
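As a toy illustration of the evolutionary approach (not the paper's algorithm), the following Python sketch evolves a node-to-cluster assignment on a small hypothetical PPI graph, scoring candidate complexes by intra-cluster edge density:

```python
import itertools
import random

random.seed(0)

# Toy PPI network as an undirected edge set (hypothetical proteins).
EDGES = {("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"),
         ("D", "E"), ("D", "F"), ("E", "F"), ("E", "G"), ("F", "G")}
NODES = sorted({n for e in EDGES for n in e})

def connected(u, v):
    return (u, v) in EDGES or (v, u) in EDGES

def fitness(assign):
    """Sum over clusters of intra-cluster edge density (denser = better)."""
    score = 0.0
    for c in set(assign.values()):
        members = [n for n in NODES if assign[n] == c]
        pairs = list(itertools.combinations(members, 2))
        if pairs:
            score += sum(connected(u, v) for u, v in pairs) / len(pairs)
    return score

def mutate(assign, k):
    child = dict(assign)
    child[random.choice(NODES)] = random.randrange(k)  # reassign one protein
    return child

def evolve(k=2, pop_size=30, generations=200):
    pop = [{n: random.randrange(k) for n in NODES} for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]                   # truncation selection
        pop = elite + [mutate(random.choice(elite), k)
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```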
A roundabout is a highway engineering solution meant to calm traffic, increase safety, reduce stop-and-go travel, reduce accidents and congestion, and decrease traffic delays. It is circular and facilitates one-way traffic flow around a central point. The first part of this study reviewed the principles and methods used to estimate roundabout capacity under different traffic conditions and geometric configurations; these methods include gap-acceptance, empirical, and simulation-software methods. The previous studies cited in this research used these methods as well as new models developed by several researchers. This paper's main aim, however, is to compare different roundabout capacity models for acceptable...
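For reference, one classical gap-acceptance form for entry capacity (a generic formula; the specific models compared in the paper may differ) is

\[
C \;=\; \frac{q_c\, e^{-q_c t_c}}{1 - e^{-q_c t_f}},
\]

where \(q_c\) is the circulating flow (veh/s), \(t_c\) the critical gap (s), and \(t_f\) the follow-up time (s). Empirical methods instead regress observed entry capacity on geometric parameters, and simulation-software methods estimate capacity from microscopic traffic models.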
This paper aims to identify the approaches used by Iraqi banks to assess credit applications, as well as which approach is most used. It also attempts to link these approaches with the reduction of credit default and with banks' efficiency, particularly for the Gulf Commercial Bank. The paper found that the Gulf Bank relies widely on the judgment approach for assessing credit applications in order to select the best among them with a low risk of default. In addition, the paper found that the judgment approach was very important for the Gulf Bank and that it contributed to reducing the ratio of credit default as a percentage of total credit. However, it is important to note the adoption of statistical approaches for...
This paper presents a comparison between denoising techniques based on a statistical approach, principal component analysis with local pixel grouping (PCA-LPG), in which the procedure is iterated a second time to further improve the denoising performance, and other enhancement filters. These include an adaptive Wiener low-pass filter applied to a grayscale image degraded by constant-power additive noise, based on statistics estimated from a local neighborhood of each pixel; a median filter of the noisy input image, where each output pixel contains the median value of the M-by-N neighborhood around the corresponding input pixel; a Gaussian low-pass filter; and an order-statistic filter. Experimental results show that LPG-...
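The classical filters in the comparison are readily available in SciPy; below is a minimal, hypothetical sketch (LPG-PCA itself is a more involved procedure and is omitted):

```python
import numpy as np
from scipy import ndimage, signal

rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0.0, 1.0, 128), (128, 1))  # hypothetical test image
noisy = clean + rng.normal(0.0, 0.05, clean.shape)     # additive Gaussian noise

wiener_out = signal.wiener(noisy, mysize=5)        # adaptive Wiener low-pass filter
median_out = ndimage.median_filter(noisy, size=3)  # median of each 3x3 neighborhood
gauss_out = ndimage.gaussian_filter(noisy, sigma=1.0)  # Gaussian low-pass filter
# Order-statistic filter: a rank filter taking the 2nd-smallest value in each 3x3
rank_out = ndimage.rank_filter(noisy, rank=1, size=3)

def psnr(ref, est):
    mse = np.mean((ref - est) ** 2)
    return 10 * np.log10(1.0 / mse)  # peak signal value assumed to be 1.0

for name, out in [("wiener", wiener_out), ("median", median_out),
                  ("gauss", gauss_out), ("rank", rank_out)]:
    print(name, round(psnr(clean, out), 2))
```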