Ensuring reliable data transmission in a Network on Chip (NoC) is one of the most challenging tasks, especially in noisy environments: crosstalk, interference, and radiation effects have grown with manufacturers' increasing tendency to reduce area, raise frequencies, and lower voltages. Consequently, many Error Control Codes (ECC) have been proposed, with different error detection and correction capacities and various degrees of complexity. The Code with Crosstalk Avoidance and Error Correction (CCAEC) for network-on-chip interconnects uses simple parity check bits as its main technique to achieve high error correction capacity; according to its authors, this coding scheme corrects up to 12 random errors, a high correction capacity compared with many other coding schemes, but at the cost of a large codeword size. In this work, the CCAEC code is compared with another well-known scheme, the Horizontal-Vertical-Diagonal (HVD) error detecting and correcting code, through a reliability analysis: a new, accurate mathematical model for the probability of residual error Pres is derived for both code schemes and confirmed by simulation results for both. The results show that the HVD code corrects all single, double, and triple errors and fails in only 3.3% of quadruple-error cases, whereas the CCAEC code corrects any single error but fails in 1.5%, 7.2%, and 16.4% of double-, triple-, and quadruple-error cases, respectively. As a result, the HVD code offers better reliability than CCAEC along with lower overhead, making it a promising coding scheme for handling the reliability issues of NoC.
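As an illustration of the HVD idea only (not the exact published bit layout), the sketch below arranges data bits in a matrix, takes parity along rows, columns, and one diagonal direction, and locates a single error at the intersection of the failing row and column parities. The matrix size, diagonal grouping, and function names are assumptions for this sketch.

```python
import numpy as np

def hvd_encode(data):
    """Compute horizontal, vertical, and diagonal parity bits for a bit matrix.
    Illustrative sketch of the HVD principle; the published code also uses the
    opposite diagonal direction and a specific bit ordering."""
    m = np.asarray(data) % 2
    rows, cols = m.shape
    h = m.sum(axis=1) % 2                 # horizontal parity, one bit per row
    v = m.sum(axis=0) % 2                 # vertical parity, one bit per column
    d = np.zeros(rows + cols - 1, dtype=int)
    for i in range(rows):
        for j in range(cols):
            d[i + j] ^= int(m[i, j])      # backslash-diagonal parity groups
    return h, v, d

def correct_single_error(received, h, v):
    """Locate and flip a single-bit error using row/column parity mismatches."""
    m = np.asarray(received) % 2
    bad_rows = np.flatnonzero(m.sum(axis=1) % 2 != h)
    bad_cols = np.flatnonzero(m.sum(axis=0) % 2 != v)
    if len(bad_rows) == 1 and len(bad_cols) == 1:
        m[bad_rows[0], bad_cols[0]] ^= 1  # flip the bit at the intersection
    return m
```

Multi-bit patterns are where the schemes differ: with several failing rows and columns the intersection is ambiguous, which is why the residual-error analysis above counts the fraction of uncorrectable patterns per error multiplicity.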
In this work, a joint quadrature for the numerical solution of double integrals is presented. The method is based on combining two rules of the same precision level to form a rule of higher precision. Numerical results of the present method, starting from a lower level of precision, are presented and compared with those obtained by the existing high-precision Gauss-Legendre five-point rule in two variables, which requires the same number of function evaluations. The efficiency of the proposed method is justified with numerical examples. From an application point of view, the determination of the center of gravity receives special consideration under the present scheme. Convergence analysis is demonstrated to validate the current method.
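For reference, the comparison baseline named above, the Gauss-Legendre five-point rule in two variables, is the tensor-product rule sketched below; the paper's joint quadrature itself is not reproduced here, and the function name and interface are assumptions.

```python
import numpy as np
from numpy.polynomial.legendre import leggauss

def gauss_legendre_2d(f, a, b, c, d, n=5):
    """Approximate the double integral of f over [a,b] x [c,d] with an
    n-point Gauss-Legendre product rule (n=5 gives the five-point rule
    in two variables used as the comparison baseline)."""
    x, w = leggauss(n)                    # nodes and weights on [-1, 1]
    xm, xr = 0.5 * (a + b), 0.5 * (b - a) # affine map [-1,1] -> [a,b]
    ym, yr = 0.5 * (c + d), 0.5 * (d - c) # affine map [-1,1] -> [c,d]
    total = 0.0
    for wi, xi in zip(w, xm + xr * x):
        for wj, yj in zip(w, ym + yr * x):
            total += wi * wj * f(xi, yj)
    return xr * yr * total
```

An n-point Gauss-Legendre rule is exact for polynomials up to degree 2n-1 in each variable, which is the precision level any competing rule with the same 25 function evaluations must be measured against.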
Comparative Analysis of Economic Policy Stability between Monarchical and Republican Systems: A Fundamental Theoretical Study
In this paper, the error distribution function of the single index model is estimated by both the empirical distribution function and the kernel distribution function. The refined minimum average variance estimation (RMAVE) method is used to estimate the single index model. Simulation experiments compare the two estimators of the error distribution function at different sample sizes; the results show that the kernel distribution function performs better than the empirical distribution function.
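The two estimators being compared can be sketched as follows, applied to model residuals. This is a generic illustration, not the paper's implementation: the Gaussian kernel and the Silverman-type bandwidth rule are assumptions.

```python
import numpy as np
from math import erf, sqrt

def ecdf(residuals, t):
    """Empirical distribution function of the residuals, evaluated at t."""
    r = np.asarray(residuals, dtype=float)
    return float(np.mean(r <= t))

def kernel_cdf(residuals, t, h=None):
    """Smoothed (kernel) distribution function estimate: the average of
    Phi((t - e_i)/h), the integrated Gaussian kernel. The bandwidth rule
    here is an illustrative Silverman-type choice, not the paper's."""
    r = np.asarray(residuals, dtype=float)
    if h is None:
        h = 1.06 * r.std(ddof=1) * len(r) ** (-1 / 5)
    z = (t - r) / h
    phi = np.array([0.5 * (1.0 + erf(zi / sqrt(2.0))) for zi in z])
    return float(phi.mean())
```

The empirical version is a step function, while the kernel version is smooth; the smoothing is what typically buys the kernel estimator its better finite-sample accuracy reported above.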
Geotechnical engineers have always been concerned with the stabilization of slopes. For this purpose,
various methods such as retaining walls, piles, and geosynthetics may be used to increase the safety factor of slopes prone to failure. The application of stone columns may be another potential alternative for slope stabilization; such columns have normally been used for cohesive soil improvement. Most slope analysis and design is based on a deterministic approach, i.e., a set of single-valued design parameters is adopted and a single-valued factor of safety (FOS) is determined. Usually the FOS is selected in view of the understanding and knowledge of the material parameters, the problem geometry, the method of analysis and the
Wind energy is one of the most common natural resources and plays a major role in the energy sector. Owing to the increasing demand to improve the efficiency of wind turbines and the development of the energy field, improvements have been made to design suitable wind turbines and obtain the highest possible energy efficiency from the wind. In this paper, a horizontal-axis wind turbine blade operating under low wind speed was designed using Blade Element Momentum (BEM) theory, where the design of the turbine rotor blade is a difficult task due to the calculations involved in the design process. To understand the behavior of the turbine blade, the QBlade program was used to design and simulate the rotor blade under working conditions. The design variables suc
In this paper, an enhanced artificial potential field (EAPF) planner is introduced. This planner is proposed to rapidly find online solutions to mobile robot path planning problems when the underlying environment contains obstacles of unknown locations and sizes. The classical artificial potential field combines the repulsive force due to each detected obstacle with the attractive force due to the target; these forces serve as the primary directional indicator for the mobile robot. However, the classical artificial potential field has many drawbacks, so we suggest two secondary forces, called the midpoint
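The classical primary forces described above can be sketched as follows; the paper's two secondary "midpoint" forces are not reproduced here, and the gains and influence distance are illustrative assumptions.

```python
import numpy as np

def apf_force(q, goal, obstacles, k_att=1.0, k_rep=100.0, d0=2.0):
    """Classical artificial potential field: an attractive force pulling the
    robot at position q toward the goal, plus repulsive forces from obstacles
    within influence distance d0. Gains k_att, k_rep and d0 are illustrative."""
    q = np.asarray(q, dtype=float)
    f = k_att * (np.asarray(goal, dtype=float) - q)      # attractive term
    for obs in obstacles:
        diff = q - np.asarray(obs, dtype=float)
        d = np.linalg.norm(diff)
        if 0 < d <= d0:                                   # inside influence zone
            # standard repulsive gradient, pointing away from the obstacle
            f += k_rep * (1.0 / d - 1.0 / d0) / d**3 * diff
    return f
```

The well-known drawbacks follow directly from this form: the forces can cancel (local minima), and narrow passages make the repulsive terms oscillate, which is what secondary corrective forces aim to fix.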
Spelling correction is considered a challenging task for resource-scarce languages. Arabic is one of these resource-scarce languages: it suffers from the absence of a large spelling correction dataset, so datasets injected with artificial errors are used to overcome this problem. In this paper, we trained the Text-to-Text Transfer Transformer (T5) model on artificial errors to correct Arabic soft spelling mistakes. Our T5 model corrects 97.8% of the artificial errors injected into the test set. Additionally, it achieves a character error rate (CER) of 0.77% on a set that contains real soft spelling mistakes. We achieved these results using a 4-layer T5 model trained with a 90% error inject
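The CER metric reported above is the character-level edit distance between the model output and the reference, divided by the reference length. A minimal self-contained version (function names are ours):

```python
def levenshtein(a, b):
    """Edit distance between strings a and b: minimum number of character
    insertions, deletions, and substitutions turning a into b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def cer(hypothesis, reference):
    """Character error rate: edit distance divided by reference length."""
    return levenshtein(hypothesis, reference) / len(reference)
```

A CER of 0.77% therefore means fewer than one character edit per hundred reference characters is needed to recover the correct text.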
The 3-parameter Weibull distribution is used as a failure model, since this distribution is appropriate when the failure rate is somewhat high at the start of operation and decreases with increasing time.
On the practical side, a comparison was made between shrinkage and maximum likelihood estimators for the parameters and the reliability function using simulation. Based on the statistical measures MAPE and MSE over different sample sizes, we conclude that the shrinkage estimators are better for the parameters, while the maximum likelihood estimator is better for the reliability function.
Note: ns = small sample; nm = median sample.
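The reliability function being estimated above can be written down directly for the 3-parameter Weibull model; the sketch below uses the usual shape/scale/location naming convention, which may differ from the paper's notation.

```python
import math

def weibull3_reliability(t, shape, scale, location):
    """Reliability R(t) = exp(-(((t - location)/scale) ** shape))
    for the 3-parameter Weibull model, with R(t) = 1 for t <= location."""
    if t <= location:
        return 1.0
    return math.exp(-(((t - location) / scale) ** shape))

def weibull3_hazard(t, shape, scale, location):
    """Hazard rate h(t) = (shape/scale) * z**(shape-1) with
    z = (t - location)/scale; h decreases over time when shape < 1,
    matching the early-failure behaviour described above."""
    if t <= location:
        return 0.0
    z = (t - location) / scale
    return (shape / scale) * z ** (shape - 1)
```

A shape parameter below 1 gives exactly the "high failure rate at start-up, decreasing with time" regime that motivates this model choice.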
This study estimates the scale parameter, location parameter, and reliability function of the Extreme Value (EXV) distribution by two methods, namely:
- Maximum Likelihood Method (MLE).
- Probability Weighted Moments Method (PWM).
Simulation was used to generate samples of different sizes (n = 10, 25, 50, 100) for estimating the parameters and the reliability function, with fixed true values assigned to the parameters; the simulation experiments were replicated (RP = 1000).
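For the PWM method listed above, a standard textbook form of the estimators for the Gumbel (Extreme Value type I) distribution is sketched below; the paper's exact estimators and parameterization may differ, and the function names are ours.

```python
import numpy as np

EULER_GAMMA = 0.5772156649015329

def gumbel_pwm(sample):
    """Probability-weighted-moments estimators for the Gumbel distribution:
    scale beta = (2*b1 - b0) / ln 2 and location mu = b0 - gamma * beta,
    where b0 is the sample mean and b1 the first probability-weighted moment."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    b0 = x.mean()
    b1 = np.sum((np.arange(n) / (n - 1)) * x) / n   # unbiased-style b1
    beta = (2 * b1 - b0) / np.log(2)
    mu = b0 - EULER_GAMMA * beta
    return mu, beta

def gumbel_reliability(t, mu, beta):
    """R(t) = 1 - F(t) = 1 - exp(-exp(-(t - mu)/beta)) for the Gumbel CDF."""
    return 1.0 - np.exp(-np.exp(-(t - mu) / beta))
```

PWM estimators like these only need sorted data and sample moments, which is why they are popular for small extreme-value samples such as n = 10 or 25 above.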
This research aims to choose the appropriate probability distribution for the reliability analysis of an item, using operating- and stoppage-time data collected for the case study.
The choice of probability distribution is appropriate when the data lie on or close to the fitted line of the probability plot and pass a goodness-of-fit test.
Minitab 17 software was used for this purpose after arranging the collected data and entering it into the program.
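The visual "close to the fitted line" criterion above can be quantified as a probability-plot correlation coefficient. The sketch below checks an exponential candidate; the distribution choice and plotting positions are illustrative assumptions, not the case study's.

```python
import numpy as np

def ppcc_exponential(sample):
    """Probability-plot correlation coefficient against an exponential model:
    correlation between the sorted data and the theoretical quantiles at
    median-style plotting positions. Values near 1 mean the points lie
    close to the fitted line; clearly lower values suggest a poor fit."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    p = (np.arange(1, n + 1) - 0.5) / n     # plotting positions
    q = -np.log(1.0 - p)                    # exponential theoretical quantiles
    return float(np.corrcoef(x, q)[0, 1])
```

Comparing this coefficient across several candidate distributions mirrors what probability-plot tools such as Minitab's do when ranking distribution fits.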