In this paper, an algorithm for binary codebook design is used in a vector quantization technique to improve the acceptability of the absolute moment block truncation coding (AMBTC) method. Vector quantization (VQ) is used to compress the bitmap produced by the first stage (AMBTC). The binary codebook is generated for many images by randomly choosing code vectors from a set of binary image vectors, and this codebook is then used to compress the bitmaps of all these images. The bitmap of an image is selected for compression with this codebook based on the criterion of the average bitmap replacement error (ABPRE). The proposed approach reduces bit rates (increases compression ratios) with only a small reduction in performance (PSNR).
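As context for the bitmap that the VQ stage compresses, AMBTC encodes each image block as a binary bitmap plus two reconstruction levels. The following is a minimal sketch (function names are ours, not the paper's); the paper's contribution, replacing each bitmap with the nearest codeword from the binary codebook, would operate on the `bitmap` produced here.

```python
import numpy as np

def ambtc_encode(block):
    """AMBTC: threshold the block at its mean to get a 1-bit-per-pixel
    bitmap, and keep the means of the two resulting pixel groups."""
    m = block.mean()
    bitmap = block >= m
    hi = block[bitmap].mean() if bitmap.any() else m
    lo = block[~bitmap].mean() if (~bitmap).any() else m
    return bitmap, lo, hi

def ambtc_decode(bitmap, lo, hi):
    """Reconstruct the block: high mean where the bit is 1, low mean where 0."""
    return np.where(bitmap, hi, lo)

# toy 4x4 block with a dark and a bright region
block = np.array([[10, 12, 200, 210],
                  [11, 13, 205, 220],
                  [ 9, 14, 198, 215],
                  [12, 10, 202, 212]], dtype=float)
bm, lo, hi = ambtc_encode(block)
rec = ambtc_decode(bm, lo, hi)
```

Each block thus costs one bit per pixel plus two scalars; compressing the bitmaps further with a shared binary codebook is what lowers the bit rate below plain AMBTC.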
In this paper, we investigate two stress-strength models (Bounded and Series) in systems reliability based on the Generalized Inverse Rayleigh distribution. Shrinkage estimators are obtained using Bayesian methods under informative and non-informative assumptions. The presented methods are compared through Monte Carlo simulation based on the mean squared error (MSE) criterion.
The flexible job-shop scheduling problem (FJSP) is one of the problem instances arising in flexible manufacturing systems. It is considered very complex to control; hence, generating a control system for this problem domain is difficult. FJSP inherits the characteristics of the job-shop scheduling problem and adds a decision level to the sequencing one: each operation may be processed on any machine among a set of available machines at a facility. In this article, we present an Artificial Fish Swarm Algorithm with Harmony Search for solving the flexible job-shop scheduling problem. It is based on a new harmony improvised from the results obtained by the artificial fish swarm algorithm. This improvised solution is then compared with the overall best solution.
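The Harmony Search component mentioned above improvises a new candidate from a memory of good solutions. The abstract does not specify the FJSP encoding, so the sketch below uses a generic integer-coded variable vector with the three standard HS moves (memory consideration, pitch adjustment, random selection); parameter names `hmcr`, `par`, and `bw` are the usual HS conventions, not values from the paper.

```python
import random

def improvise(harmony_memory, hmcr=0.9, par=0.3, bw=1):
    """One Harmony Search improvisation step.

    Each decision variable is drawn from a stored harmony with probability
    hmcr (memory consideration), then nudged by +/-bw with probability par
    (pitch adjustment); otherwise it is chosen at random."""
    dim = len(harmony_memory[0])
    new = []
    for j in range(dim):
        if random.random() < hmcr:
            value = random.choice(harmony_memory)[j]   # memory consideration
            if random.random() < par:
                value += random.choice([-bw, bw])      # pitch adjustment
        else:
            value = random.randint(0, 9)               # random selection
        new.append(value)
    return new

random.seed(0)
memory = [[3, 1, 4], [2, 7, 1], [5, 5, 5]]
h = improvise(memory)
```

In the hybrid described by the paper, the memory would be seeded with solutions produced by the artificial fish swarm algorithm rather than at random.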
Image segmentation using bi-level thresholds works well for straightforward scenarios; however, complex images that contain multiple objects or colors present considerable computational difficulties. Multi-level thresholding is crucial in these situations, but it also introduces a challenging optimization problem. This paper presents an improved Reptile Search Algorithm (RSA) that includes a Gbest operator to enhance its performance. The proposed method determines optimal threshold values for both grayscale and color images, utilizing entropy-based objective functions derived from the Otsu and Kapur techniques. Experiments were carried out on 16 benchmark images, including both grayscale and color images.
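The Otsu criterion the paper optimizes is standard; as a minimal illustration, here is the plain bi-level version, which selects the threshold maximizing between-class variance. The multi-level problem the RSA tackles optimizes the same kind of objective over several thresholds at once, which is why exhaustive search becomes impractical and a metaheuristic is used instead.

```python
import numpy as np

def otsu_threshold(gray):
    """Bi-level Otsu: exhaustively pick the threshold that maximizes
    the between-class variance of the intensity histogram."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()
    levels = np.arange(256)
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()      # class probabilities
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (levels[:t] * p[:t]).sum() / w0  # class means
        mu1 = (levels[t:] * p[t:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2       # between-class variance
        if var > best_var:
            best_t, best_var = t, var
    return best_t

# toy bimodal "image": a dark cluster at 30 and a bright cluster at 200
img = np.concatenate([np.full(500, 30), np.full(500, 200)])
t = otsu_threshold(img)
```

With k thresholds the search space grows combinatorially, which is the optimization problem the Gbest-enhanced RSA addresses.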
In regression testing, test case prioritization (TCP) is a technique for ordering the available test cases. TCP techniques can improve fault detection performance, which is measured by the average percentage of faults detected (APFD). History-based TCP is one of the TCP techniques that considers past execution data to prioritize test cases. Assigning equal priority to multiple test cases is a common problem for most TCP techniques, but it has not been explored in history-based TCP techniques; to resolve such ties, most researchers resort to random ordering of the tied test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement
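The APFD metric mentioned above has a standard closed form: APFD = 1 - (TF1 + ... + TFm)/(n*m) + 1/(2n), where n is the number of tests, m the number of faults, and TFi the 1-based position of the first test that detects fault i. A small sketch with a hypothetical test suite (the test and fault names are illustrative only):

```python
def apfd(order, detects, n_faults):
    """APFD = 1 - sum(TF_i)/(n*m) + 1/(2n), where TF_i is the 1-based
    position in `order` of the first test detecting fault i."""
    n = len(order)
    tf_sum = 0
    for fault in range(n_faults):
        for pos, tc in enumerate(order, start=1):
            if fault in detects[tc]:
                tf_sum += pos
                break
    return 1 - tf_sum / (n * n_faults) + 1 / (2 * n)

# hypothetical suite: which faults each test detects
detects = {"t1": {0}, "t2": {1, 2}, "t3": set()}
good = apfd(["t2", "t1", "t3"], detects, 3)  # strong order first
bad  = apfd(["t3", "t1", "t2"], detects, 3)  # weak order first
```

An order that detects faults earlier yields a higher APFD, which is why tie-breaking among equal-priority tests can change the measured effectiveness.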
Single and binary competitive sorption of phenol and p-nitrophenol onto clay modified with a quaternary ammonium salt (hexadecyltrimethyl ammonium) was investigated to obtain the adsorption isotherm constants for each solute. The modified clay was prepared by blending local bentonite with the quaternary ammonium salt. The organoclay was characterized by cation exchange capacity and surface area. The results show that p-nitrophenol is adsorbed faster than phenol. The experimental data for each solute were fitted well by the Freundlich isotherm model for the single-solute system and by the combined Freundlich-Langmuir model for the binary system.
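The single-solute Freundlich fit mentioned above is conventionally done by linearizing q_e = K_f * C_e^(1/n) to ln(q_e) = ln(K_f) + (1/n)·ln(C_e) and regressing. A minimal sketch on synthetic data (the numbers are illustrative, not the paper's measurements):

```python
import numpy as np

def fit_freundlich(C_e, q_e):
    """Fit q_e = K_f * C_e**(1/n) by linear regression on
    ln(q_e) = ln(K_f) + (1/n) * ln(C_e); returns (K_f, n)."""
    slope, intercept = np.polyfit(np.log(C_e), np.log(q_e), 1)
    return np.exp(intercept), 1.0 / slope

# synthetic equilibrium data generated with K_f = 2.0, n = 2.5
C = np.array([0.5, 1.0, 2.0, 5.0, 10.0])   # equilibrium concentration
q = 2.0 * C ** (1 / 2.5)                   # equilibrium uptake
K_f, n = fit_freundlich(C, q)
```

For the binary system, a competitive extension (e.g. a combined Freundlich-Langmuir form) adds the co-solute concentration to each solute's uptake expression.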
In this research, kernel estimation methods (nonparametric density estimators) were relied upon in estimating the two-response logistic regression, comparing the Nadaraya-Watson method with the local scoring algorithm. The optimal smoothing parameter (bandwidth) λ was estimated by cross-validation and generalized cross-validation; the optimal λ has a clear effect on the estimation process and plays a key role in smoothing the curve toward the real curve. The goal of using the kernel estimator is to adjust the observations so that we obtain estimators whose characteristics are close to the properties of the real parameters. The methods were applied to medical data for patients with a chronic condition.
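The Nadaraya-Watson estimator compared above is a kernel-weighted average of the responses; the bandwidth λ controls how far that average reaches. A minimal sketch with a Gaussian kernel (the data are illustrative, not the study's):

```python
import numpy as np

def nadaraya_watson(x_query, x, y, bandwidth):
    """Nadaraya-Watson regression: average of the responses y, weighted
    by a Gaussian kernel of the distance from x_query; the bandwidth is
    the smoothing parameter selected by (generalized) cross-validation."""
    w = np.exp(-0.5 * ((x_query - x) / bandwidth) ** 2)
    return (w * y).sum() / w.sum()

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.0, 1.0, 0.0, 1.0, 0.0])   # e.g. binary (two-response) outcomes
p = nadaraya_watson(1.0, x, y, bandwidth=0.5)
```

With binary responses the fitted value is a smoothed probability in [0, 1]; a larger bandwidth flattens the curve, a smaller one follows the observations more closely.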
Judo wrestling is one of the sports that has seen great development around the world in recent years, and it requires special physical preparation: physical aptitude is determined by the functional efficiency of the heart and lungs. Physical efficiency is closely linked to the player's performance capacity, as physical aptitude plays an important role in controlling physical and skill aspects during training and competition. The study aims to determine the effect of anaerobic endurance training according to average intervals (30-60 sec) on the development of physical aptitude for judo players. An experimental method was used on a sample of national-team judo wrestlers, numbering 16 players.
The undetected error probability is an important measure for assessing the communication reliability provided by any error-coding scheme. Two error-coding schemes, Joint crosstalk avoidance and Triple Error Correction (JTEC) and JTEC with Simultaneous Quadruple Error Detection (JTEC-SQED), provide both crosstalk reduction and multi-bit error correction/detection features. The available undetected-error-probability model yields an upper-bound value that does not give an accurate estimate of the reliability provided. This paper presents an improved mathematical model to estimate the undetected error probability of these two joint coding schemes. According to the decoding algorithm, the errors are classified into patterns and their decoding