In this study, a fast block matching search algorithm based on block descriptors and multilevel block filtering is introduced. The descriptors used are the mean and a set of centralized low-order moments. Hierarchical filtering and the MAE similarity measure were adopted to nominate the most similar blocks lying within the pool of neighboring blocks. As a step following block nomination, the similarity of the mean and moments is used to classify the nominated blocks into one of three sub-pools, each representing a certain nomination priority level (i.e., most, less, and least). The main reason for introducing the nomination and classification steps is a significant reduction in the number of matching instances of the pixels belonging to the c
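A minimal sketch of the two-stage idea, assuming a single mean descriptor and an illustrative tolerance (the paper also uses low-order moments and three priority sub-pools, which are omitted here):

```python
import numpy as np

def mae(a, b):
    """Mean absolute error between two equally sized blocks."""
    return np.mean(np.abs(a.astype(float) - b.astype(float)))

def filtered_block_match(target, candidates, mean_tol=10.0):
    """Two-stage match: filter candidates by a cheap mean descriptor,
    then pick the surviving candidate with the lowest MAE."""
    t_mean = target.mean()
    # Stage 1: descriptor filter (block mean) prunes the candidate pool.
    survivors = [c for c in candidates if abs(c.mean() - t_mean) <= mean_tol]
    if not survivors:
        survivors = list(candidates)  # fall back to a full search
    # Stage 2: exact MAE comparison only on the reduced pool.
    return min(survivors, key=lambda c: mae(target, c))
```

The pruning step is what cuts the number of per-pixel matching instances: only blocks whose descriptors are already close pay the full MAE cost.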
To maintain the security and integrity of data, with the growth of the Internet and the increasing prevalence of transmission channels, it is necessary to strengthen security and develop several algorithms. The Playfair cipher is a substitution scheme. The traditional Playfair scheme uses a small 5*5 matrix containing only uppercase letters, making it vulnerable to hackers and cryptanalysis. In this study, a new encryption and decryption approach is proposed to enhance the resistance of the Playfair cipher. For this purpose, the development of symmetric cryptography based on shared secrets is desired. The proposed Playfair method uses a 5*5 keyword matrix for English and a 6*6 keyword matrix for Arabic to encrypt the alphabets of
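For reference, the traditional 5*5 English Playfair scheme that the abstract builds on can be sketched as follows (classical rules only, not the proposed Arabic/extended variant):

```python
def build_matrix(key):
    """Build the classical 5x5 Playfair matrix (I and J share a cell)."""
    seen = []
    for ch in (key + "ABCDEFGHIKLMNOPQRSTUVWXYZ").upper():
        ch = "I" if ch == "J" else ch
        if ch.isalpha() and ch not in seen:
            seen.append(ch)
    return [seen[i:i + 5] for i in range(0, 25, 5)]

def encrypt(plaintext, key):
    m = build_matrix(key)
    pos = {m[r][c]: (r, c) for r in range(5) for c in range(5)}
    # Prepare digraphs: split doubled letters and pad odd length with 'X'.
    letters = [("I" if c == "J" else c) for c in plaintext.upper() if c.isalpha()]
    pairs, i = [], 0
    while i < len(letters):
        a = letters[i]
        has_partner = i + 1 < len(letters) and letters[i + 1] != a
        b = letters[i + 1] if has_partner else "X"
        pairs.append((a, b))
        i += 2 if has_partner else 1
    out = []
    for a, b in pairs:
        ra, ca = pos[a]
        rb, cb = pos[b]
        if ra == rb:                       # same row: take letters to the right
            out += [m[ra][(ca + 1) % 5], m[rb][(cb + 1) % 5]]
        elif ca == cb:                     # same column: take letters below
            out += [m[(ra + 1) % 5][ca], m[(rb + 1) % 5][cb]]
        else:                              # rectangle: swap columns
            out += [m[ra][cb], m[rb][ca]]
    return "".join(out)
```

The small fixed alphabet and rigid digraph rules are exactly what makes this classical form weak against cryptanalysis, which motivates the keyword-matrix extensions proposed in the study.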
Survival analysis is the analysis of data that take the form of times from a time origin until the occurrence of an end event. In medical research, the time origin is the date of registration of the individual or patient in a study, such as clinical trials comparing two or more types of medicine, where the endpoint is the death of the patient or the disappearance of the individual. The data resulting from this process are called survival times; if the end event is not death, the resulting data are called time-to-event data. That is, survival analysis comprises the statistical steps and procedures for analyzing data when the adopted variable is the time to an event, and this time could be d
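The standard Kaplan-Meier estimator is a common starting point for such survival-time data, handling censored individuals (those who leave the study before the event); a plain-Python sketch, not code from the study:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times  : observed times (event or censoring)
    events : 1 if the event (e.g. death) occurred, 0 if censored
    Returns a list of (time, S(t)) steps at each event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        d = sum(1 for tt, e in data if tt == t and e == 1)  # events at t
        c = sum(1 for tt, e in data if tt == t)             # all leaving at t
        if d > 0:
            s *= 1.0 - d / n_at_risk   # survival drops only at event times
            curve.append((t, s))
        n_at_risk -= c
        i += c
    return curve
```

Each factor `1 - d/n` is the conditional probability of surviving past time `t` given survival up to it; censored subjects reduce the risk set without lowering the curve.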
Reverse Osmosis (RO) has already proved its worth as an efficient treatment method in chemical and environmental engineering applications. Various successful RO attempts at the rejection of organic and highly toxic pollutants from wastewater can be found in the literature over the last decade. Dimethylphenol is classified as a highly toxic organic compound found ubiquitously in wastewater. It poses a real threat to humans and the environment even at low concentrations. In this paper, a model-based framework was developed for the simulation and optimisation of the RO process for the removal of dimethylphenol from wastewater. We incorporated our earlier developed and validated process model into the Species Conserving Genetic Algorithm (SCG
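The SCGA itself is beyond a short sketch, but a plain real-coded genetic algorithm illustrates the optimisation loop such a framework wraps around a process model; the operators, parameters, and toy objective below are generic illustrations, not the paper's:

```python
import random

def genetic_minimise(obj, bounds, pop_size=30, gens=60, mut=0.1, seed=2):
    """Plain real-coded GA (tournament selection, blend crossover,
    Gaussian mutation). The paper's Species Conserving GA additionally
    preserves niches around species seeds; that refinement is omitted."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            a, b = rng.sample(pop, 2)
            p1 = a if obj(a) < obj(b) else b          # tournament pick 1
            a, b = rng.sample(pop, 2)
            p2 = a if obj(a) < obj(b) else b          # tournament pick 2
            child = 0.5 * (p1 + p2)                   # blend crossover
            if rng.random() < mut:
                child += rng.gauss(0, 0.1 * (hi - lo))
            nxt.append(min(max(child, lo), hi))       # clip to bounds
        pop = nxt
    return min(pop, key=obj)
```

In a model-based framework, `obj` would evaluate the validated RO process model (e.g. rejection of dimethylphenol versus operating cost) rather than a toy function.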
A set of hydrotreating experiments is carried out on vacuum gas oil in a trickle bed reactor to study hydrodesulfurization and hydrodenitrogenation based on two model compounds, carbazole (a non-basic nitrogen compound) and acridine (a basic nitrogen compound), which are added at 0–200 ppm to the tested oil, and dibenzothiophene is used as a sulfur model compound at 3,000 ppm over commercial CoMo/Al2O3 and prepared PtMo/Al2O3. The impregnation method is used to prepare the (0.5% Pt) PtMo/Al2O3. The basic sites are found to be very small, and the two catalysts exhibit good metal-support interaction. In the absence of nitrogen compounds over the tested catalysts in the trickle bed reactor at temperatures of 523 to 573 K, liquid hourly space v
Many authors have investigated the problem of the early visibility of the new crescent moon after conjunction and have proposed many criteria addressing this issue in the literature. This article presents a proposed criterion for early crescent moon sighting based on the performance of a deep-learned pattern-recognizing artificial neural network (ANN). Moon sighting datasets were collected from various sources and used to train the ANN. The new criterion relies on the crescent width and the arc of vision from the edge of the crescent's bright limb. The result of the criterion is a control value indicating the moon's visibility condition, which separates the datasets into four regions: invisible, telescope only, probably visible, and certai
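A hedged illustration of how a scalar control value could be mapped onto the four regions; the threshold values, and the completion of the fourth region's name as "certainly visible", are assumptions for the sketch, not the ANN-learned boundaries:

```python
def classify_crescent(control_value, thresholds=(0.2, 0.5, 0.8)):
    """Map a criterion control value onto four visibility regions.
    The thresholds here are illustrative placeholders only."""
    t_invisible, t_telescope, t_probable = thresholds
    if control_value < t_invisible:
        return "invisible"
    if control_value < t_telescope:
        return "telescope only"
    if control_value < t_probable:
        return "probably visible"
    return "certainly visible"
```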
The Hartley transform generalizes to the fractional Hartley transform (FRHT), which has various uses in different fields of image encryption. Unfortunately, the available literature on the fractional Hartley transform does not provide its inversion theorem, so the original function cannot be retrieved directly, which restricts its applications. The intention of this paper is to propose an inversion theorem for the fractional Hartley transform to overcome this drawback. Moreover, some properties of the fractional Hartley transform are discussed in this paper.
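For context, the ordinary discrete Hartley transform (the order-1 special case of the FRHT) is self-inverse up to a factor of 1/N, which is exactly the kind of inversion property the paper seeks for the fractional case:

```python
import numpy as np

def dht(x):
    """Discrete Hartley transform via its cas kernel,
    cas(theta) = cos(theta) + sin(theta). Applying it twice and
    dividing by N recovers the original sequence."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    k = np.arange(n)
    theta = 2 * np.pi * np.outer(k, k) / n
    cas = np.cos(theta) + np.sin(theta)
    return cas @ x
```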
Optimizing Access Point (AP) deployment plays a great role in wireless applications due to the need to provide efficient communication with low deployment costs. Quality of Service (QoS) is a significant parameter and objective to be considered along with AP placement, as well as the overall deployment cost. This study proposes and investigates a multi-level optimization algorithm called the Wireless Optimization Algorithm for Indoor Placement (WOAIP), based on Binary Particle Swarm Optimization (BPSO). WOAIP aims to obtain the optimum multi-floor AP placement with effective coverage, making it more capable of supporting QoS and cost-effectiveness. Five pairs (coverage, AP deployment) of weights, signal thresholds and received s
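A minimal BPSO sketch with a toy coverage-minus-cost fitness (the site coverages and per-AP cost in the test are invented for illustration; WOAIP's multi-floor objective is richer):

```python
import math
import random

def bpso(fitness, n_bits, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5):
    """Minimal binary PSO: each bit marks a candidate AP site on/off,
    and velocities pass through a sigmoid giving the probability that
    a bit is set. Maximizes the supplied fitness."""
    random.seed(1)
    X = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(n_particles)]
    V = [[0.0] * n_bits for _ in range(n_particles)]
    pbest = [x[:] for x in X]
    pbest_f = [fitness(x) for x in X]
    g = max(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(n_bits):
                r1, r2 = random.random(), random.random()
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (pbest[i][d] - X[i][d])
                           + c2 * r2 * (gbest[d] - X[i][d]))
                # Sigmoid of velocity = probability the bit is 1.
                X[i][d] = 1 if random.random() < 1 / (1 + math.exp(-V[i][d])) else 0
            f = fitness(X[i])
            if f > pbest_f[i]:
                pbest[i], pbest_f[i] = X[i][:], f
                if f > gbest_f:
                    gbest, gbest_f = X[i][:], f
    return gbest, gbest_f
```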
<p>In combinatorial testing development, the fabrication of covering arrays is the key challenge due to the multiple aspects that influence it. A wide range of combinatorial problems can be solved using metaheuristic and greedy techniques. Combining the greedy technique with a metaheuristic search technique such as hill climbing (HC) can produce feasible results for combinatorial tests. Methods based on metaheuristics are used to deal with tuples that may be left after redundancy removal by greedy strategies; the result is then assured to be near-optimal using a metaheuristic algorithm. As a result, the use of both greedy and HC algorithms in a single test generation system is a good candidate if constructed correctly. T
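A compact sketch of the HC stage for pairwise (t=2) covering arrays, assuming a fixed number of rows handed over by a greedy stage; the cost function counts pairwise tuples still uncovered:

```python
import itertools
import random

def uncovered_pairs(rows, k, v):
    """Pairwise (t=2) tuples not yet covered by the array's rows,
    for k factors each with v values."""
    need = {(i, j, a, b)
            for i, j in itertools.combinations(range(k), 2)
            for a in range(v) for b in range(v)}
    for row in rows:
        for i, j in itertools.combinations(range(k), 2):
            need.discard((i, j, row[i], row[j]))
    return need

def hill_climb(rows, k, v, steps=2000, seed=0):
    """Flip one random cell at a time, keeping moves that do not
    increase the number of uncovered pairwise tuples."""
    rng = random.Random(seed)
    best = len(uncovered_pairs(rows, k, v))
    while steps > 0 and best > 0:
        steps -= 1
        r, c = rng.randrange(len(rows)), rng.randrange(k)
        old = rows[r][c]
        rows[r][c] = rng.randrange(v)
        cost = len(uncovered_pairs(rows, k, v))
        if cost <= best:
            best = cost          # accept sideways and downhill moves
        else:
            rows[r][c] = old     # revert uphill moves
    return rows, best
```

Accepting sideways moves lets the climber drift across plateaus, which matters because covering-array cost landscapes are full of them.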
In this paper, an algorithm for binary codebook design is used in the vector quantization technique to improve the acceptability of the absolute moment block truncation coding (AMBTC) method. The vector quantization (VQ) method is used to compress the bitmap (the output produced by the first method, AMBTC). The binary codebook can be generated for many images by randomly choosing the code vectors from a set of binary image vectors, and this codebook is then used to compress all bitmaps of these images. The choice of the image bitmap to compress using this codebook is based on the criterion of the average bitmap replacement error (ABPRE). This approach is suitable for reducing bit rates
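The AMBTC stage that produces the bitmap being compressed can be sketched as follows (the VQ codebook step from the paper is omitted):

```python
import numpy as np

def ambtc_encode(block):
    """AMBTC: threshold the block at its mean, keeping a binary bitmap
    plus the means of the low and high pixel groups."""
    m = block.mean()
    bitmap = block >= m
    high = block[bitmap].mean()
    low = block[~bitmap].mean() if (~bitmap).any() else high
    return low, high, bitmap

def ambtc_decode(low, high, bitmap):
    """Rebuild the block: high mean where the bitmap is set, else low."""
    return np.where(bitmap, high, low)
```

The bitmap is one bit per pixel, so compressing it further with a shared binary VQ codebook, as the paper proposes, is where the extra bit-rate reduction comes from.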