Several problems must be solved in image compression to make the process practical and more efficient. Much work has been done in the field of lossy image compression based on the wavelet and Discrete Cosine Transform (DCT). In this paper, an efficient image compression scheme based on a combined encoding transform is proposed. It consists of the following steps: 1) a bi-orthogonal (9/7-tap) wavelet transform to split the image data into sub-bands, 2) a DCT to de-correlate the data, 3) scalar quantization of the combined transform stage's output, followed by mapping to positive values, and 4) LZW encoding to produce the compressed data. The peak signal-to-noise ratio (PSNR), compression ratio (CR), and compression gain (CG) measures were used to perform a comparative analysis of the performance of the whole system. Several test images were used to examine the performance behavior. The simulation results show the efficiency of these combined transforms when LZW is used for data compression. The compression outcomes are encouraging and show a significant reduction in image file size at good resolution.
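As an illustration of steps 1 to 4, a minimal sketch of the combined transform chain is shown below, assuming PyWavelets and SciPy. The 'bior4.4' (CDF 9/7) wavelet name, the quantization step size, and the toy LZW coder are illustrative choices, not the paper's exact settings.

```python
# Minimal sketch of the DWT -> DCT -> quantize -> LZW pipeline (illustrative only).
import numpy as np
import pywt
from scipy.fft import dctn

def compress(image, q=16):
    # 1) bi-orthogonal 9/7 wavelet transform splits the image into sub-bands
    LL, (LH, HL, HH) = pywt.dwt2(image.astype(float), 'bior4.4')
    # 2) DCT de-correlates each sub-band
    bands = [dctn(b, norm='ortho') for b in (LL, LH, HL, HH)]
    # 3) scalar quantization, then a shift maps coefficients to non-negative symbols
    coeffs = np.concatenate([np.rint(b / q).ravel() for b in bands]).astype(int)
    offset = coeffs.min()
    symbols = coeffs - offset
    # 4) LZW encoding of the symbol stream
    return lzw_encode(symbols.tolist()), offset

def lzw_encode(seq):
    # dictionary starts with every distinct symbol, then grows with observed phrases
    table = {(s,): i for i, s in enumerate(sorted(set(seq)))}
    out, phrase = [], ()
    for s in seq:
        if phrase + (s,) in table:
            phrase += (s,)
        else:
            out.append(table[phrase])
            table[phrase + (s,)] = len(table)
            phrase = (s,)
    if phrase:
        out.append(table[phrase])
    return out
```

The compression ratio can then be estimated by comparing the size of the LZW code stream with the raw image size.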
The goal of this research is to develop a numerical model that can be used to simulate the sedimentation process under two scenarios: first, when the flocculation unit is in service, and second, when the flocculation unit is out of service. The general equations of flow and sediment transport were solved using the finite difference method and then coded in Matlab. The result of this study was that the removal efficiencies of the coded model and the operational model were very close for each particle-size dataset, with a difference of +3.01%, indicating that the model can be used to predict the removal efficiency of a rectangular sedimentation basin. The study also revealed
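As a rough illustration of the kind of explicit finite-difference scheme described, the sketch below advects suspended particles through a rectangular basin while they settle, then reports a removal efficiency. The basin geometry, flow velocity, and settling velocity are hypothetical placeholders, not the study's operational data, and the real model solves the full flow and sediment transport equations.

```python
# Illustrative upwind finite-difference model of a rectangular sedimentation basin:
# horizontal advection at velocity u plus particle settling at velocity ws.
import numpy as np

def removal_efficiency(L=30.0, H=3.0, u=0.01, ws=0.001, nx=300, nz=30):
    dx, dz = L / nx, H / nz
    dt = 0.5 * min(dx / u, dz / ws)            # CFL-stable time step
    C = np.zeros((nz, nx))                     # suspended concentration field
    C_in = 1.0                                 # normalized inlet concentration
    for _ in range(int(5 * L / u / dt)):       # march to (approximate) steady state
        adv = u * (C - np.column_stack([np.full(nz, C_in), C[:, :-1]])) / dx
        settle = ws * (C - np.vstack([np.zeros(nx), C[:-1, :]])) / dz
        C = C - dt * (adv + settle)
        C[C < 0] = 0.0
    # removal efficiency: fraction of the inflow load not leaving at the outlet
    return 1.0 - C[:, -1].mean() / C_in
```

Particles crossing the bottom row are treated as deposited, so increasing the settling velocity (larger particle sizes) raises the predicted removal efficiency.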
Thirty adult New Zealand rabbits were used in this study; they were divided into two groups (control and treated with a Helium-Neon laser). A square skin flap was made on the medial aspect of the auricle on both sides, a square piece of cartilage was incised and peeled out from each auricle and fixed in the site of the other, and then the flaps were sutured. The operation site in the rabbits of the treated group was irradiated using a Helium-Neon laser with 5 mW power for 10 days, beginning directly after the operation. Three rabbits from each group were used for the collection of specimens for histopathological examination at 1, 2, 3, 4, and 6 weeks post-operation. The results revealed early invasion of the matrix with elastic fibers, which continue to t
Realizing and understanding semantic segmentation is a demanding task not only for computer vision but also in earth-science research. Semantic segmentation decomposes compound structures into single elements; the most common objects in civil outdoor or indoor scenes must be classified and then enriched with semantic information for every object. It is a method for labeling and clustering point clouds automatically. Classification of three-dimensional natural scenes requires a point-cloud dataset as the input data representation, and many challenges arise when working with 3D data, such as the small number, low resolution, and limited accuracy of three-dimensional datasets. Deep learning is now the po
Many academics have concentrated on applying machine learning to retrieve information from databases to enable researchers to perform better. A difficult issue in prediction models is the selection of practical strategies that yield satisfactory forecast accuracy. Traditional software testing techniques have been extended to the testing of machine learning systems; however, they are insufficient for the latter because of the diversity of problems that machine learning systems create. Hence, the proposed methodologies were used to predict flight prices. A variety of artificial intelligence algorithms are used to attain the required accuracy, including Bayesian modeling techniques as well as Stochastic Gradient Descent (SGD), Adaptive Boosting (ADA), and Decision Tree
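The excerpt does not include the implementation, but a minimal scikit-learn sketch of the kind of model comparison described might look as follows. The file name flights.csv, the price column, and the hyper-parameters are hypothetical; only the algorithm families (SGD, AdaBoost, decision trees) are taken from the text.

```python
# Illustrative comparison of regressors for flight-price prediction (placeholder data).
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import SGDRegressor
from sklearn.ensemble import AdaBoostRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics import r2_score

df = pd.read_csv("flights.csv")                      # hypothetical dataset
X = pd.get_dummies(df.drop(columns=["price"]))       # one-hot encode categorical features
y = df["price"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

scaler = StandardScaler().fit(X_train)
models = {
    "SGD": SGDRegressor(max_iter=2000),
    "AdaBoost": AdaBoostRegressor(n_estimators=200),
    "DecisionTree": DecisionTreeRegressor(max_depth=10),
}
for name, model in models.items():
    model.fit(scaler.transform(X_train), y_train)
    pred = model.predict(scaler.transform(X_test))
    print(f"{name}: R2 = {r2_score(y_test, pred):.3f}")
```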
As the exchange of private information in various communication applications has become widespread, securing it has become urgent. In this research, a new approach to encrypting text messages based on genetic algorithm operators is proposed. The proposed approach follows a new algorithm of generating an 8-bit chromosome to encrypt the plain text after randomly selecting a crossover point. The resulting child code is then flipped by one bit using the mutation operation. Two simulations were conducted to evaluate the performance of the proposed approach, including encryption/decryption execution time and throughput computations. The simulation results prove the robustness of the proposed approach, producing better performance for all evaluation metrics with res
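A minimal, hedged sketch of one possible reading of this scheme is given below: each plaintext byte is treated as an 8-bit chromosome, "crossover" exchanges the two segments around a randomly chosen point, and "mutation" flips one bit. How the paper actually combines the chromosome with key material is not specified in the excerpt, so here the chosen positions are simply kept as the shared secret.

```python
# Illustrative byte-level encryption using GA-style crossover and mutation operators.
# Note: random is used for brevity; a real implementation would need a CSPRNG.
import random

def encrypt_byte(p):
    point = random.randint(1, 7)                 # random crossover point
    low, high = p & ((1 << point) - 1), p >> point
    child = (low << (8 - point)) | high          # crossover: swap the two segments
    bit = random.randint(0, 7)
    child ^= (1 << bit)                          # mutation: flip one bit
    return child, (point, bit)

def decrypt_byte(c, key):
    point, bit = key
    c ^= (1 << bit)                              # undo the mutation
    low = c >> (8 - point)                       # recover the original segments
    high = c & ((1 << (8 - point)) - 1)
    return (high << point) | low

cipher, key = encrypt_byte(ord('A'))
assert chr(decrypt_byte(cipher, key)) == 'A'     # round trip check
```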
Mode filtering is one of the most desirable techniques in optical fiber communication systems, especially for multiple-input multiple-output (MIMO) coherent optical communications that suffer from mode-dependent losses in communication channels. In this work, a special type of optical fiber sensing head was used; it utilizes DCF13 fiber, made by Thorlabs, which has two numerical apertures (NAs): one for the core and first cladding region, while the second relates the first cladding to the second cladding. An etching process using 40% hydrofluoric (HF) acid was performed on the DCF13 for varying times in minutes. Investigation of the correlation between the degree of etching and the re
Honeywords are fake passwords that serve as an accompaniment to the real password, which is called a “sugarword.” The honeyword system is an effective password cracking detection system designed to easily detect password cracking in order to improve the security of hashed passwords. For every user, the password file of the honeyword system will have one real hashed password accompanied by numerous fake hashed passwords. If an intruder steals the password file from the system and successfully cracks the passwords while attempting to log in to users’ accounts, the honeyword system will detect this attempt through the honeychecker. A honeychecker is an auxiliary server that distinguishes the real password from the fake passwords and t
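A minimal sketch of the login-time check described is shown below (not the authors' implementation): the password file holds hashed sweetwords, and the honeychecker keeps only the index of the real one per user. The honeyword generator here, which tweaks the final character, is a simple placeholder.

```python
# Illustrative honeyword storage and honeychecker-based detection.
import hashlib, random

def h(pw):
    return hashlib.sha256(pw.encode()).hexdigest()

def make_sweetwords(sugarword, k=4):
    # fake passwords ("honeywords") plus the real one ("sugarword"), shuffled
    honeywords = [sugarword[:-1] + str(random.randint(0, 9)) for _ in range(k)]
    sweetwords = honeywords + [sugarword]
    random.shuffle(sweetwords)
    return [h(w) for w in sweetwords], sweetwords.index(sugarword)

class Honeychecker:
    # auxiliary server: stores only the index of the real password per user
    def __init__(self):
        self.index = {}
    def register(self, user, i):
        self.index[user] = i
    def check(self, user, i):
        if i == self.index[user]:
            return "login allowed"
        return "ALARM: honeyword submitted, possible password-file breach"

def login(user, attempt, hashed_sweetwords, checker):
    digest = h(attempt)
    if digest not in hashed_sweetwords:
        return "login denied"
    return checker.check(user, hashed_sweetwords.index(digest))
```

An attacker who cracks the password file sees several equally plausible hashes per account; submitting any of the fake ones triggers the alarm path above.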
Image segmentation can be defined as the process of partitioning a digital image into multiple meaningful regions, called segments, whose pixels share certain attributes that differ from the pixels constituting other parts. Two phases were followed by the researcher in this paper. First, pre-processing was applied to the images before the segmentation process using statistical confidence intervals, which can be used to estimate unknown observations, as suggested by Acho & Buenestado in 2018. Then, the second phase carries out the image segmentation process by applying Bernsen's thresholding technique to the output of the first phase. The researcher drew the conclusion that in case of utilizing
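A minimal sketch of Bernsen's local thresholding, the technique named for the second phase, is shown below, assuming SciPy. The window size, contrast limit, and global fallback threshold are common illustrative defaults, and the confidence-interval pre-processing of the first phase is not reproduced here.

```python
# Illustrative Bernsen local thresholding.
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter

def bernsen_threshold(gray, window=15, contrast_limit=15, global_thresh=128):
    gray = gray.astype(float)
    local_max = maximum_filter(gray, size=window)
    local_min = minimum_filter(gray, size=window)
    mid = (local_max + local_min) / 2.0          # local mid-gray value
    contrast = local_max - local_min
    # high-contrast neighbourhoods: threshold at the local mid-gray
    binary = gray > mid
    # low-contrast neighbourhoods: fall back to a single global threshold
    low = contrast < contrast_limit
    binary[low] = mid[low] > global_thresh
    return binary
```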
In this paper, a computational method for solving optimal control problems is presented, using an indirect method (a spectral technique) based on the Boubaker polynomials. In this method, the state and adjoint variables are approximated by Boubaker polynomials with unknown coefficients; thus, the optimal control problem is transformed into algebraic equations which can be solved easily, and then the numerical value of the performance index is obtained. The operational matrices of differentiation and integration have also been derived for the same polynomials to make solving the problems easier. A numerical example is given to show the applicability and efficiency of the method. Some characteristics of this polynomial which can be used for solving
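As a small illustration, the sketch below builds the Boubaker basis from its three-term recurrence (B0 = 1, B1 = x, B2 = x^2 + 2, and B_m = x B_{m-1} - B_{m-2} for m > 2) and evaluates it together with its derivatives on a grid. Assembling the full operational matrices and the optimal-control discretization is beyond this excerpt.

```python
# Illustrative construction of the Boubaker polynomial basis via its recurrence.
import numpy as np
from numpy.polynomial import polynomial as P

def boubaker_basis(n):
    # coefficient arrays in increasing powers of x
    basis = [np.array([1.0]), np.array([0.0, 1.0]), np.array([2.0, 0.0, 1.0])]
    for m in range(3, n + 1):
        basis.append(P.polysub(P.polymulx(basis[-1]), basis[-2]))
    return basis[: n + 1]

# example: evaluate the first 5 basis polynomials and their derivatives on a grid,
# the kind of data an operational matrix of differentiation would encode
basis = boubaker_basis(4)
x = np.linspace(0.0, 1.0, 101)
vals = np.array([P.polyval(x, b) for b in basis])
derivs = np.array([P.polyval(x, P.polyder(b)) for b in basis])
```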