Several problems in image compression must be addressed to make the process workable and more efficient. Much work has been done in the field of lossy image compression based on the wavelet transform and the Discrete Cosine Transform (DCT). In this paper, an efficient image compression scheme is proposed, based on a combined encoding transform scheme. It consists of the following steps: 1) a bi-orthogonal (tap 9/7) wavelet transform to split the image data into sub-bands, 2) a DCT to de-correlate the data, 3) scalar quantization of the combined transform stage's output, followed by mapping to positive values, and 4) LZW encoding to produce the compressed data. The peak signal-to-noise ratio (PSNR), compression ratio (CR), and compression gain (CG) measures were used to perform a comparative analysis of the performance of the whole system. Several test images were used to evaluate the performance behavior. The simulation results show the efficiency of these combined transforms when LZW is used for data compression. The compression outcomes are encouraging and show a significant reduction in image file size at good resolution.
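The four-stage pipeline can be sketched as follows. This is an illustrative sketch only, not the paper's implementation: a one-level Haar transform stands in for the 9/7 bi-orthogonal wavelet, the quantization step size is arbitrary, and the LZW coder is a minimal dictionary variant; all function names are hypothetical.

```python
import numpy as np

def haar2(x):
    # one-level 2-D Haar DWT (stand-in for the bi-orthogonal 9/7 wavelet)
    lo, hi = (x[:, 0::2] + x[:, 1::2]) / 2, (x[:, 0::2] - x[:, 1::2]) / 2
    x = np.hstack([lo, hi])
    lo, hi = (x[0::2, :] + x[1::2, :]) / 2, (x[0::2, :] - x[1::2, :]) / 2
    return np.vstack([lo, hi])                 # [LL LH; HL HH] sub-bands

def dct2(x):
    # separable orthonormal DCT-II to de-correlate the data (square input assumed)
    n = x.shape[0]
    k, m = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    c = np.sqrt(2 / n) * np.cos(np.pi * (2 * m + 1) * k / (2 * n))
    c[0] /= np.sqrt(2)
    return c @ x @ c.T

def quantize(x, step=4.0):
    # scalar quantization, then shift so every symbol is non-negative
    q = np.round(x / step).astype(int)
    return (q - q.min()).ravel().tolist()

def lzw_encode(symbols, alphabet):
    # classic dictionary-based LZW over integer symbols
    table = {(s,): i for i, s in enumerate(alphabet)}
    w, codes = (), []
    for s in symbols:
        ws = w + (s,)
        if ws in table:
            w = ws
        else:
            codes.append(table[w])
            table[ws] = len(table)
            w = (s,)
    if w:
        codes.append(table[w])
    return codes

def lzw_decode(codes, alphabet):
    # inverse mapping; the .get() fallback handles the classic KwKwK case
    table = {i: (s,) for i, s in enumerate(alphabet)}
    w = table[codes[0]]
    out = list(w)
    for c in codes[1:]:
        entry = table.get(c, w + (w[0],))
        out.extend(entry)
        table[len(table)] = w + (entry[0],)
        w = entry
    return out

def compress(img, step=4.0):
    # steps 1-4: DWT -> DCT -> quantize/shift -> LZW
    symbols = quantize(dct2(haar2(img)), step)
    alphabet = sorted(set(symbols))
    return lzw_encode(symbols, alphabet), symbols, alphabet
```

Only the quantization stage is lossy; the LZW back end round-trips exactly, which is easy to verify on a small test block.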
The research aims to evaluate the suppliers of Diyala General Electric Industries Company in an uncertain and fuzzy environment, where the company follows no particular evaluation system, and also to apply the traveling salesman problem to the transport of raw materials from the suppliers to the company in a fuzzy environment. Therefore, a system based on mathematical and quantitative methods was developed to evaluate the suppliers. A fuzzy inference system (FIS) and fuzzy set theory were used to solve this problem in Matlab, and the traveling salesman problem was solved in two stages, the first stage removing the fuzziness of the environment using the rank function method, w
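The two-stage idea can be sketched as follows. This is a hedged illustration: the ranking function (a + 2b + c)/4 for a triangular fuzzy number is one common choice, not necessarily the paper's, and since the abstract is truncated before the second stage is described, a simple nearest-neighbour heuristic is used purely for illustration.

```python
import numpy as np

def rank(tfn):
    # one common ranking function for a triangular fuzzy number (a, b, c)
    a, b, c = tfn
    return (a + 2 * b + c) / 4

def defuzzify(fuzzy_dist):
    # stage 1: replace each fuzzy travel distance with its crisp rank
    n = len(fuzzy_dist)
    d = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                d[i, j] = rank(fuzzy_dist[i][j])
    return d

def nearest_neighbour_tour(d, start=0):
    # stage 2 (illustrative heuristic): greedy tour over the crisp matrix
    n = d.shape[0]
    unvisited = set(range(n)) - {start}
    tour = [start]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: d[last, j])
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour + [start]                      # return to the depot
```

The output is a closed route that visits every supplier exactly once, computed on the defuzzified distance matrix.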
In this study, the mean free path and positron elastic and inelastic scattering are modeled for the elements hydrogen (H), carbon (C), nitrogen (N), oxygen (O), phosphorus (P), sulfur (S), chlorine (Cl), potassium (K), and iodine (I). Despite the enormous amounts of data required, the Monte Carlo (MC) method was applied, allowing for a very accurate simulation of positron interaction collisions in living cells. Here, the MC simulation of positron interactions with breast, liver, and thyroid tissue at normal incidence angles is reported, with energies ranging from 45 eV to 0.2 MeV. The model provides a straightforward analytic formula for the random sampling of positron scattering. The elemental composition data were compiled from ICRU Report 44. In this
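The standard analytic formula for randomly sampling the distance to the next collision in Monte Carlo particle transport is the inverse transform of the exponential free-path distribution, s = -λ ln(r). A minimal sketch (the mean-free-path value below is illustrative, not taken from the paper):

```python
import numpy as np

def sample_free_paths(mean_free_path, n, rng):
    # inverse-transform sampling of the exponential free-path distribution
    # used in Monte Carlo transport: s = -lambda * ln(r), r uniform on (0, 1]
    r = 1.0 - rng.random(n)          # shift to (0, 1] to avoid log(0)
    return -mean_free_path * np.log(r)
```

By construction, the sample mean of the drawn path lengths converges to the mean free path λ.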
Logistic regression is considered one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X). The dependent variable is a binary response that takes two values (one when a specific event occurs and zero when it does not), such as (injured and uninjured, married and unmarried). A large number of explanatory variables leads to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and the ridge regression method were used to estimate a binary-response logistic regression model by adopting the Jackna
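Maximum-likelihood estimation of a logistic model with a ridge penalty (which counters multicollinearity) can be sketched with a penalized Newton-Raphson iteration. This is an illustrative sketch, not the paper's estimator; the penalty λ and the toy data are assumptions, and the Jackknife resampling step is not shown.

```python
import numpy as np

def fit_ridge_logistic(X, y, lam=1.0, iters=30):
    # ridge-penalized maximum-likelihood logistic regression (Newton-Raphson):
    # maximize sum[y*eta - log(1 + e^eta)] - (lam/2) * ||beta||^2
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(iters):
        eta = np.clip(X @ beta, -30, 30)       # clip to avoid overflow in exp
        mu = 1.0 / (1.0 + np.exp(-eta))        # fitted probabilities
        w = mu * (1.0 - mu)                    # IRLS weights
        grad = X.T @ (y - mu) - lam * beta
        hess = (X.T * w) @ X + lam * np.eye(p)
        beta += np.linalg.solve(hess, grad)
    return beta
```

On data generated from a known linear rule, the fitted coefficients recover the decision boundary well, even with the shrinkage the penalty introduces.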
Radio observations of astronomical sources such as supernovae have become one of the most important sources of information about the physical properties of those objects. However, such radio observations are affected by various types of noise, such as noise from the sky, the background, the receiver, and the system itself. Therefore, it is essential to eliminate or reduce this undesired noise from the signals in order to ensure accurate measurement and analysis of radio observations. One of the most commonly used methods for reducing the noise is a noise calibrator. In this study, the 3-m Baghdad University Radio Telescope (BURT) was used to observe the Crab Nebula with and without a calibration unit in order to investigate its impact on the sign
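A calibration unit of this kind is typically used via the standard Y-factor relation: the receiver power is recorded with the noise source on and off, and the system temperature follows from the ratio. A minimal sketch, assuming a noise diode of known equivalent temperature T_cal (the numbers are illustrative, not BURT measurements):

```python
def system_temperature(p_cal_on, p_cal_off, t_cal):
    # standard Y-factor relation for a noise-diode calibrator:
    #   Y = P_on / P_off,   T_sys = T_cal / (Y - 1)
    y = p_cal_on / p_cal_off
    return t_cal / (y - 1.0)
```

For example, if switching a 10 K calibrator on raises the detected power by 50%, the implied system temperature is 20 K.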
Web testing is a very important method for users and developers because it gives the ability to detect errors in applications and to check their quality in serving users: performance, user interface, security, and the other types of testing a web application may require. This paper focuses on a major branch of performance testing called load testing. Load testing depends on two important elements: request time and response time. From these elements, it can be decided whether the performance of a web application is good or not. In the experimental results, load testing was applied to the website (http://ihcoedu.uobaghdad.edu.iq), covering the main home page and all the science department pages. In t
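The request-time/response-time idea can be sketched with a minimal timing harness. This is a hedged illustration only, not the paper's tool: `request_fn` would wrap an HTTP GET of the page under test (e.g. via `urllib.request.urlopen`), and the threshold is an assumed acceptance criterion.

```python
import time

def load_test(request_fn, n_requests=20, max_avg_seconds=2.0):
    # fire n_requests sequentially, recording each request's response time;
    # request_fn is any zero-argument callable that performs one request
    times = []
    for _ in range(n_requests):
        t0 = time.perf_counter()
        request_fn()
        times.append(time.perf_counter() - t0)
    avg = sum(times) / len(times)
    # the performance verdict compares the average response time to a threshold
    return avg, avg <= max_avg_seconds
```

A real load test would also issue the requests concurrently to simulate many simultaneous users; sequential timing is kept here for brevity.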
Perchloroethylene (PERC) is commonly used as a dry-cleaning solvent and is associated with many deleterious effects on biological systems. The study aimed to investigate the harmful effects associated with PERC exposure among dry-cleaning workers. The study was carried out on 58 adults in two groups: a PERC-exposed group of thirty-two male dry-cleaning workers using PERC as a dry-cleaning solvent, and twenty-six healthy non-exposed subjects. The history of PERC exposure, use of personal protective equipment (PPE), and safety measures of the exposed group were recorded. A blood sample was taken from each participant for measurement of hematological markers and liver and kidney function tests. The results showed that 28.1% of the workers were usin
Metal oxide nanoparticles, including iron oxide, are regarded as one of the most important classes of nanomaterials for a wide range of applications due to their optical, magnetic, and electrical properties. Iron oxides are common compounds, widespread in nature, and easily synthesized in the laboratory. In this paper, iron oxide nanoparticles were prepared by co-precipitation of Fe²⁺ and Fe³⁺ ions, using iron (II and III) sulfate as the precursor material and NH4OH solution as the solvent, at 90°C. After synthesis, the iron oxide particles were characterized using X-ray diffraction (XRD), Fourier-transform infrared spectroscopy (FTIR), and scanning electron microscopy (SEM). These tests confirmed the obtaining o