Image compression is an important problem in computer storage and transmission. It makes efficient use of the redundancy embedded within an image itself and may additionally exploit the limitations of human vision to discard imperceptible information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy embedded within the image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage technique is proposed. The first stage utilizes a lossy predictor model together with a multiresolution base and thresholding techniques; the second stage incorporates a near-lossless compression scheme on top of the first stage. The test results of both stages are promising, enhancing the performance of the traditional polynomial model in terms of compression ratio while preserving image quality.
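For illustration, the block-based modelling idea can be sketched as follows: fit a low-order polynomial to each block by least squares, keep the coefficients as the model part, and threshold the residual in the lossy stage. The block size, first-order model, and threshold below are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def polynomial_encode_block(block):
    """Fit a first-order polynomial (plane) to one image block and
    return the model coefficients plus the residual surface."""
    h, w = block.shape
    # Centred pixel coordinates inside the block.
    i, j = np.meshgrid(np.arange(h) - (h - 1) / 2,
                       np.arange(w) - (w - 1) / 2, indexing="ij")
    # Least-squares fit of y ~ a0 + a1*i + a2*j.
    A = np.column_stack([np.ones(block.size), i.ravel(), j.ravel()])
    coeffs, *_ = np.linalg.lstsq(A, block.ravel().astype(float), rcond=None)
    predicted = (A @ coeffs).reshape(h, w)
    return coeffs, block - predicted

def threshold_residual(residual, t):
    """Discard residual values smaller than t (the lossy stage)."""
    return np.where(np.abs(residual) < t, 0.0, residual)

# Toy usage on a single 4x4 block: model coefficients + thresholded residual.
block = np.arange(16, dtype=float).reshape(4, 4)
coeffs, residual = polynomial_encode_block(block)
kept = threshold_residual(residual, t=1.0)
reconstructed = (block - residual) + kept   # predicted surface + kept detail
```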
Wireless sensor applications are susceptible to energy constraints. Most of the energy is consumed in communication between wireless nodes. Clustering and data aggregation are the two widely used strategies for reducing energy usage and increasing the lifetime of wireless sensor networks. In target tracking applications, a large amount of redundant data is produced regularly. Hence, deployment of effective data aggregation schemes is vital to eliminate data redundancy. This work aims to conduct a comparative study of various research approaches that employ clustering techniques for efficiently aggregating data in target tracking applications, as the selection of an appropriate clustering algorithm may reflect positive results in the data aggregation process.
The study of vegetation change in cities is one of the most important studies related to human life because of its direct correlation with conditions that change over time. These include the economic problems that force people to move and look for job opportunities in the city, which increases the population density of cities, especially those with an important economic and administrative location such as the capital city of Baghdad. In this study, the effect of increasing population density on the urban planning of Baghdad city was analyzed. The decrease in vegetation was due to the expansion of urban areas on the outskirts of the city, which led to an increase in its area. Moreover, urban cities increased t…
As an important resource, entangled light sources have been used in developing quantum information technologies such as quantum key distribution (QKD). There are few experiments implementing entanglement-based deterministic QKD protocols, since the security of existing protocols may be compromised in lossy channels. In this work, we report on a loss-tolerant deterministic QKD experiment which follows a modified "Ping-Pong" (PP) protocol. The experimental results demonstrate for the first time that a secure deterministic QKD session can be fulfilled in a channel with an optical loss of 9 dB, based on a telecom-band entangled photon source. This exhibits a conceivable prospect of utilizing entangled light sources in real-life fiber-based…
In this article, a Convolutional Neural Network (CNN) is used to detect damage and no-damage images from satellite imagery using different classifiers. These classifiers are well-known models that are used with the CNN to detect and classify images using a specific dataset. The dataset used belongs to the Houston hurricane that caused several damages in the nearby areas. In addition, a transfer learning property is used to store the knowledge (weights) and reuse it in the next task. Moreover, each applied classifier is used to detect the images from the dataset after it is split into training, testing, and validation sets. The Keras library is used to apply the CNN algorithm with each selected classifier to detect the images. Furthermore, the performance…
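As a hedged sketch of the transfer-learning setup described (not the paper's exact pipeline), the snippet below freezes a pretrained base, with VGG16 standing in for one of the candidate classifiers, and trains a binary damage/no-damage head with Keras; the directory names and image size are hypothetical.

```python
import tensorflow as tf

# Frozen pretrained base; VGG16 is one illustrative choice of classifier.
base = tf.keras.applications.VGG16(weights="imagenet", include_top=False,
                                   input_shape=(128, 128, 3))
base.trainable = False  # transfer learning: reuse the stored weights

inputs = tf.keras.Input(shape=(128, 128, 3))
x = tf.keras.applications.vgg16.preprocess_input(inputs)
x = base(x, training=False)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
x = tf.keras.layers.Dense(64, activation="relu")(x)
outputs = tf.keras.layers.Dense(1, activation="sigmoid")(x)  # damage vs. no-damage
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Hypothetical folders, each with "damage" and "no_damage" subdirectories.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "train", image_size=(128, 128), label_mode="binary")
val_ds = tf.keras.utils.image_dataset_from_directory(
    "validation", image_size=(128, 128), label_mode="binary")
model.fit(train_ds, validation_data=val_ds, epochs=5)
```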
In this paper, the modified trapezoidal rule is presented for solving linear Volterra integral equations (VIEs) of the second kind, and we noticed that this procedure is effective in solving the equations. Two examples are given with their comparison tables to demonstrate the validity of the procedure.
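For reference, the standard trapezoidal scheme for a linear Volterra integral equation of the second kind, u(x) = f(x) + ∫_a^x K(x,t) u(t) dt, can be sketched as below; this is the classical rule, not the paper's modified variant.

```python
import numpy as np

def volterra_trapezoid(f, K, a, b, n):
    """Trapezoidal scheme for u(x) = f(x) + int_a^x K(x,t) u(t) dt."""
    x = np.linspace(a, b, n + 1)
    h = (b - a) / n
    u = np.zeros(n + 1)
    u[0] = f(x[0])                        # the integral vanishes at x = a
    for i in range(1, n + 1):
        s = 0.5 * K(x[i], x[0]) * u[0]    # trapezoid end weight
        s += sum(K(x[i], x[j]) * u[j] for j in range(1, i))
        # Solve for the implicit term, which carries weight h/2.
        u[i] = (f(x[i]) + h * s) / (1.0 - 0.5 * h * K(x[i], x[i]))
    return x, u

# Test problem: u(x) = 1 + int_0^x u(t) dt, exact solution exp(x).
x, u = volterra_trapezoid(lambda x: 1.0, lambda x, t: 1.0, 0.0, 1.0, 100)
print(np.max(np.abs(u - np.exp(x))))      # small discretization error
```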
Signal denoising is directly related to sample estimation of received signals, either by estimating the equation parameters for the target reflections or the surrounding noise and clutter accompanying the data of interest. Radar signals recorded using analogue or digital devices are not immune to noise. Random or white noise with no coherency is mainly produced in the form of random electrons and is caused by heat, the environment, and stray circuitry losses. These factors influence the output signal voltage, thus creating detectable noise. Differential Evolution (DE) is an effective, efficient, and robust optimisation method used to solve different problems in the engineering and scientific domains, such as in signal processing. This paper looks…
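A minimal sketch of how DE can serve this kind of parameter estimation is given below: a DE/rand/1/bin loop fits the amplitude, frequency, and phase of a hypothetical noisy sinusoidal return by minimising the mean squared error against the recorded samples. The signal model, bounds, and control parameters are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical received samples: a sinusoidal target return in white noise.
t = np.linspace(0.0, 1.0, 200)
clean = 1.5 * np.sin(2 * np.pi * 7.0 * t + 0.4)
noisy = clean + rng.normal(0.0, 0.3, t.size)

def cost(p):
    """Mean squared error between the parametric model and the noisy samples."""
    amp, freq, phase = p
    return np.mean((noisy - amp * np.sin(2 * np.pi * freq * t + phase)) ** 2)

def differential_evolution(cost, bounds, pop_size=30, F=0.8, CR=0.9, gens=200):
    """Basic DE/rand/1/bin: mutation, binomial crossover, greedy selection."""
    lo, hi = np.array(bounds).T
    pop = lo + rng.random((pop_size, lo.size)) * (hi - lo)
    fit = np.array([cost(ind) for ind in pop])
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = pop[rng.choice([k for k in range(pop_size) if k != i],
                                     size=3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            cross = rng.random(lo.size) < CR
            cross[rng.integers(lo.size)] = True      # keep at least one mutant gene
            trial = np.where(cross, mutant, pop[i])
            f_trial = cost(trial)
            if f_trial < fit[i]:                     # greedy selection
                pop[i], fit[i] = trial, f_trial
    return pop[np.argmin(fit)]

best = differential_evolution(cost, bounds=[(0.1, 5.0), (1.0, 20.0), (-np.pi, np.pi)])
denoised = best[0] * np.sin(2 * np.pi * best[1] * t + best[2])
```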
Rutting in asphalt mixtures is a very common type of distress. It occurs due to heavy applied loads and slow-moving traffic. Rutting needs to be predicted to avoid major deformation of the pavement. A simple linear viscous method is used in this paper to predict rutting in asphalt mixtures by using a multi-layer linear computer programme (BISAR). The material properties were derived from the Repeated Load Axial Test (RLAT) and represented by a strain-dependent axial viscosity. The axial viscosity was used in an incremental multi-layer linear viscous analysis to calculate the deformation rate during each increment, and therefore the overall development of rutting. The method has been applied for six mixtures and at different temperatures…
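The incremental idea, strain rate equal to the applied stress divided by the strain-dependent viscosity, accumulated over time steps, can be sketched as follows; the power-law viscosity and the constant axial stress are hypothetical stand-ins for the RLAT-derived values, and this is not the multi-layer BISAR analysis itself.

```python
import numpy as np

def rut_development(stress, viscosity_of_strain, dt, n_steps):
    """Accumulate permanent axial strain increment by increment:
    strain_rate = stress / viscosity(strain)."""
    strain = 0.0
    history = []
    for _ in range(n_steps):
        eta = viscosity_of_strain(strain)
        strain += (stress / eta) * dt
        history.append(strain)
    return np.array(history)

# Hypothetical strain-dependent axial viscosity (stand-in for an RLAT fit):
# stiffening power law with a small floor to avoid division issues at zero strain.
viscosity = lambda eps: 2.0e10 * max(eps, 1e-4) ** 0.3   # Pa*s
strain_history = rut_development(stress=100e3,            # 100 kPa axial stress
                                 viscosity_of_strain=viscosity,
                                 dt=1.0, n_steps=3600)
```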