A new algorithm is proposed to compress speech signals using the wavelet transform and linear predictive coding. Compression is based on selecting a small number of approximation coefficients produced by wavelet decomposition (Haar and db4) at a suitably chosen level, while the detail coefficients are discarded; the approximation coefficients are then windowed by a rectangular window and fed to the linear predictor. The Levinson-Durbin algorithm is used to compute the LP coefficients, reflection coefficients, and prediction error. The compressed files contain the LP coefficients and the previous sample, and are very small compared to the original signals. The compression ratio is calculated from the size of th
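The pipeline described here can be sketched as follows. This is a minimal illustration, assuming a one-level Haar decomposition and an autocorrelation-based Levinson-Durbin recursion; the decomposition level, frame length, and predictor order below are illustrative choices, not necessarily the paper's exact settings:

```python
import numpy as np

def haar_approx(x):
    """One-level Haar decomposition: keep only the approximation
    coefficients and discard the detail coefficients."""
    x = x[: len(x) // 2 * 2]                 # truncate to even length
    return (x[0::2] + x[1::2]) / np.sqrt(2)

def levinson_durbin(x, order):
    """Compute LP coefficients, reflection coefficients, and the final
    prediction error from the signal's autocorrelation sequence."""
    r = np.array([np.dot(x[: len(x) - k], x[k:]) for k in range(order + 1)])
    a = np.zeros(order + 1)
    a[0] = 1.0
    refl = np.zeros(order)
    err = r[0]
    for i in range(1, order + 1):
        # acc = r[i] + sum_{j=1..i-1} a[j] * r[i-j]
        acc = r[i] + np.dot(a[1:i], r[i - 1:0:-1])
        k = -acc / err
        refl[i - 1] = k
        # Update a[1..i] using the order-(i-1) coefficients in reverse.
        a[1:i + 1] = a[1:i + 1] + k * a[i - 1::-1][:i]
        err *= (1.0 - k * k)
    return a, refl, err

# Demo: compress one toy "speech" frame (illustrative parameters).
fs = 8000
t = np.arange(512) / fs
frame = np.sin(2 * np.pi * 440 * t)
approx = haar_approx(frame)                  # details are ignored
lp, refl, err = levinson_durbin(approx, order=8)
```

Only `lp` (plus the previous sample) would be stored, which is why the compressed file is far smaller than the original frame.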
The aim of this research is to study the surface alteration characteristics and surface morphology of superhydrophobic/hydrophobic nanocomposite coatings prepared by an electrospinning method to coat various materials such as glass and metal. This is considered a low-cost fabrication method for polymer solutions of polystyrene (PS), polymethylmethacrylate (PMMA), and silicone rubber (RTV). Solutions were prepared at various wt% compositions for each polymer. Contact angle, surface tension, viscosity, and roughness were measured for all specimens. SEM showed the morphology of the surfaces after coating. PS and PMMA showed superhydrophobic properties on the metal substrate, while Si showed hydroph
Orthogonal polynomials and their moments play a significant role in image processing and computer vision. One such family is the discrete Hahn polynomials (DHaPs), which are used for compression and feature extraction. However, when the moment order becomes high, they suffer from numerical instability. This paper proposes a fast approach for computing high-order DHaPs. The work exploits multithreading for the calculation of the Hahn polynomial coefficients: to make full use of the available processing capabilities, the independent calculations are divided among threads. The research provides a distribution method that achieves a more balanced processing burden among the threads. The proposed methods are tested for va
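One way to realize the thread-balancing idea described in this abstract is sketched below. The per-row work here is a placeholder (a simple power function), not the actual DHaP recurrence, and the cost model, greedy assignment, and thread count are assumptions for illustration only:

```python
import threading
import numpy as np

def row_cost(n):
    # Hypothetical cost model: higher polynomial orders are assumed
    # more expensive to compute.
    return n + 1

def balanced_partition(orders, n_threads):
    """Greedily assign each order to the currently lightest-loaded
    thread, so the processing burden is roughly balanced."""
    buckets = [[] for _ in range(n_threads)]
    loads = [0] * n_threads
    for n in sorted(orders, key=row_cost, reverse=True):
        i = loads.index(min(loads))
        buckets[i].append(n)
        loads[i] += row_cost(n)
    return buckets

def compute_rows(poly, orders, xs):
    # Stand-in for the coefficient recurrence: each "row" (order) is
    # independent, so rows can be filled concurrently without locking.
    for n in orders:
        poly[n, :] = xs ** n      # placeholder, not the DHaP recurrence

def parallel_fill(N, xs, n_threads=4):
    """Fill an N x len(xs) coefficient matrix using n_threads workers."""
    poly = np.zeros((N, len(xs)))
    buckets = balanced_partition(range(N), n_threads)
    threads = [threading.Thread(target=compute_rows, args=(poly, b, xs))
               for b in buckets]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return poly
```

Because each thread writes disjoint rows of the shared matrix, no synchronization is needed beyond the final `join`; the balancing step matters because higher orders cost more than lower ones.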
Three-dimensional (3D) image and medical image processing, which are considered big data analysis, have attracted significant attention during the last few years. Efficient 3D object recognition techniques could therefore benefit such image and medical image processing. To date, however, most of the proposed methods for 3D object recognition face a major challenge of high computational complexity: complexity and execution time grow as the dimensions of the object increase, which is the case in 3D object recognition. Therefore, finding an efficient method that obtains high recognition accuracy with low computational complexity is essentia
In this paper we study the Bayesian method using the modified exponential growth model, which is widely used to represent growth phenomena. We focus on three prior functions (informative, natural conjugate, and a function that depends on previous experiments) for use in the Bayesian method. Most observations of growth phenomena depend on one another, which leads to correlation between those observations; this problem is called autocorrelation, and the Bayesian method has been used to treat and verify it.
The goal of this study is to determine the effect of autocorrelation on estimation by the Bayesian method. F
The research aims to achieve a set of objectives, the most important of which is determining the extent to which the auditors of the research sample in the Federal Bureau of Financial Supervision adhere to the requirements of the quality control system according to Iraqi Audit Guide No. 7. The researcher also seeks to test the main research hypothesis and its sub-hypotheses. To achieve this, a questionnaire was designed in Google Forms and distributed electronically to the members of the research sample, and its results were analysed with the statistical package SPSS. In light of the applied