The Hartley transform generalizes to the fractional Hartley transform (FRHT), which has found various uses in image encryption. Unfortunately, the available literature on the fractional Hartley transform does not provide its inversion theorem, so the original function cannot be retrieved directly, which restricts its applications. The intention of this paper is to propose an inversion theorem for the fractional Hartley transform to overcome this drawback. Moreover, some properties of the fractional Hartley transform are discussed in this paper.
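For context, the classical (integer-order) Hartley transform is self-inverse, and this is the property that an FRHT inversion theorem must generalize. In standard notation:

$$H(\omega) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} f(t)\,\operatorname{cas}(\omega t)\,dt, \qquad \operatorname{cas}(x) = \cos x + \sin x,$$

$$f(t) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} H(\omega)\,\operatorname{cas}(\omega t)\,d\omega.$$

Applying the same transform twice thus returns the original function; the fractional generalization replaces this kernel with an order-dependent one.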
A new algorithm is proposed to compress speech signals using the wavelet transform and linear predictive coding (LPC). Compression is based on selecting a small number of approximation coefficients produced by wavelet decomposition (Haar and db4) at a suitably chosen level while discarding the detail coefficients; the approximation coefficients are then windowed by a rectangular window and fed to the linear predictor. The Levinson-Durbin algorithm is used to compute the LP coefficients, reflection coefficients, and prediction error. The compressed files contain the LP coefficients and the previous sample; these files are very small compared to the original signals. The compression ratio is calculated from the size of th
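The Levinson-Durbin recursion referenced above solves the Toeplitz normal equations from the signal's autocorrelation sequence, yielding exactly the three quantities the abstract names: LP coefficients, reflection coefficients, and the prediction error. A minimal sketch (illustrative, not the paper's implementation):

```python
def levinson_durbin(r, order):
    """Levinson-Durbin recursion on autocorrelation values r[0..order].

    Returns (a, k, err): prediction-filter coefficients a (a[0] = 1),
    reflection coefficients k, and the final prediction error power.
    """
    a = [0.0] * (order + 1)
    a[0] = 1.0
    err = r[0]
    k = []
    for i in range(1, order + 1):
        # numerator of the i-th reflection coefficient
        acc = r[i]
        for j in range(1, i):
            acc += a[j] * r[i - j]
        ki = -acc / err
        k.append(ki)
        # symmetric coefficient update
        a_new = a[:]
        a_new[i] = ki
        for j in range(1, i):
            a_new[j] = a[j] + ki * a[i - j]
        a = a_new
        err *= (1.0 - ki * ki)
    return a, k, err
```

For an AR(1) process with autocorrelation r[m] = 0.5^m, the recursion recovers a single nonzero coefficient a[1] = -0.5 and the second reflection coefficient vanishes, as expected.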
Recently, financial mathematics has emerged to interpret and predict the underlying mechanisms that generate incidents of concern. A system of differential equations can reveal the dynamical development of a financial mechanism across time. A multivariate Wiener process represents the stochastic term in a system of stochastic differential equations (SDEs). The standard Wiener process is Markovian and a martingale, which makes it a good integrator. The fractional Wiener process, however, is not Markovian and hence is not a good integrator. This problem produces arbitrage (non-equilibrium in the market) in the predicted series, an undesired property that leads to erroneous conc
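As background, a path of an SDE driven by a standard Wiener process, such as geometric Brownian motion, can be simulated with the Euler-Maruyama scheme, where the Markov property appears as independent Gaussian increments. A minimal sketch (illustrative, not from the paper):

```python
import math
import random

def euler_maruyama_gbm(s0, mu, sigma, t, n, seed=0):
    """Simulate one path of dS = mu*S dt + sigma*S dW on [0, t] with n steps.

    The driving noise is a standard Wiener process: each increment dW is an
    independent N(0, dt) draw, which is what makes the path Markovian.
    """
    rng = random.Random(seed)
    dt = t / n
    s = s0
    path = [s]
    for _ in range(n):
        dw = rng.gauss(0.0, math.sqrt(dt))  # independent Wiener increment
        s += mu * s * dt + sigma * s * dw
        path.append(s)
    return path
```

A fractional Wiener process would instead use correlated increments, which is precisely why the standard Itô integration machinery (and the no-arbitrage argument built on the martingale property) no longer applies.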
In this paper, simulation studies and applications of the New Weibull-Inverse Lomax (NWIL) distribution are presented. In the simulation studies, different sample sizes (30, 50, 100, 200, 300, and 500) were considered, with 1,000 replications of the experiment. NWIL is a fat-tailed distribution; its higher moments are not easily derived except with some approximations. However, the estimates have high precision with low variances. Finally, the usefulness of the NWIL distribution was illustrated by fitting two data sets
Steganography is a means of hiding information within a more obvious form of communication. It exploits host data to conceal a piece of information in such a way that it is imperceptible to a human observer. The major goals of effective steganography are high embedding capacity, imperceptibility, and robustness. This paper introduces a scheme for hiding secret images that can be as large as 25% of the host image data. The proposed algorithm applies the orthogonal discrete cosine transform to the host image. A scaling factor (a) in the frequency domain controls the quality of the stego images. Experimental results of secret image recovery after applying JPEG coding to the stego images are included.
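The idea of scaling secret DCT coefficients by a factor `a` before adding them to the host's frequency coefficients can be sketched as below. This is an illustrative non-blind 1-D version (the paper works on 2-D images, and `embed`/`extract` are hypothetical names): hiding 2 secret samples in an 8-sample host corresponds to the 25% capacity mentioned above.

```python
import math

def dct(x):
    """Orthonormal DCT-II of a 1-D signal."""
    n = len(x)
    out = []
    for k in range(n):
        c = math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
        out.append(c * sum(x[i] * math.cos(math.pi * (i + 0.5) * k / n)
                           for i in range(n)))
    return out

def idct(coeffs):
    """Inverse of the orthonormal DCT-II."""
    n = len(coeffs)
    return [sum((math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n))
                * coeffs[k] * math.cos(math.pi * (i + 0.5) * k / n)
                for k in range(n))
            for i in range(n)]

def embed(host, secret, a=0.1):
    """Add a-scaled secret DCT coefficients to the host's high-frequency band."""
    H, S = dct(host), dct(secret)
    off = len(H) - len(S)
    for j, s in enumerate(S):
        H[off + j] += a * s
    return idct(H)

def extract(stego, host, secret_len, a=0.1):
    """Recover the secret; requires the original host (non-blind scheme)."""
    D, H = dct(stego), dct(host)
    off = len(H) - secret_len
    return idct([(D[off + j] - H[off + j]) / a for j in range(secret_len)])
```

A smaller `a` degrades the host less but makes the hidden data more fragile under JPEG quantization, which is the trade-off the scaling factor controls.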
Enzymatic hydrolysis of lignocellulosic biomass is difficult because of the inherent structural features of biomass, which act as barriers preventing complete hydrolysis; pretreatment techniques are therefore necessary to render the biomass highly digestible in the enzymatic hydrolysis process. In this research, (non)oxidative short-term lime pretreatment of willow wood was used. A weight of 11.40 g of willow wood was mixed with an excess of calcium hydroxide (0.4 g Ca(OH)2/g raw biomass) and a water loading of 15 g/g raw biomass. Lime pretreatment was carried out for various periods of time (1, 2, 3.5, 5, and 6 h), at temperatures of 100, 113, 130, 147, and 160 °C, and oxygen pressures as o
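The loadings quoted above imply the following simple mass arithmetic (variable names are illustrative):

```python
biomass_g = 11.40       # raw willow wood charged
lime_loading = 0.4      # g Ca(OH)2 per g raw biomass
water_loading = 15.0    # g water per g raw biomass

lime_g = biomass_g * lime_loading    # ≈ 4.56 g Ca(OH)2
water_g = biomass_g * water_loading  # ≈ 171 g water
```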
In this paper, estimates are made for the parameters and the reliability function of the transmuted power function (TPF) distribution using several estimation methods: a proposed new technique, together with the white, percentile, least squares, weighted least squares, and modified moments methods. A simulation was used to generate random data that follow the TPF distribution in three experiments (E1, E2, E3) of real parameter values, with sample sizes n = 10, 25, 50, and 100, N = 1000 replicated samples, and selected reliability times t. Comparisons were made between the results obtained from the estimators using the mean square error (MSE). The results showed the
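As an illustration of comparing estimators by MSE over replicated samples, the sketch below uses the plain power function distribution (cdf F(x) = x^b on (0, 1)) rather than its transmuted form, and compares the maximum-likelihood estimator with a median-based percentile estimator; all function names are illustrative, not the paper's.

```python
import math
import random

def sample_power(b, n, rng):
    """Inverse-transform sampling from F(x) = x**b on (0, 1]."""
    return [(1.0 - rng.random()) ** (1.0 / b) for _ in range(n)]

def mle_b(xs):
    """Maximum-likelihood estimator: b_hat = n / (-sum(log x_i))."""
    return -len(xs) / sum(math.log(x) for x in xs)

def percentile_b(xs):
    """Percentile estimator from the sample median: median = 0.5**(1/b)."""
    m = sorted(xs)[len(xs) // 2]
    return math.log(0.5) / math.log(m)

def mse(estimates, true_value):
    """Mean squared error of replicated estimates against the true parameter."""
    return sum((e - true_value) ** 2 for e in estimates) / len(estimates)
```

Running both estimators over many replicated samples and comparing their MSEs is the same experimental design the paper applies to its larger set of methods.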
Image denoising is a technique for removing an unwanted signal, called noise, that couples with the original signal during transmission; many denoising methods are used to remove this noise. In this paper, the Multiwavelet Transform (MWT) is used to denoise the corrupted image by choosing the HH coefficients for processing, based on two different filters: the Tri-State Median filter and the Switching Median filter. With each filter, various shrinkage rules are used, such as NormalShrink, SureShrink, VisuShrink, and Bivariate Shrink. The proposed algorithm is applied to salt-and-pepper noise at different levels on grayscale test images. The quality of the denoised image is evaluated by usi
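Of the shrinkage rules listed, VisuShrink is the simplest to state: soft-threshold the detail (e.g., HH) coefficients with the universal threshold t = σ√(2 ln n). A minimal sketch, assuming the noise level σ is known:

```python
import math

def visu_shrink(coeffs, sigma):
    """Soft-threshold detail coefficients with the VisuShrink universal
    threshold t = sigma * sqrt(2 * ln(n)): shrink toward zero by t,
    and zero out anything smaller than t."""
    n = len(coeffs)
    t = sigma * math.sqrt(2.0 * math.log(n))
    return [math.copysign(max(abs(c) - t, 0.0), c) for c in coeffs]
```

The other rules (SureShrink, NormalShrink, Bivariate Shrink) differ only in how the threshold is chosen, not in the thresholding step itself.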
The need for an efficient method to find the most appropriate document corresponding to a particular search query has become crucial due to the exponential growth in the number of papers readily available on the web. The vector space model (VSM), a classic model used in information retrieval, represents the words of documents as vectors in space and assigns them weights via a popular weighting method known as term frequency-inverse document frequency (TF-IDF). In this research, work is proposed to retrieve the most relevant documents by representing documents and queries as vectors of average term frequency-inverse sentence frequency (TF-ISF) weights instead of representing them as v
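TF-ISF applies the TF-IDF recipe with sentences playing the role of documents: a term is down-weighted the more sentences it appears in. A minimal sketch over tokenized sentences (function name illustrative):

```python
import math

def tf_isf(sentences):
    """Weight each term by term frequency * inverse sentence frequency,
    treating sentences as the 'documents' of classic TF-IDF.

    sentences: list of token lists. Returns one {term: weight} dict
    per sentence.
    """
    n = len(sentences)
    # sentence frequency: in how many sentences does each term occur?
    sf = {}
    for sent in sentences:
        for term in set(sent):
            sf[term] = sf.get(term, 0) + 1
    weights = []
    for sent in sentences:
        w = {}
        for term in set(sent):
            tf = sent.count(term) / len(sent)           # term frequency
            w[term] = tf * math.log(n / sf[term])       # * inverse sent. freq.
        weights.append(w)
    return weights
```

A term occurring in every sentence gets weight log(1) = 0, so, as with IDF, ubiquitous terms contribute nothing to the ranking.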
In this paper, a visible image watermarking algorithm based on the biorthogonal wavelet transform is proposed. The watermark (logo), a binary image, can be embedded in the host gray image using the coefficient bands of the host image in the biorthogonal transform domain. The logo image can be embedded in the top-left corner or spread over the whole host image. A scaling value (α) in the frequency domain is introduced to control the perception of the watermarked image. Experimental results show that this watermarking algorithm gives a visible logo with no losses in the recovery process of the original image; the calculated PSNR values support this. Good robustness against attempts to remove the watermark was s
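The α-scaled blend in the transform domain can be sketched as below. This illustrative 1-D version substitutes a single-level Haar transform for the paper's biorthogonal wavelet, and the lossless recovery assumes the logo and α are known (a non-blind scheme); function names are hypothetical.

```python
def haar_step(x):
    """One Haar analysis step: approximation (pairwise averages) and
    detail (pairwise half-differences)."""
    a = [(x[2 * i] + x[2 * i + 1]) / 2.0 for i in range(len(x) // 2)]
    d = [(x[2 * i] - x[2 * i + 1]) / 2.0 for i in range(len(x) // 2)]
    return a, d

def haar_inverse(a, d):
    """Exact inverse of haar_step."""
    out = []
    for ai, di in zip(a, d):
        out.extend([ai + di, ai - di])
    return out

def embed_visible(host, logo, alpha=0.2):
    """Blend the binary logo into the approximation band: a' = a + alpha*logo."""
    a, d = haar_step(host)
    a = [ai + alpha * li for ai, li in zip(a, logo)]
    return haar_inverse(a, d)

def recover_host(marked, logo, alpha=0.2):
    """Lossless host recovery when logo and alpha are known: a = a' - alpha*logo."""
    a, d = haar_step(marked)
    a = [ai - alpha * li for ai, li in zip(a, logo)]
    return haar_inverse(a, d)
```

A larger α makes the logo more visible (and more robust to removal) at the cost of a lower PSNR for the watermarked image, which is the trade-off the scaling value controls.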