Various speech enhancement algorithms (SEA) have been developed over the last few decades. Each algorithm has advantages and disadvantages because the speech signal is affected by environmental conditions. Distortion of speech results in the loss of important features, making the signal difficult to understand. SEA aims to improve the intelligibility and quality of speech degraded by different types of noise. In most applications, quality improvement is highly desirable because it can reduce listener fatigue, especially when the listener is exposed to high noise levels for extended periods (e.g., in manufacturing). SEA reduces or suppresses background noise to some degree, which is why such algorithms are sometimes called noise suppression algorithms. In this research, an SEA based on different speech models (Laplacian or Gaussian) has been designed and implemented using two types of discrete transforms: the Discrete Tchebichef Transform and the Discrete Tchebichef-Krawtchouk Transform. The proposed estimator consists of a dual-stage Wiener filter that can effectively estimate the clean speech signal. The evaluation results show the proposed SEA's ability to enhance the noisy speech signal, based on a comparison with other speech models and a self-comparison across different types and levels of noise. The presented algorithm's improvement ratios in average SNRseg are 1.96, 2.12, and 2.03 for Buccaneer, White, and Pink noise, respectively.
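As a hedged illustration of the dual-stage Wiener idea, the sketch below applies two cascaded Wiener gains to transform-domain speech frames. It uses an orthonormal DCT as a stand-in for the Discrete Tchebichef Transform and assumes the leading frames are noise-only; the frame length, noise estimate, and transform choice are illustrative assumptions, not the paper's exact estimator.

```python
# Minimal sketch of a dual-stage, transform-domain Wiener gain (illustrative only).
import numpy as np
from scipy.fftpack import dct, idct

def dual_stage_wiener(noisy, frame_len=256, noise_frames=5, floor=1e-8):
    """Enhance a 1-D noisy speech signal frame by frame."""
    noisy = np.asarray(noisy, dtype=float)
    n_frames = len(noisy) // frame_len
    frames = noisy[:n_frames * frame_len].reshape(n_frames, frame_len)

    # Orthonormal DCT-II used as a placeholder for the Tchebichef transform.
    coeffs = dct(frames, type=2, norm='ortho', axis=1)

    # Noise power estimated from the first few (assumed noise-only) frames.
    noise_power = np.mean(coeffs[:noise_frames] ** 2, axis=0) + floor

    enhanced = coeffs
    for _ in range(2):                      # two cascaded Wiener stages
        signal_power = np.maximum(enhanced ** 2 - noise_power, floor)
        gain = signal_power / (signal_power + noise_power)
        enhanced = gain * enhanced

    return idct(enhanced, type=2, norm='ortho', axis=1).reshape(-1)
```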
A Multiple Biometric System Based on ECG Data
Password authentication is a popular approach to system security and is also a very important security procedure for gaining access to user resources. This paper describes a password authentication method that uses a Modified Bidirectional Associative Memory (MBAM) algorithm for both graphical and textual passwords, aiming at greater efficiency in speed and accuracy. Across 100 tests, the accuracy was 100% for both graphical and textual passwords when authenticating a user.
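For illustration only, the sketch below shows how a standard (unmodified) Bidirectional Associative Memory recalls an identity pattern from a stored password pattern using bipolar outer-product storage; the paper's MBAM modification and its encoding of graphical passwords are not specified here, so the vectors and pairings are hypothetical.

```python
# Illustrative standard BAM storing password-pattern / user-ID pairs.
import numpy as np

def to_bipolar(bits):
    # Map {0,1} patterns to {-1,+1} vectors.
    return np.where(np.asarray(bits) > 0, 1, -1)

class BAM:
    def __init__(self, pairs):
        # Hebbian outer-product storage: W = sum_k x_k y_k^T
        self.W = sum(np.outer(to_bipolar(x), to_bipolar(y)) for x, y in pairs)

    def recall(self, x):
        # One forward pass: recover the identity vector paired with x.
        return np.sign(to_bipolar(x) @ self.W)

# Hypothetical usage: a password-derived pattern mapped to a user-ID pattern.
pairs = [([1, 0, 1, 1, 0, 1], [1, 0, 0, 1]),
         ([0, 1, 1, 0, 1, 0], [0, 1, 1, 0])]
bam = BAM(pairs)
print(bam.recall([1, 0, 1, 1, 0, 1]))   # expected: [ 1 -1 -1  1]
```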
In this paper, an algorithm that can embed more data than regular spatial-domain methods is introduced. The secret data is first compressed using Huffman coding, and the compressed data is then embedded using a Laplacian sharpening method. Laplacian filters are used to determine effective hiding places: based on a threshold value, the positions with the highest filter responses are selected for embedding the watermark. The aim of this work is to increase the capacity of the embedded information by using Huffman coding while, at the same time, increasing the security of the algorithm by hiding data at the positions with the strongest edge values, where changes are less noticeable.
The perform
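A minimal sketch of the embedding-site selection step is given below, assuming a 4-neighbour Laplacian kernel, a user-chosen threshold, and LSB substitution of the (already Huffman-compressed) bit stream; these choices are illustrative assumptions, not the exact published scheme.

```python
# Select high-edge positions with a Laplacian filter and embed bits there.
import numpy as np
from scipy.ndimage import convolve

LAPLACIAN = np.array([[0, 1, 0],
                      [1, -4, 1],
                      [0, 1, 0]])

def embed_bits(cover, bits, threshold):
    """Embed a bit sequence into the LSBs of the strongest-edge pixels."""
    response = np.abs(convolve(cover.astype(np.int32), LAPLACIAN))
    # Candidate positions: visited from highest to lowest Laplacian response.
    ys, xs = np.unravel_index(np.argsort(response, axis=None)[::-1], cover.shape)
    stego = cover.copy()
    k = 0
    for y, x in zip(ys, xs):
        if k >= len(bits) or response[y, x] < threshold:
            break
        stego[y, x] = (stego[y, x] & 0xFE) | bits[k]   # LSB substitution
        k += 1
    return stego, k   # k = number of bits actually embedded
```

A matching extractor would revisit the same positions in the same order; because recomputing the filter on the stego image can perturb the ranking, practical schemes typically share the position list or a key, which is an implementation detail not covered here.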
Image pattern classification is considered a significant step in image and video processing. Although various image pattern classification algorithms proposed so far achieve adequate classification, attaining higher accuracy while reducing computation time remains a challenge. A robust image pattern classification method is essential to obtain the desired accuracy; such a method should accurately classify image blocks into plain, edge, and texture (PET) classes using an efficient feature extraction mechanism. Moreover, most existing studies to date focus on evaluating their methods with specific orthogonal moments, which limits the understanding of their potential application to various Discrete Orthogonal Moments (DOM
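Purely as an illustration of plain/edge/texture labelling, the sketch below classifies a block from simple gradient statistics; the thresholds and features are assumptions made for demonstration and do not reproduce the moment-based mechanism discussed above.

```python
# Toy PET block labelling from gradient magnitude and orientation coherence.
import numpy as np

def classify_block(block, plain_thr=5.0, coherence_thr=0.5):
    gy, gx = np.gradient(block.astype(float))
    magnitude = np.hypot(gx, gy)
    if magnitude.mean() < plain_thr:
        return "plain"
    # Structure-tensor coherence: close to 1 when gradients share a single
    # dominant direction (edge), lower when they point in many directions (texture).
    jxx, jyy, jxy = (gx * gx).sum(), (gy * gy).sum(), (gx * gy).sum()
    coherence = np.sqrt((jxx - jyy) ** 2 + 4 * jxy ** 2) / (jxx + jyy + 1e-12)
    return "edge" if coherence > coherence_thr else "texture"
```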
Many accurate inertial-guided missile systems need to use more complex mathematical calculations and require high-speed processing to ensure real-time operation. This gives rise to the need to develop an efficient
Traditionally, path selection within routing is formulated as a shortest-path optimization problem. The objective function could be any one of a variety of parameters, such as number of hops, delay, or cost. The problem of least-cost delay-constrained routing is studied in this paper, since a delay constraint is a very common requirement of many multimedia applications and cost minimization captures the need to distribute load across the network. An iterative algorithm is therefore proposed in this paper to solve this problem. The results of applying this algorithm show that it gives the optimal path (optimal solution) from among multiple feasible paths (feasible solutions).
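To make the problem concrete, the sketch below enumerates every simple path that satisfies a delay bound and returns the cheapest one. This brute-force search stands in for the paper's iterative algorithm (which is not detailed here) and is only practical on small example networks; the graph, costs, and delays are hypothetical.

```python
# Least-cost delay-constrained routing by exhaustive enumeration of simple paths.
def feasible_paths(graph, src, dst, delay_bound, path=None, cost=0, delay=0):
    """graph: {u: {v: (cost, delay)}}; yields (cost, path) for feasible paths."""
    path = (path or []) + [src]
    if src == dst:
        yield cost, path
        return
    for nxt, (c, d) in graph[src].items():
        if nxt not in path and delay + d <= delay_bound:
            yield from feasible_paths(graph, nxt, dst, delay_bound,
                                      path, cost + c, delay + d)

graph = {
    'A': {'B': (1, 4), 'C': (4, 1)},
    'B': {'D': (1, 4)},
    'C': {'D': (4, 1)},
    'D': {},
}
best = min(feasible_paths(graph, 'A', 'D', delay_bound=5))
print(best)   # (8, ['A', 'C', 'D']): the cheaper A-B-D path violates the delay bound
```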
Gray-Scale Image Brightness/Contrast Enhancement with Multi-Model Histogram Linear Contrast Stretching (MMHLCS) Method
Travel time estimation and reliability measurement are important issues for improving the operational efficiency and safety of road traffic networks. The aim of this research is to estimate total travel time and analyze its distribution for three selected links of Palestine Arterial Street in Baghdad city. A higher buffer time index indicates worse reliability conditions. Link (2), from Bab Al Mutham intersection to Al-Sakara intersection, produced a buffer index of about 36%, compared with 26% for Link (1), from Al-Mawall intersection to Bab Al-Mutham intersection, and 24% for Link (3). These results illustrate that reliability gets worse for link
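As a hedged aside, buffer index values of this kind are commonly computed from the 95th-percentile and mean travel times, as in the short sketch below; the sample run times are hypothetical and are not data from this study.

```python
# Buffer index = (95th-percentile travel time - mean travel time) / mean, in percent.
import numpy as np

def buffer_index(travel_times_min):
    tt = np.asarray(travel_times_min, dtype=float)
    mean_tt = tt.mean()
    return 100.0 * (np.percentile(tt, 95) - mean_tt) / mean_tt

link_runs_min = [12.1, 11.8, 13.0, 12.5, 16.4, 12.2, 12.9, 18.0, 12.4, 12.7]
print(f"Buffer index: {buffer_index(link_runs_min):.1f}%")
```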
In this research, a study is introduced on the effect of several environmental factors on the performance of an already constructed quality inspection system, which was designed using a transfer learning approach based on convolutional neural networks. The system comprised two sets of layers: a set of layers transferred from an already trained model (DenseNet121) and a custom set of classification layers. It was designed to discriminate between damaged and undamaged helical gears according to the configuration of the gear, regardless of its dimensions, and the model showed good performance in discriminating between the two products under ideal conditions with high-resolution images.
So, this study aimed at testing the system performance at poor s
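A minimal Keras sketch of the transfer-learning layout described above is shown below: frozen DenseNet121 feature layers followed by a small custom binary head (damaged vs. undamaged). The input size, head architecture, and optimizer are assumptions and are not the study's exact configuration.

```python
# Transfer learning: frozen DenseNet121 base + custom binary classification head.
import tensorflow as tf

base = tf.keras.applications.DenseNet121(
    include_top=False, weights="imagenet", input_shape=(224, 224, 3))
base.trainable = False            # keep the transferred layers frozen

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # damaged / undamaged
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()   # reports only the custom head's parameters as trainable
```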