Channel estimation and synchronization are considered the most challenging issues in Orthogonal Frequency Division Multiplexing (OFDM) systems. OFDM is highly affected by synchronization errors, which reduce subcarrier orthogonality and lead to significant performance degradation. Synchronization errors cause two issues: Symbol Time Offset (STO), which produces inter-symbol interference (ISI), and Carrier Frequency Offset (CFO), which results in inter-carrier interference (ICI). The aim of the research is to simulate comb-type pilot-based channel estimation for an OFDM system, showing the effect of the number of pilots on channel estimation performance, and to propose a modified estimation method for STO with less numb
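The comb-type pilot scheme the abstract refers to can be sketched as follows: known pilot symbols are placed every few subcarriers, the channel is least-squares (LS) estimated at those positions, and the estimates are interpolated across the data subcarriers. This is a minimal illustrative sketch; the subcarrier count, pilot spacing, and toy channel below are assumptions, not parameters from the study.

```python
# Comb-type pilot LS channel estimation with linear interpolation.
# All numeric values (N, PILOT_STEP, the channel) are illustrative.

N = 16           # total subcarriers (assumed)
PILOT_STEP = 4   # comb spacing: pilots at subcarriers 0, 4, 8, 12 (assumed)
PILOT = 1 + 0j   # known pilot symbol

# Toy frequency-domain channel, slowly (linearly) varying across subcarriers.
H_true = [complex(1.0 - 0.02 * k, 0.3 + 0.01 * k) for k in range(N)]

# Received pilot symbols: Y = H * X (noise-free here for clarity).
pilot_idx = list(range(0, N, PILOT_STEP))
Y_pilot = [H_true[k] * PILOT for k in pilot_idx]

# LS estimate at the pilot subcarriers: H_hat = Y / X.
H_ls = {k: y / PILOT for k, y in zip(pilot_idx, Y_pilot)}

def interpolate(H_ls, pilot_idx, n):
    """Linearly interpolate the LS pilot estimates over all subcarriers."""
    H_hat = [0j] * n
    for a, b in zip(pilot_idx, pilot_idx[1:]):
        for k in range(a, b + 1):
            t = (k - a) / (b - a)
            H_hat[k] = (1 - t) * H_ls[a] + t * H_ls[b]
    # Extrapolate past the last pilot by extending the last segment's slope.
    a, b = pilot_idx[-2], pilot_idx[-1]
    for k in range(b + 1, n):
        t = (k - a) / (b - a)
        H_hat[k] = (1 - t) * H_ls[a] + t * H_ls[b]
    return H_hat

H_hat = interpolate(H_ls, pilot_idx, N)
err = max(abs(H_hat[k] - H_true[k]) for k in range(N))
print(round(err, 6))  # prints 0.0: exact for this noise-free, linear toy channel
```

With fewer pilots (a larger `PILOT_STEP`), the interpolation spans widen and the estimate degrades for channels that vary faster than linearly between pilots, which is the trade-off the abstract's pilot-number experiment examines.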
Background: Expectoration of blood originating in the lungs or bronchial tubes is a frightening symptom for patients and is often a manifestation of significant and possibly dangerous underlying disease. Tuberculosis was, and still is, one of the common causes, followed by bronchiectasis, bronchitis, and lung cancer. Objectives: The aim of this study is to find the frequency of causes of respiratory tract bleeding in 100 patients attending Al-Kindy Teaching Hospital. Type of the study: Prospective descriptive observational study. Methods: A group of one hundred consecutive adult patients with lower respiratory tract bleeding was studied. History, physical examination, and a group of selected investigations were performed,
Tremendous efforts have been exerted to understand first language acquisition in order to facilitate second language learning. The problem lies in the difficulty of mastering the English language and adopting a theory that helps in overcoming the difficulties facing students. This study aims to apply Tomasello's theory of language mastery through usage. It assumes that adults can learn faster than children and can learn the language separately, far from academic education. Tomasello (2003) studied the stages of language acquisition in children and developed his theory accordingly. Some studies (Ghalebi and Sadighi, 2015; Arvidsson, 2019; Munoz, 2019; Verspoor and Hong, 2013) used this theory when examining language acquisition. Thus,
Realizing and understanding semantic segmentation is a stringent task not just for computer vision but also for research in the earth sciences. Semantic segmentation decomposes compound architectures into single elements; the most common objects in civil outdoor or indoor scenes must be classified and then reinforced with the semantic meaning of each object. It is a method for labeling and clustering point clouds automatically. Three-dimensional natural scene classification needs a point cloud dataset to represent the data as input; many challenges appear when working with 3D data, such as the small number, resolution, and accuracy of three-dimensional datasets. Deep learning is now the po
Recently, image enhancement techniques have become one of the most significant topics in the field of digital image processing. The basic problem in enhancement methods is how to remove noise or improve digital image details. In the current research, a method for digital image de-noising and detail sharpening/highlighting is proposed. The proposed approach uses a fuzzy logic technique to process each pixel in the entire image and then decides whether it is noisy or needs more processing for highlighting. This is performed by examining the degree of association with neighboring elements based on a fuzzy algorithm. The proposed de-noising approach was evaluated on some standard images after corrupting them with impulse
Producing pseudo-random numbers (PRN) with high performance is one of the important issues that attract many researchers today. This paper suggests pseudo-random number generator models that integrate a Hopfield Neural Network (HNN) with a fuzzy logic system to improve the randomness of the Hopfield pseudo-random generator. The fuzzy logic system has been introduced to control the update of the HNN parameters. The proposed model is compared with three state-of-the-art baselines. The results analysis, using the National Institute of Standards and Technology (NIST) statistical test and the ENT test, shows that the proposed model is statistically significant in comparison to the baselines, and this demonstrates the competency of the neuro-fuzzy based model to produce
A Multiple System Biometric System Based on ECG Data
Password authentication is a popular approach to system security, and it is also a very important security procedure for gaining access to a user's resources. This paper describes a password authentication method using a Modified Bidirectional Associative Memory (MBAM) algorithm for both graphical and textual passwords, for greater efficiency in speed and accuracy. Across 100 tests, the accuracy result is 100% for both graphical and textual passwords in authenticating a user.
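The abstract does not detail the paper's modification to BAM, but the underlying Bidirectional Associative Memory idea can be sketched: hetero-associative pairs of bipolar patterns (here, an assumed identity code paired with a password pattern) are stored in an outer-product weight matrix, and recall of the correct stored pattern serves as the match step. This is a plain, unmodified BAM sketch under toy assumed patterns, not the paper's MBAM algorithm.

```python
# Minimal sketch of classic Bidirectional Associative Memory (BAM) recall.
# The pattern pairs below are illustrative assumptions.

def sign(vals):
    """Bipolar threshold: map each value to +1 or -1."""
    return [1 if v >= 0 else -1 for v in vals]

def train(pairs):
    """Hebbian outer-product weights: W[i][j] = sum_k a_k[i] * b_k[j]."""
    n, m = len(pairs[0][0]), len(pairs[0][1])
    W = [[0] * m for _ in range(n)]
    for a, b in pairs:
        for i in range(n):
            for j in range(m):
                W[i][j] += a[i] * b[j]
    return W

def recall(W, a):
    """Forward pass a -> b: b_j = sign(sum_i a_i * W[i][j])."""
    n, m = len(W), len(W[0])
    return sign([sum(a[i] * W[i][j] for i in range(n)) for j in range(m)])

# Two assumed identity/password pattern pairs (orthogonal identity codes,
# so recall is exact for this toy example).
user1, pw1 = [1, 1, 1, 1], [1, -1]
user2, pw2 = [1, -1, 1, -1], [-1, 1]
W = train([(user1, pw1), (user2, pw2)])

# "Authentication": the presented identity must recall its stored password.
assert recall(W, user1) == pw1
assert recall(W, user2) == pw2
print("both users recalled correctly")
```

A graphical password would first be encoded as such a bipolar vector (e.g., from selected image regions) before storage; the abstract does not specify that encoding.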
In this paper, an algorithm through which we can embed more data than the regular spatial-domain methods is introduced. We compressed the secret data using Huffman coding, and then this compressed data is embedded using a Laplacian sharpening method. We used Laplace filters to determine the effective hiding places; then, based on a threshold value, we found the places with the highest values acquired from these filters for embedding the watermark. In this work, our aim is to increase the capacity of the information to be embedded by using Huffman coding and, at the same time, to increase the security of the algorithm by hiding data in the places that have the highest edge values and are less noticeable. The perform
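The pipeline described above can be sketched end to end: compress the secret with Huffman coding, run a Laplacian filter over the cover image, keep only pixels whose response exceeds a threshold (the strong-edge, less noticeable locations), and embed the compressed bits in their least significant bits. The 3x3 kernel, threshold, toy image, and helper names are illustrative assumptions, not the paper's exact parameters.

```python
# Huffman-compressed LSB embedding at strong-edge (high-Laplacian) pixels.
import heapq
from collections import Counter

def huffman_code(msg):
    """Return a prefix-free code {symbol: bitstring} for msg."""
    heap = [[w, [sym, ""]] for sym, w in sorted(Counter(msg).items())]
    heapq.heapify(heap)
    if len(heap) == 1:                       # degenerate one-symbol message
        return {heap[0][1][0]: "0"}
    while len(heap) > 1:
        lo, hi = heapq.heappop(heap), heapq.heappop(heap)
        for pair in lo[1:]:
            pair[1] = "0" + pair[1]
        for pair in hi[1:]:
            pair[1] = "1" + pair[1]
        heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    return {sym: bits for sym, bits in heap[0][1:]}

def laplacian(img):
    """Absolute response of a 3x3 Laplacian kernel (borders left at 0)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = abs(4 * img[y][x] - img[y - 1][x] - img[y + 1][x]
                            - img[y][x - 1] - img[y][x + 1])
    return out

def embed(img, bits, thresh=40):
    """Write bits into the LSBs of the strongest-edge pixels above thresh."""
    resp = laplacian(img)
    spots = sorted(((resp[y][x], y, x)
                    for y in range(len(img)) for x in range(len(img[0]))
                    if resp[y][x] > thresh), reverse=True)
    assert len(spots) >= len(bits), "not enough strong-edge pixels"
    stego = [row[:] for row in img]
    coords = []
    for bit, (_, y, x) in zip(bits, spots):
        stego[y][x] = (stego[y][x] & ~1) | int(bit)
        coords.append((y, x))
    return stego, coords

def extract(stego, coords, code):
    """Read LSBs back in embedding order and Huffman-decode them."""
    bits = "".join(str(stego[y][x] & 1) for y, x in coords)
    inv = {c: s for s, c in code.items()}
    out, cur = [], ""
    for b in bits:
        cur += b
        if cur in inv:
            out.append(inv[cur])
            cur = ""
    return "".join(out)

# Toy 8x8 cover image with a strong vertical edge down the middle.
img = [[0] * 4 + [200] * 4 for _ in range(8)]
secret = "abba"
code = huffman_code(secret)
bits = "".join(code[c] for c in secret)
stego, coords = embed(img, bits)
recovered = extract(stego, coords, code)
print(recovered)  # prints: abba
```

Embedding only where the Laplacian response is high confines the LSB changes to busy edge regions, where single-intensity changes are least visible; the Huffman step shortens the bitstream so more payload fits in the same set of edge pixels.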