Biomarkers to detect Alzheimer’s disease (AD) would enable patients to gain access to appropriate services and may facilitate the development of new therapies. Given the large number of people affected by AD, there is a need for a low-cost, easy-to-use method to detect AD patients. The electroencephalogram (EEG) can potentially play a valuable role here, but at present no single EEG biomarker is robust enough for use in practice. This study aims to provide a methodological framework for developing robust EEG biomarkers that detect AD with clinically acceptable performance by exploiting the combined strengths of key biomarkers. A large number of existing and novel EEG biomarkers associated with slowing of the EEG, reduced EEG complexity, and decreased EEG connectivity were investigated. Support vector machine and linear discriminant analysis methods were used to find the combination of EEG biomarkers with the best detection performance. A total of 325,567 EEG biomarkers were investigated, and a panel of six biomarkers was identified and used to create a diagnostic model with high performance (≥85% sensitivity and 100% specificity).
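As a rough illustration of the classification step, the sketch below trains a linear support vector machine on a six-feature biomarker panel and reports sensitivity and specificity. The synthetic data and the six-feature layout are stand-ins, not the study's actual biomarkers or dataset.

```python
# Minimal sketch: classifying AD vs. control from a panel of six EEG biomarkers.
# The synthetic feature matrix below is illustrative only.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
n = 200
# Six hypothetical biomarker values per recording (e.g. theta power, entropy, ...)
X_ad = rng.normal(loc=1.0, scale=1.0, size=(n // 2, 6))
X_hc = rng.normal(loc=0.0, scale=1.0, size=(n // 2, 6))
X = np.vstack([X_ad, X_hc])
y = np.array([1] * (n // 2) + [0] * (n // 2))  # 1 = AD, 0 = healthy control

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="linear")).fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
print(f"sensitivity={tp / (tp + fn):.2f}  specificity={tn / (tn + fp):.2f}")
```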
A perturbed linear system with the strong observability property admits a sliding mode observer that estimates the unknown inputs together with the states. For the electro-hydraulic system with the piston position as the measured output, this property is not met. In this paper, estimates of the output and its derivatives are used to build a dynamic structure that satisfies the strong observability condition. A high-order sliding mode observer (HOSMO) is used to estimate both the resulting unknown perturbation term and the output derivatives. Thereafter, using only one signal from the whole system (the piston position), the piston position tracks the desired trajectory with a simple linear output feedback controller after cancelling the estimated perturbation.
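A common building block of such observers is the super-twisting (sliding-mode) differentiator, which recovers the derivative of a measured output exactly in finite time. The sketch below, a minimal first-order version, assumes an illustrative test signal and textbook gains (1.5√L and 1.1L); it is not the paper's specific HOSMO design.

```python
# Minimal sketch: super-twisting differentiator estimating dy/dt from the
# measured output y(t). Gains, step size, and the signal are illustrative.
import numpy as np

dt = 1e-4
t = np.arange(0.0, 2.0, dt)
y = np.sin(2 * np.pi * t)        # measured output (e.g. piston position)
L = 60.0                         # assumed Lipschitz bound on the 2nd derivative
z0, z1 = 0.0, 0.0                # estimates of y and dy/dt
dy_est = np.empty_like(t)

for k, yk in enumerate(y):
    e = z0 - yk
    v0 = -1.5 * np.sqrt(L) * np.sqrt(abs(e)) * np.sign(e) + z1
    z0 += v0 * dt
    z1 += -1.1 * L * np.sign(e) * dt
    dy_est[k] = z1

# After the transient, z1 tracks dy/dt = 2*pi*cos(2*pi*t)
print(abs(dy_est[-1] - 2 * np.pi * np.cos(2 * np.pi * t[-1])))
```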
This paper proposes a new method for network self-fault management (NSFM) based on two technologies: intelligent agents to automate fault management tasks, and Windows Management Instrumentation (WMI) to identify faults faster when resources are independent (different types of devices). The proposed network self-fault management reduces the network traffic load by reducing request-response exchanges between the server and clients, which achieves less downtime for each node when a fault occurs in the client. The performance of the proposed system is measured by three metrics: efficiency, availability, and reliability. A high average efficiency is obtained, depending on the faults occurring in the system.
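To make the idea concrete, here is a minimal sketch of a client-side agent that polls local health through WMI so faults are detected locally rather than via server round-trips. It assumes Windows and the third-party Python `wmi` package; the disk-space check and threshold are illustrative, not the paper's fault model.

```python
# Minimal sketch: a local agent polling device state via WMI, so that fault
# detection does not require server round-trips. Threshold is illustrative.
import time
import wmi

FREE_SPACE_THRESHOLD = 0.10  # flag a fault below 10% free disk space

def check_disks(conn):
    faults = []
    for disk in conn.Win32_LogicalDisk(DriveType=3):  # 3 = local fixed disk
        if disk.Size and int(disk.FreeSpace) / int(disk.Size) < FREE_SPACE_THRESHOLD:
            faults.append(f"low disk space on {disk.DeviceID}")
    return faults

conn = wmi.WMI()
while True:
    for fault in check_disks(conn):
        print("fault detected:", fault)   # a real agent would act or notify here
    time.sleep(30)                        # poll locally, reducing network traffic
```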
A novel series of chitosan derivatives was synthesized via the reaction of chitosan with carbonyl compounds, followed by grafting with different substituted amine compounds. The produced polymers were characterized by FTIR, 1H and 13C NMR, XRD, DSC, and TGA analyses. Solubility in water as well as in several solvents was investigated, and the antibacterial activity of chitosan and its derivatives against two types of bacteria, E. coli and S. aureus, was also examined. The results showed that some derivatives have better antibacterial activity against Escherichia coli (Gram-negative) than chitosan, whilst compound IX has better antibacterial activity against Staphylococcus aureus (Gram-positive). SEM analysis showed an increase in surface roughness.
Non-uniform channelization is a crucial task in cognitive radio receivers for obtaining separate channels from the digitized wideband input signal at different intervals of time. The two main requirements for the channelizer are reconfigurability and low complexity. In this paper, a reconfigurable architecture based on a combination of the Improved Coefficient Decimation Method (ICDM) and the Coefficient Interpolation Method (CIM) is proposed. The proposed Hybrid Coefficient Decimation-Interpolation Method (HCDIM) based filter bank (FB) is able to realize the same number of channels as the ICDM, but with the maximum decimation factor divided by the interpolation factor (L), which leads to less deterioration in stop-band attenuation.
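The operation at the heart of coefficient decimation can be shown in a few lines: zeroing all but every D-th coefficient of an FIR prototype replicates its lowpass response at multiples of fs/D, yielding a multi-band filter from a single coefficient set. The sketch below assumes an illustrative prototype; it demonstrates the basic CDM idea, not the paper's full HCDIM architecture.

```python
# Minimal sketch: coefficient decimation (CDM) of an FIR prototype filter.
# Keeping every D-th coefficient and zeroing the rest replicates the lowpass
# passband at multiples of fs/D. Prototype specs are illustrative.
import numpy as np
from scipy.signal import firwin, freqz

D = 4
h = firwin(numtaps=129, cutoff=0.05)            # lowpass prototype
mask = (np.arange(h.size) % D == 0)
h_cdm = np.where(mask, h, 0.0)                  # coefficient decimation

w_eval = np.array([0.0, np.pi / 2, np.pi])      # expected replica centers, D = 4
_, H = freqz(h_cdm, worN=w_eval)
print(np.abs(H))  # roughly equal gains (~1/D) at each replicated passband
```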
In this paper, an algorithm for binary codebook design is used in a vector quantization technique to improve the acceptability of the absolute moment block truncation coding (AMBTC) method. Vector quantization (VQ) is used to compress the bitmap output of the first stage (AMBTC). The binary codebook is generated for many images by randomly choosing code vectors from a set of binary image vectors, and this codebook is then used to compress the bitmaps of all these images. The bitmap of each image is compressed with this codebook based on the average bitmap replacement error (ABPRE) criterion. This approach is suitable for reducing bit rates.
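A minimal sketch of the bitmap-compression step follows: each flattened bitmap block is replaced by the index of its nearest binary codeword under Hamming distance. The block size, codebook size, and random stand-in data are assumptions for illustration.

```python
# Minimal sketch: compressing AMBTC-style bitmaps with a random binary codebook.
# Each 4x4 bitmap block is replaced by the index of the nearest codeword
# (Hamming distance). Sizes and the random data are illustrative.
import numpy as np

rng = np.random.default_rng(1)
block = 16                                              # 4x4 bitmap block, flattened
codebook = rng.integers(0, 2, size=(64, block))         # 64 random binary codewords

bitmap_blocks = rng.integers(0, 2, size=(1000, block))  # stand-in AMBTC bitmaps

# Hamming distance from every block to every codeword, then pick the nearest.
dists = (bitmap_blocks[:, None, :] != codebook[None, :, :]).sum(axis=2)
indices = dists.argmin(axis=1)                          # 6 bits per block instead of 16

avg_replacement_error = dists.min(axis=1).mean() / block
print(f"average bitmap replacement error per bit: {avg_replacement_error:.3f}")
```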
Traffic management at road intersections is a complex requirement that has been an important topic of research and discussion. Solutions have primarily focused on using vehicular ad hoc networks (VANETs). Key issues in VANETs are high mobility, road-layout restrictions, frequent topology variations, failed network links, and timely communication of data, which make routing packets to a particular destination problematic. To address these issues, a new dependable routing algorithm is proposed, which utilizes a wireless communication system between vehicles in urban vehicular networks. This routing is position-based, known as the maximum distance on-demand routing algorithm (MDORA). It aims to find an optimal route on a hop-by-hop basis.
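The core of position-based, maximum-distance forwarding can be sketched as below: among neighbors within radio range, pick the one that makes the most progress toward the destination. Coordinates, the range value, and the function name are illustrative assumptions, not MDORA's exact specification.

```python
# Minimal sketch: greedy position-based next-hop selection in the spirit of
# MDORA. Radio range and positions are illustrative.
import math

def next_hop(current, destination, neighbors, radio_range=250.0):
    """Return the reachable neighbor closest to the destination, or None."""
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    reachable = [n for n in neighbors if dist(current, n) <= radio_range]
    # Only consider neighbors that actually make progress toward the destination.
    progress = [n for n in reachable if dist(n, destination) < dist(current, destination)]
    return min(progress, key=lambda n: dist(n, destination), default=None)

hop = next_hop(current=(0, 0), destination=(1000, 0),
               neighbors=[(200, 50), (240, -10), (300, 0)])
print(hop)  # (240, -10): the reachable neighbor with maximum progress
```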
This paper proposes a new encryption method. It combines two cipher algorithms, DES and AES, to generate hybrid keys. This combination strengthens the proposed W-method by generating highly randomized keys. The reliability of any encryption technique rests on two points. The first is key generation: our approach merges 64 bits of DES with 64 bits of AES to produce a 128-bit root key, from which the remaining 15 keys are derived by shifting one bit at a time to the right; this complexity raises the level of the ciphering process. The second is the nature of the encryption process itself: it uses two keys and mixes one round of DES with one round of AES to reduce the processing time.
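A minimal sketch of such a key schedule follows: a 128-bit root key formed from 64 DES-derived bits and 64 AES-derived bits, with each subsequent key produced by a one-bit right rotation. The input key material here is a placeholder; the paper's actual derivation of the two 64-bit halves is not reproduced.

```python
# Minimal sketch of a hybrid key schedule: 64 DES bits || 64 AES bits form a
# 128-bit root key; 15 further keys come from one-bit right rotations.
# The two 64-bit inputs below are illustrative placeholders.
des_bits = 0x0123456789ABCDEF           # stand-in for 64 bits from DES
aes_bits = 0xFEDCBA9876543210           # stand-in for 64 bits from AES

root = (des_bits << 64) | aes_bits      # 128-bit root key

def rotate_right_128(x, n):
    return ((x >> n) | (x << (128 - n))) & ((1 << 128) - 1)

round_keys = [rotate_right_128(root, i) for i in range(16)]  # root + 15 derived keys
for i, k in enumerate(round_keys[:3]):
    print(f"K{i} = {k:032x}")
```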
The concept of the active contour model has been extensively utilized in the segmentation and analysis of images. This technology has been effectively employed in identifying contours in object recognition, computer graphics and vision, and biomedical image processing, covering both ordinary images and medical images such as Magnetic Resonance Imaging (MRI), X-ray, and ultrasound images. Kass, Witkin, and Terzopoulos developed this energy-minimizing “Active Contour Model” (also known as the snake) in 1987. A snake is a curve defined in the image domain that can be set in motion by external forces computed from the image data and internal forces arising from the curve itself.
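For illustration, the sketch below fits a snake with scikit-image's `active_contour`, an implementation of the Kass-Witkin-Terzopoulos energy minimization. The synthetic disk image and the elasticity/rigidity weights (alpha, beta) are assumed values for demonstration.

```python
# Minimal sketch: fitting a snake to a synthetic bright disk using
# skimage.segmentation.active_contour. Image and weights are illustrative.
import numpy as np
from skimage.draw import disk
from skimage.filters import gaussian
from skimage.segmentation import active_contour

img = np.zeros((200, 200))
img[disk((100, 100), 40)] = 1.0                    # bright disk to segment
img = gaussian(img, sigma=3)                       # smooth edges for the external force

theta = np.linspace(0, 2 * np.pi, 200)
init = np.column_stack([100 + 70 * np.sin(theta),  # initial circle (rows, cols)
                        100 + 70 * np.cos(theta)])

# alpha: elasticity, beta: rigidity (internal forces); the image gradient
# supplies the external force pulling the curve onto the disk boundary.
snake = active_contour(img, init, alpha=0.015, beta=10.0, gamma=0.001)
print(snake.shape)  # (200, 2): contour points settled near the disk edge
```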
The load shedding scheme has been extensively implemented as a fast solution for unbalanced conditions. It is therefore crucial to investigate supply-demand balancing in order to protect the network from collapsing and to sustain stability as far as possible; however, implementing load shedding is mostly undesirable. One solution to minimize the amount of load shedding is the integration of renewable energy resources, such as wind power, into electric power generation, which can contribute significantly to minimizing power cuts through its ability to improve the stability of the electric grid. This paper proposes a method for shedding load based on demand priorities while incorporating wind power.
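A minimal sketch of priority-based shedding follows: with wind generation added to the available supply, the lowest-priority loads are shed until demand no longer exceeds supply. The load names, megawatt figures, and priority scale are illustrative assumptions.

```python
# Minimal sketch: priority-based load shedding with wind power included in
# the available generation. All numbers are illustrative.
def shed_loads(loads, conventional_mw, wind_mw):
    """loads: list of (name, demand_mw, priority); higher priority = shed last."""
    available = conventional_mw + wind_mw          # wind reduces the needed shedding
    deficit = sum(d for _, d, _ in loads) - available
    shed = []
    for name, demand, _ in sorted(loads, key=lambda l: l[2]):  # lowest priority first
        if deficit <= 0:
            break
        shed.append(name)
        deficit -= demand
    return shed

loads = [("hospital", 30, 3), ("industry", 50, 2), ("residential", 40, 1)]
print(shed_loads(loads, conventional_mw=70, wind_mw=20))  # ['residential']
```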