Biomarkers to detect Alzheimer's disease (AD) would enable patients to gain access to appropriate services and may facilitate the development of new therapies. Given the large number of people affected by AD, there is a need for a low-cost, easy-to-use method to detect AD patients. The electroencephalogram (EEG) can potentially play a valuable role here, but at present no single EEG biomarker is robust enough for use in practice. This study aims to provide a methodological framework for developing robust EEG biomarkers that detect AD with clinically acceptable performance by exploiting the combined strengths of key biomarkers. A large number of existing and novel EEG biomarkers associated with slowing of the EEG, reduction in EEG complexity, and decrease in EEG connectivity were investigated. Support vector machine and linear discriminant analysis methods were used to find the combination of EEG biomarkers that detects AD with the best performance. A total of 325,567 EEG biomarkers were investigated, and a panel of six biomarkers was identified and used to create a diagnostic model with high performance (≥85% sensitivity and 100% specificity).
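As a minimal sketch of the two-class linear discriminant analysis step described above, the following numpy-only example fits an LDA direction on a hypothetical six-feature biomarker panel and reports sensitivity and specificity. The synthetic data, class separations, and sample sizes are illustrative assumptions, not the study's actual six biomarkers or cohort.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical panel: 6 EEG biomarker values per subject (illustrative data,
# not the six biomarkers identified in the study).
n_per_class = 40
ad = rng.normal(loc=1.0, scale=1.0, size=(n_per_class, 6))   # AD patients
hc = rng.normal(loc=-1.0, scale=1.0, size=(n_per_class, 6))  # healthy controls

X = np.vstack([ad, hc])
y = np.array([1] * n_per_class + [0] * n_per_class)

def lda_fit(X, y):
    """Two-class linear discriminant: w = S_w^-1 (mu1 - mu0)."""
    mu0, mu1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
    # Pooled within-class covariance
    Sw = np.cov(X[y == 0], rowvar=False) + np.cov(X[y == 1], rowvar=False)
    w = np.linalg.solve(Sw, mu1 - mu0)
    b = -w @ (mu0 + mu1) / 2  # decision threshold midway between class means
    return w, b

def lda_predict(X, w, b):
    return (X @ w + b > 0).astype(int)

w, b = lda_fit(X, y)
pred = lda_predict(X, w, b)
sensitivity = (pred[y == 1] == 1).mean()  # true-positive rate on AD class
specificity = (pred[y == 0] == 0).mean()  # true-negative rate on controls
```

In practice one would also cross-validate and, as in the study, compare against an SVM; the sketch only shows how a biomarker panel combines into a single discriminant score.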
In this research, a 4×4 factorial experiment was studied, applied in a completely randomized block design with a given number of observations. Design of experiments is used to study the effect of treatments on experimental units and thus obtain data representing the experiment's observations. The variation in applying these treatments under different environmental and experimental conditions causes noise that affects the observed values and thus increases the mean square error of the experiment. To reduce this noise, multi-level wavelet shrinkage was used as a filter for the observations, by proposing an improved threshold that takes the different transform levels into account based on the logarithm of the b
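The multi-level wavelet shrinkage idea can be sketched as follows: decompose the observations with an orthonormal Haar transform, soft-threshold the detail coefficients with a level-dependent threshold, and reconstruct. The Haar basis and the simple threshold rule here (universal threshold scaled by level) are illustrative choices only; the paper's improved logarithm-based threshold is not reproduced.

```python
import numpy as np

def haar_step(x):
    """One level of the orthonormal Haar transform."""
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def haar_inverse(approx, detail):
    """Inverse of one Haar level."""
    x = np.empty(2 * approx.size)
    x[0::2] = (approx + detail) / np.sqrt(2)
    x[1::2] = (approx - detail) / np.sqrt(2)
    return x

def soft(c, t):
    """Soft thresholding: shrink coefficients toward zero by t."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

def denoise(x, levels=3, sigma=1.0):
    """Multi-level Haar shrinkage; x.size must be divisible by 2**levels."""
    details, a = [], x
    for j in range(levels):
        a, d = haar_step(a)
        # Level-dependent threshold (an illustrative choice, not the
        # paper's improved threshold).
        t = sigma * np.sqrt(2 * np.log(x.size)) / (j + 1)
        details.append(soft(d, t))
    for d in reversed(details):
        a = haar_inverse(a, d)
    return a

# Demo: a piecewise-constant "observation" signal plus noise.
rng = np.random.default_rng(1)
clean = np.repeat([0.0, 4.0, -2.0, 1.0], 16)           # length 64
noisy = clean + rng.normal(scale=0.5, size=clean.size)
denoised = denoise(noisy, levels=3, sigma=0.5)
```

Shrinking the detail coefficients suppresses the noise component while the block structure of the signal, which is sparse in the Haar basis, survives.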
This research deals with the use of a number of statistical methods, such as the kernel method, watershed, histogram, and cubic spline, to improve the contrast of digital images. The results obtained according to the RMSE and NCC measures have shown that the spline method gives the most accurate results compared with the other statistical methods.
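Of the methods listed, the histogram-based one is the simplest to illustrate. Below is a minimal numpy sketch of classical histogram equalization on an 8-bit image; the synthetic low-contrast image is an illustrative assumption, and this is only one of the several techniques the research compares.

```python
import numpy as np

def equalize_histogram(img):
    """Contrast enhancement of an 8-bit image by histogram equalization."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]  # cdf value of the darkest pixel present
    # Map the cumulative distribution onto the full 0..255 range.
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255).astype(np.uint8)
    return lut[img]

# Demo: a synthetic low-contrast image with grey levels squeezed into 100..140.
rng = np.random.default_rng(2)
low_contrast = rng.integers(100, 141, size=(64, 64)).astype(np.uint8)
enhanced = equalize_histogram(low_contrast)
```

After equalization the grey levels are stretched over the full dynamic range, which is the contrast improvement the research measures with RMSE and NCC.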
Iraqi siliceous rocks were chosen as raw materials in this study, which concerns linear shrinkage and its related parameters. They are porcelanite from the Safra area (western desert) and Duekla kaolin. Their powders were mixed in certain percentages, shaped into compacts, and sintered. The study also included thermal and chemical treatments, namely calcination and acid washing, and their effects on final compact properties such as linear shrinkage were studied. Linear shrinkage was calculated for the sintered compacts to study the effects of calcination, chemical washing, weight percentage, sintering, and loading moment on this property, where the compacts serve as insulating materials.
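Linear shrinkage for a sintered compact is conventionally computed from the length before and after sintering. A one-line sketch (the example dimensions are illustrative, not measurements from the study):

```python
def linear_shrinkage(green_length_mm, sintered_length_mm):
    """Percent linear shrinkage: 100 * (L_green - L_sintered) / L_green."""
    return 100.0 * (green_length_mm - sintered_length_mm) / green_length_mm

# Example: a 50.0 mm green compact that measures 46.5 mm after sintering.
shrinkage = linear_shrinkage(50.0, 46.5)  # percent
```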
Significant advances in horizontal well drilling technology have been made in recent years. The conventional productivity equations for single-phase flow at steady-state conditions have been used and solved in Microsoft Excel for various reservoir properties and different horizontal well lengths.
The deviation between the actual field data and that obtained by the software based on the conventional equations was adjusted by introducing some parameters into the conventional equation.
A new formula for calculating flow efficiency was derived and applied with the best proposed coefficient values, ψ = 0.7 and ω = 1.4. The simulated results fitted the field data.
Various reservoir and field parameters including late
This paper proposes a new method to tune a fractional order PID controller. The method uses both analytic and numeric approaches to determine the controller parameters. The control design specifications that must be achieved by the control system are gain crossover frequency, phase margin, and peak magnitude at the resonant frequency, where the latter is a new design specification suggested by this paper. These specifications result in three equations in five unknown variables. Assuming that certain relations exist between two variables and discretizing one of them, a performance index can be evaluated and the optimal controller parameters that minimize this performance index are selected. As a case study, a third order linear time
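The frequency-domain specifications above can be checked numerically from the open-loop response. Below is a small sketch that evaluates a fractional order PID, C(s) = Kp + Ki/s^λ + Kd·s^μ, against a hypothetical third-order plant G(s) = 1/(s+1)³ (the plant, parameter values, and frequency are illustrative assumptions, not the paper's case study), returning the open-loop gain in dB and the phase margin at a given frequency.

```python
import numpy as np

def fopid(w, Kp, Ki, Kd, lam, mu):
    """Frequency response of C(s) = Kp + Ki/s**lam + Kd*s**mu at s = j*w."""
    s = 1j * w
    return Kp + Ki / s**lam + Kd * s**mu

def plant(w):
    """Hypothetical third-order plant G(s) = 1/(s+1)**3 (an assumption)."""
    s = 1j * w
    return 1.0 / (s + 1.0) ** 3

def open_loop_specs(w, params):
    """Open-loop gain (dB) and phase margin (deg) at frequency w [rad/s]."""
    L = fopid(w, *params) * plant(w)
    gain_db = 20 * np.log10(np.abs(L))
    phase_margin_deg = 180.0 + np.degrees(np.angle(L))
    return gain_db, phase_margin_deg

# Sanity check with a pure proportional controller (Kp=1, Ki=Kd=0):
# at w = 1 rad/s the plant alone gives |L| = 2**-1.5 and phase -135 deg.
gain_db, pm = open_loop_specs(1.0, (1.0, 0.0, 0.0, 1.0, 1.0))
```

A tuning loop in the spirit of the paper would discretize one free parameter, solve the specification equations for the rest, and keep the candidate minimizing a performance index; the sketch only shows the specification evaluation step.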
New solar concentrator designs, produced using a ray-tracing program, are presented to improve the performance and output power of silicon solar cells, as well as to reduce the cost of systems working on solar energy. A two-dimensional concentrator (Fresnel lens) and three-dimensional concentrators (parabolic dish and Cassegrain) were used as concentrators for photovoltaic applications (CPV). The results show that the efficiency and output power of crystalline silicon solar cells are improved.
The phenomenon of child labor, meaning any activity performed by a child that contributes to production, has been rife in Iraqi society: the proportion of child labor for the age group 6-14 years, which amounted to 3% in 2006, became 8% for the same age group in 2008*.
It is recognized that every phenomenon has its causes; the phenomenon to which our atten
The migration from IPv4 to IPv6 cannot be achieved in a brief period, so both protocols will co-exist for a number of years. The IETF Next Generation Transition working group (NGtrans) developed IPv4/IPv6 transition mechanisms. Since Iraq's infrastructure, including universities, companies, and institutions, still uses only the IPv4 protocol, this research article tries to highlight and discuss a required transition roadmap and to extend local knowledge and practice on IPv6. It also introduces a prototype model built in Packet Tracer (a network simulator) for the design and implementation of IPv6 migration. Finally, it compares and evaluates the performance of IPv6, IPv4, and dual stack using OPNET, based on QoS metrics such as throughput, delay, and point to
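The dual-stack co-existence mentioned above can be illustrated at the socket level: a single IPv6 listener can also accept IPv4 clients when IPv6-only mode is disabled. A minimal Python sketch using the standard socket API (the port is arbitrary; behavior assumes a dual-stack host, as on typical Linux systems):

```python
import socket

def make_dual_stack_listener(port):
    """IPv6 TCP listener that also accepts IPv4-mapped connections."""
    s = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    # 0 disables IPv6-only mode, so IPv4 clients appear as ::ffff:a.b.c.d
    s.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_V6ONLY, 0)
    s.bind(("::", port))   # wildcard address for both protocol families
    s.listen(5)
    return s

listener = make_dual_stack_listener(0)  # port 0: let the OS pick a free port
```

This is the host-level counterpart of the dual-stack transition mechanism; network-level mechanisms (tunneling, translation) are configured in the routers and are what the Packet Tracer and OPNET models evaluate.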
Abstract
The research aims to study the problem of high production costs and low quality, and the use of total quality management tools to detect the problems behind the high cost of failure and low-quality products, diagnose them, and develop appropriate solutions.
To achieve this goal, we studied the total quality management tools, their relationship with costs, and the possibility of improving quality through the use of these tools.
The study was limited to these tools; their relation to cost reduction and quality improvement was examined in turn, according to the possibility of reduction each offers.
To achieve the goal, the study of the concept of total quality management