The rapid increase in the number of older people with Alzheimer's disease (AD) and other forms of dementia represents one of the major challenges to health and social care systems. Early detection of AD makes it possible for patients to access appropriate services and to benefit from new treatments and therapies as and when they become available. AD begins many years before clinical symptoms become apparent, and a biomarker that can measure the brain changes during this period would be valuable for early diagnosis. Potentially, the electroencephalogram (EEG) can play a valuable role in early detection of AD: damage to the brain caused by AD alters its information-processing activity, and hence the EEG, in ways that can be quantified as a biomarker. The objective of the study reported in this paper is to develop robust EEG-based biomarkers for detecting AD in its early stages. We present a new approach to quantifying the slowing of the EEG, one of the most consistent features at different stages of dementia, based on changes in EEG amplitudes (ΔEEG_A). The new approach achieves sensitivity and specificity values of 100% and 88.88%, respectively, and outperformed the Lempel-Ziv Complexity (LZC) approach in discriminating between AD patients and normal subjects.
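As a point of reference for the LZC baseline mentioned above, a minimal sketch of the Lempel-Ziv (LZ76) complexity computation on a median-binarized epoch is shown below; the synthetic signal and the binarization threshold are illustrative assumptions, not the study's pipeline.

```python
import numpy as np

def lempel_ziv_complexity(binary_seq: str) -> int:
    """Count the phrases in the LZ76 exhaustive-history parsing."""
    i, c, n = 0, 0, len(binary_seq)
    while i < n:
        l = 1
        # extend the current phrase while it already appears in the history
        while i + l <= n and binary_seq[i:i + l] in binary_seq[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

# Binarize a (synthetic) EEG epoch around its median, a common choice for LZC
rng = np.random.default_rng(0)
epoch = rng.normal(size=256)
binary = ''.join('1' if x > np.median(epoch) else '0' for x in epoch)
print(lempel_ziv_complexity(binary))
```

Lower complexity values on the binarized signal are the kind of feature an LZC-based discriminator would use; regular (slowed) activity yields fewer distinct phrases than irregular activity.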
Most Internet of Vehicles (IoV) applications are delay-sensitive and require resources for data storage and task processing that vehicles can rarely afford. Such tasks are therefore often offloaded to more powerful entities, such as cloud and fog servers. Fog computing, a decentralized infrastructure located between the data source and the cloud, supplies several benefits that make it a valuable extension of the cloud. The high volume of data generated by vehicles' sensors and the limited computation capabilities of vehicles impose several challenges on VANET systems. Therefore, VANETs are integrated with fog computing to form a paradigm called Vehicular Fog Computing (VFC), which provides low-latency services to mo
Human interaction technology based on motion capture (MoCap) systems is a vital tool for human kinematics analysis, with applications in clinical settings, animation, and video games. We introduce a new method for analyzing and estimating dorsal spine movement using a MoCap system. The data captured by the MoCap system are processed and analyzed to estimate the motion kinematics of three primary regions: the shoulders, spine, and hips. This work contributes a non-invasive and anatomically guided framework that enables region-specific analysis of spinal motion, which could be used as a clinical alternative to invasive measurement techniques. The hierarchy of our model consists of five main levels: motion capture system settings, marker data
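A typical kinematic quantity such a pipeline derives is the angle between body segments defined by marker positions. Below is a minimal sketch of that step; the marker names and coordinates are illustrative assumptions, not recorded data from this work.

```python
import numpy as np

# Estimate a flexion angle from three marker positions, as a MoCap
# pipeline might do for a spine segment (coordinates are illustrative).
def segment_angle(p_top, p_mid, p_bot):
    u = p_top - p_mid
    v = p_bot - p_mid
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

shoulder = np.array([0.0, 0.10, 1.45])   # hypothetical marker positions (m)
spine    = np.array([0.0, 0.00, 1.20])
hip      = np.array([0.0, 0.02, 0.95])
print(segment_angle(shoulder, spine, hip))
```

An angle of 180° corresponds to perfectly aligned markers; deviations from 180° quantify flexion of the segment at the middle marker.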
In this paper, a design for a broadband thin metamaterial absorber (MMA) is presented. Compared with previously reported metamaterial absorbers, the proposed structure provides a wide bandwidth with a compact overall size. The designed absorber consists of a combination of an octagonal disk and a split octagonal resonator to provide a wide bandwidth over the Ku- and K-band frequency range. Cheap FR-4 material is chosen as the substrate of the proposed absorber, with a thickness of 1.6 and an overall unit cell size of 6.5×6.5. CST Studio Suite was used for the simulation of the proposed absorber. The proposed absorber provides a wide absorption bandwidth of 14.4 GHz over a frequency range of 12.8-27.5 GHz with more than 90% absorption
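The 90% absorption criterion above can be related to simulated reflection data with the standard absorptivity formula for a metal-backed absorber, A(f) = 1 − |S11(f)|², since transmission through the ground plane is negligible (S21 ≈ 0). The reflection values below are illustrative, not data from this design.

```python
import numpy as np

# Absorptivity of a metal-backed absorber: with S21 ~ 0, A = 1 - |S11|^2.
# |S11| samples are illustrative, not CST results from the paper.
s11 = np.array([0.90, 0.30, 0.25, 0.31, 0.85])  # |S11| at sample frequencies
absorption = 1 - s11 ** 2
print(absorption)
```

Frequencies where |S11| ≤ 0.316 (reflection below about −10 dB) correspond to absorption of at least 90%, which is how the absorption bandwidth is read off a reflection curve.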
Luminescent sensor membranes and sensor microplates are presented for continuous or high-throughput wide-range measurement of pH based on a europium probe.
Cyber-attacks continue to grow in number and sophistication, driving the need for stronger image-protection schemes. This paper presents DGEN, a Dynamic Generative Encryption Network that combines Generative Adversarial Networks with a key system that can change with context. Unlike a fixed cipher such as AES, DGEN can potentially adjust itself as new threats appear, and it is designed to resist brute-force, statistical, and quantum attacks. The design injects randomness, leverages learning, and generates keys that depend on each image, aiming to provide strong security and flexibility while keeping computational cost low. Experiments on several public image datasets show that DGEN outperforms AES, chaos-based schemes, and other GAN-based approaches, with entropy reaching 7.99 bits per pixel.
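The entropy figure quoted above is the Shannon entropy of the cipher image's intensity histogram, which is bounded by 8 bits per pixel for 8-bit images. A minimal sketch of the computation follows; the cipher-like image is synthetic, for illustration only.

```python
import numpy as np

def shannon_entropy(img: np.ndarray) -> float:
    """Shannon entropy of an 8-bit image in bits per pixel (max 8.0)."""
    hist = np.bincount(img.ravel(), minlength=256)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# A well-encrypted cipher image should approach the 8-bit maximum
rng = np.random.default_rng(1)
cipher_like = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
print(round(shannon_entropy(cipher_like), 2))
```

A plain image with strong structure scores well below 8, so entropy near 7.99 is evidence that the cipher output is statistically close to uniform noise.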
This paper presents a comparison of denoising techniques using a statistical approach: principal component analysis with local pixel grouping (PCA-LPG), iterated a second time to further improve denoising performance, together with other enhancement filters. These include an adaptive Wiener low-pass filter applied to a grayscale image degraded by constant-power additive noise, based on statistics estimated from a local neighborhood of each pixel; a median filter, where each output pixel contains the median value of the M-by-N neighborhood around the corresponding pixel in the input image; a Gaussian low-pass filter; and an order-statistic filter.
Experimental results show that LPG-
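The median-filter step described above can be sketched with a small pure-NumPy example; the constant test image, 5% impulse-noise level, and 3×3 window are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

def median_filter(img: np.ndarray, m: int = 3) -> np.ndarray:
    """M-by-M median filter with edge replication (pure NumPy)."""
    pad = m // 2
    padded = np.pad(img.astype(float), pad, mode='edge')
    windows = np.lib.stride_tricks.sliding_window_view(padded, (m, m))
    return np.median(windows, axis=(-2, -1))

def psnr(clean: np.ndarray, noisy: np.ndarray) -> float:
    """Peak signal-to-noise ratio in dB for 8-bit intensity range."""
    mse = np.mean((clean.astype(float) - noisy.astype(float)) ** 2)
    return float('inf') if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

# Impulse (salt-and-pepper) noise is where the median filter shines
rng = np.random.default_rng(2)
clean = np.full((64, 64), 128.0)
noisy = clean.copy()
mask = rng.random(clean.shape) < 0.05
noisy[mask] = rng.choice([0.0, 255.0], size=mask.sum())
print(psnr(clean, noisy), psnr(clean, median_filter(noisy)))
```

PSNR before and after filtering is the usual way such comparisons are scored; the median filter removes isolated impulses because a corrupted pixel rarely dominates its neighborhood.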
Measurement of construction performance is essential for a clear picture of the present situation. Such monitoring by the management team is necessary to identify where performance is exceptionally good or poor and to determine the primary reasons, so that the lessons gained may be transferred to the firm and its progress strengthened. This research attempts to construct an integrated mathematical model using one of the recent methodologies for dealing with the fuzzy representation of experts' knowledge and judgment under hesitancy, the spherical fuzzy analytic hierarchy process (SFAHP) method, to assess the contractor's performance per the project performance pa
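For orientation, the classical (crisp) AHP step that SFAHP generalizes can be sketched as follows: criterion weights are derived from a pairwise comparison matrix via its principal eigenvector, and a consistency index checks the judgments. The 3×3 matrix below is illustrative, not taken from the study.

```python
import numpy as np

# Crisp AHP: priority weights from the principal eigenvector of a
# pairwise comparison matrix (illustrative 3-criteria example).
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])
vals, vecs = np.linalg.eig(A)
w = np.real(vecs[:, np.argmax(np.real(vals))])
w = w / w.sum()                        # normalized priority weights
ci = (np.max(np.real(vals)) - 3) / 2   # consistency index, n = 3
print(w, ci)
```

SFAHP replaces the crisp ratios with spherical fuzzy judgments (membership, non-membership, and hesitancy degrees) before aggregation, but the end product is the same kind of normalized weight vector.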
This paper proposes feedback linearization control (FBLC) based on a function approximation technique (FAT) to regulate the vibrational motion of a smart thin plate, considering the effect of axial stretching. The FBLC involves designing a nonlinear control law that stabilizes the target dynamic system while the closed-loop dynamics are linear with guaranteed stability. The objective of the FAT is to estimate the cubic nonlinear restoring force vector using the linear parameterization of weighting and orthogonal basis function matrices. Orthogonal Chebyshev polynomials are used as strong approximators for the adaptive schemes. The proposed control architecture is applied to a thin plate with a large deflection that stimulates the axial loading
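The linear parameterization underlying the FAT, a weight vector acting on Chebyshev basis functions, can be illustrated with a least-squares fit of a cubic restoring force; the stiffness values k1 and k3 below are illustrative assumptions, not the plate model's parameters.

```python
import numpy as np

# Approximate a cubic restoring force f(x) = k1*x + k3*x**3 by a linear
# parameterization f(x) ~ W^T phi(x) over a Chebyshev basis T_0..T_5.
k1, k3 = 2.0, 0.5                  # illustrative stiffness values
x = np.linspace(-1.0, 1.0, 200)
f = k1 * x + k3 * x ** 3

n_basis = 6
phi = np.polynomial.chebyshev.chebvander(x, n_basis - 1)  # basis matrix
W, *_ = np.linalg.lstsq(phi, f, rcond=None)               # weight vector

approx = phi @ W
print(np.max(np.abs(approx - f)))  # worst-case approximation error
```

Because the target is cubic, the first four Chebyshev polynomials capture it exactly and the higher-order weights vanish; in the adaptive scheme the weights W would instead be updated online by an adaptation law.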
The heat exchanger is a device used to transfer heat energy between two fluids, one hot and one cold. In this work, an output feedback adaptive sliding mode controller is designed to control the temperature of the outlet cold water of a plate heat exchanger. The measurement of the outlet cold temperature is the only information required. Hence, a sliding mode differentiator was designed to estimate the time derivative of the outlet cold water temperature, which is needed for constructing the sliding variable. The discontinuous gain of the sliding mode controller is adapted according to a certain adaptation law. Two constraints imposed on the volumetric flow rate of the outlet cold water (the control input) were considered within the rules of the proposed
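A common way to obtain the needed time derivative from a single temperature measurement is a first-order sliding mode (Levant-type) differentiator. The sketch below uses a sinusoid as a stand-in for the measured temperature; the gains lam0 and lam1 are illustrative, not the paper's tuned values.

```python
import numpy as np

# First-order sliding mode differentiator (Euler-discretized):
#   z0' = z1 - lam0 * |z0 - f|^(1/2) * sign(z0 - f)
#   z1' = -lam1 * sign(z0 - f)
# z1 converges to the derivative of the measured signal f(t).
def levant_differentiator(f, dt, lam0=6.0, lam1=8.0):
    z0, z1 = f[0], 0.0
    est = np.zeros_like(f)
    for k, fk in enumerate(f):
        e = z0 - fk
        z0 += dt * (z1 - lam0 * np.sqrt(abs(e)) * np.sign(e))
        z1 += dt * (-lam1 * np.sign(e))
        est[k] = z1
    return est

t = np.arange(0.0, 5.0, 1e-3)
signal = np.sin(t)                        # stand-in for the measured signal
d_est = levant_differentiator(signal, 1e-3)
err = np.abs(d_est - np.cos(t))[-1000:]   # error over the final second
print(err.max())
```

After a finite-time transient the estimate tracks the true derivative up to a small chattering ripple, which is why such differentiators are favored over direct numerical differentiation of a noisy measurement.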