Stroke is the second leading cause of death and one of the most common causes of disability worldwide. Researchers have found that brain–computer interface (BCI) techniques can improve rehabilitation for stroke patients. This study applied the proposed motor imagery (MI) framework to an electroencephalogram (EEG) dataset from eight subjects in order to enhance MI-based BCI systems for stroke patients. The preprocessing stage of the framework combines conventional filters with independent component analysis (ICA) denoising. Fractal dimension (FD) and the Hurst exponent (Hur) were then calculated as complexity features, and Tsallis entropy (TsEn) and dispersion entropy (DispEn) were assessed as irregularity parameters. The MI-based BCI features were then statistically evaluated for each participant using two-way analysis of variance (ANOVA) to characterize individual performance across four classes (left hand, right hand, foot, and tongue). A dimensionality reduction algorithm, Laplacian Eigenmap (LE), was used to improve MI-based BCI classification performance. Finally, groups of post-stroke patients were classified using k-nearest neighbors (KNN), support vector machine (SVM), and random forest (RF) classifiers. The findings show that LE with RF and with KNN achieved 74.48% and 73.20% accuracy, respectively; the integrated set of proposed features combined with the ICA denoising technique therefore accurately characterizes the proposed MI framework, which may be used to explore the four classes of MI-based BCI rehabilitation. This study should help clinicians, doctors, and technicians design effective rehabilitation programs for people who have had a stroke.
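Two of the features named in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the function names, the choice of Katz's estimator for fractal dimension, and the entropy order q are all assumptions for the example.

```python
import math

def katz_fd(signal):
    """Katz fractal dimension of a 1-D signal, treating it as a
    planar curve through the points (i, signal[i])."""
    n = len(signal) - 1  # number of steps along the curve
    # total curve length L and maximum distance d from the first point
    L = sum(math.hypot(1, signal[i + 1] - signal[i]) for i in range(n))
    d = max(math.hypot(i, signal[i] - signal[0]) for i in range(1, n + 1))
    return math.log10(n) / (math.log10(d / L) + math.log10(n))

def tsallis_entropy(probs, q=2.0):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1), for q != 1."""
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)
```

For a straight-line signal the Katz estimate is 1 (the curve is one-dimensional), and for a uniform 4-state distribution with q = 2 the Tsallis entropy is 0.75, which gives a quick sanity check on both functions.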
The present study discusses problem-based learning in the Iraqi classroom. This learner-centered method aims to involve all learners in collaborative activities. To fulfill the aims, the study tests the hypothesis, which reads as follows: "It is hypothesized that there is no statistically significant difference between the achievements of the experimental group and the control group." Thirty learners were selected as the sample of the present study. The Mann-Whitney test for two independent samples was used to analyze the results. The analysis shows that the experimental group's members, who were taught according to problem-based learning, obtained higher scores than the control group's members, who were taught according to the traditional method. This …
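The Mann-Whitney test used above compares two independent samples via the U statistic. A minimal, pure-Python sketch of the statistic (with average ranks for ties) looks like this; it computes only U, not the p-value, and is an illustration rather than the study's actual analysis tool:

```python
def mann_whitney_u(a, b):
    """Mann-Whitney U statistic for two independent samples.
    Tied values receive the average of the ranks they span."""
    combined = [(v, 0) for v in a] + [(v, 1) for v in b]
    combined.sort(key=lambda t: t[0])
    ranks = [0.0] * len(combined)
    i = 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j][0] == combined[i][0]:
            j += 1
        avg = (i + 1 + j) / 2.0  # average of 1-based ranks i+1 .. j
        for k in range(i, j):
            ranks[k] = avg
        i = j
    r_a = sum(r for r, (_, src) in zip(ranks, combined) if src == 0)
    n_a, n_b = len(a), len(b)
    u_a = r_a - n_a * (n_a + 1) / 2.0
    return min(u_a, n_a * n_b - u_a)  # conventional two-sided U
```

When every value in one group exceeds every value in the other, U = 0, its minimum; larger U values indicate more overlap between the groups.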
The need for participants' performance assessments in academia and industry has been a growing concern. Attendance, among other metrics, is a key factor in engendering a holistic approach to decision-making. For institutions or organizations where managing people is an important yet challenging task, attendance tracking and management could be employed to improve this seemingly time-consuming process while keeping an accurate attendance record. The manual or quasi-analog approach to taking attendance in some institutions can be unreliable and inefficient, leading to inaccurate computation of attendance rates and data loss. This work therefore proposes a system that employs embedded technology and a biometric/w…
Routing protocols are responsible for providing reliable communication between source and destination nodes. The performance of these protocols in the ad hoc network family is influenced by several factors, such as the mobility model, traffic load, transmission range, and number of mobile nodes, which remains a significant issue. Several simulation studies have explored routing protocols with performance parameters, but few compare multiple protocols with respect to both routing and Quality of Service (QoS) metrics. This paper presents a simulation-based comparison of proactive, reactive, and multipath routing protocols in mobile ad hoc networks (MANETs). Specifically, the performance of the AODV, DSDV, and AOMDV protocols is evaluated and analyz…
In networking communication systems such as vehicular ad hoc networks, high vehicular mobility leads to rapid shifts in vehicle density, incoherent inter-vehicle communication, and challenges for routing algorithms. The routing algorithm must avoid transmitting packets via segments where the network density is low and the scale of network disconnection is high, as this can lead to packet loss, interruptions, and increased communication overhead during route recovery. Hence, attention needs to be paid to both segment status and traffic. The aim of this paper is to present an intersection-based, segment-aware algorithm for geographic routing in vehicular ad hoc networks. This algorithm makes available the best route f…
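The idea of preferring well-connected segments can be sketched as a shortest-path search over a road graph where each segment carries a length and a vehicle density, and sparse segments are penalized. The cost model `length / density` and all identifiers below are illustrative assumptions, not the paper's actual metric:

```python
import heapq

def best_route(graph, src, dst):
    """Dijkstra over road segments. graph maps an intersection to a
    list of (neighbor, segment_length, vehicle_density) tuples.
    Low-density segments get a high cost because they are more
    likely to cause disconnection and route recovery."""
    dist = {src: 0.0}
    prev = {}
    pq = [(0.0, src)]
    visited = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in visited:
            continue
        visited.add(u)
        if u == dst:
            break
        for v, length, density in graph.get(u, []):
            nd = d + length / max(density, 0.1)  # sparse -> expensive
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    path, node = [], dst
    while node != src:
        path.append(node)
        node = prev[node]
    path.append(src)
    return list(reversed(path))
```

With this cost model, a short but nearly empty segment can lose to a longer detour through dense traffic, which matches the segment-aware intuition in the abstract.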
Data steganography is a technique used to hide data (a secret message) within other data (a cover carrier); it is considered part of information security. Audio steganography is a type of data steganography in which the secret message is hidden in an audio carrier. This paper proposes an efficient audio steganography method based on the LSB technique. The proposed method enhances steganographic performance by exploiting all carrier samples and balancing hiding capacity against the distortion ratio. It suggests an adaptive number of hiding bits for each audio sample depending on the secret message size, the cover carrier size, and the signal-to-noise ratio (SNR). Comparison results show that the proposed method outperforms state-of-the-art methods …
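The core mechanism, spreading the message over all cover samples with an adaptively chosen hiding depth k, can be sketched as below. The rule for choosing k (message size divided by cover size) follows the abstract's idea, but the exact formula, the omission of the SNR term, and the function names are assumptions for the example; samples are assumed to be non-negative PCM integers:

```python
import math

def embed(samples, bits):
    """Hide message bits in the k least significant bits of every
    cover sample, where k grows with the message-to-cover ratio."""
    k = max(1, math.ceil(len(bits) / len(samples)))  # adaptive depth
    padded = bits + [0] * (k * len(samples) - len(bits))
    stego = []
    for i, s in enumerate(samples):
        val = 0
        for b in padded[i * k:(i + 1) * k]:  # pack k bits, MSB first
            val = (val << 1) | b
        stego.append((s & ~((1 << k) - 1)) | val)  # replace k LSBs
    return stego, k

def extract(stego, k, n_bits):
    """Recover the first n_bits message bits from the stego samples."""
    bits = []
    for s in stego:
        for j in range(k - 1, -1, -1):
            bits.append((s >> j) & 1)
    return bits[:n_bits]
```

A small message over a large cover gives k = 1 (one bit per sample, minimal distortion); a larger message raises k, trading distortion for capacity, which is exactly the capacity/distortion balance the abstract describes.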
Physical matter at high energy levels and under specific conditions tends to behave in a harsh and complicated manner while sustaining thermodynamic equilibrium or non-equilibrium in the system. Measuring the temperature by ordinary techniques is not applicable at all in these cases. Likewise, mathematical models need to be applied in numerous critical applications to measure the temperature accurately at the atomic level of matter. These mathematical models follow statistical rules with different approaches to distributing the energy quantities of the system, and these approaches have functional effects at the microscopic and macroscopic levels of that system. Therefore, this research study presents an innovative … of a wi…
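One of the standard statistical distributions alluded to above is the Maxwell-Boltzmann occupation law, p_i ∝ exp(-E_i / kT), which links level populations to temperature for a system in thermal equilibrium. The sketch below is a generic illustration of that law, not the study's specific model, and assumes energies and kT are in the same units:

```python
import math

def boltzmann_probs(energies, kT):
    """Occupation probabilities p_i = exp(-E_i / kT) / Z for a set of
    energy levels under Maxwell-Boltzmann statistics."""
    weights = [math.exp(-e / kT) for e in energies]
    z = sum(weights)  # partition function normalizes the weights
    return [w / z for w in weights]
```

Inverting this relation is what makes spectroscopic temperature measurement possible: observing the population ratio of two levels with known energies determines kT without any contact thermometer.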
Finger vein recognition for user identification is a relatively recent biometric recognition technology with a broad variety of applications, and biometric authentication is extensively employed in the information age. As one of the most essential authentication technologies available today, finger vein recognition captures our attention owing to its high level of security, dependability, and track record of performance. Embedded convolutional neural networks are based on early or intermediate fusion of the input: in early fusion, pictures are categorized according to their location in the input space. In this study, we employ a highly optimized network and late fusion rather than early fusion to create a Fusion convolutional neural network …
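Late fusion, as opposed to the early fusion described above, combines the *outputs* of independently trained branches rather than their inputs. A minimal decision-level sketch is weighted score averaging followed by an argmax; the equal default weighting and the function signature are assumptions, not the paper's fusion rule:

```python
def late_fusion(scores_a, scores_b, weight=0.5):
    """Decision-level (late) fusion: average per-class scores from two
    branches, then predict the class with the highest fused score."""
    fused = [weight * a + (1 - weight) * b
             for a, b in zip(scores_a, scores_b)]
    return max(range(len(fused)), key=fused.__getitem__), fused
```

The appeal of late fusion is that each branch can be optimized in isolation; only the short fusion step needs to know that two branches exist at all.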