In this study, a traumatic spinal cord injury (TSCI) classification system is proposed using a convolutional neural network (CNN) with automatically learned features from electromyography (EMG) signals in a non-human primate (NHP) model. A comparison between the proposed classification system and a classical classification method (k-nearest neighbors, kNN) is also presented. Developing such an NHP model with a suitable assessment tool (i.e., a classifier) is a crucial step in detecting the effect of TSCI using EMG, which is expected to be essential in evaluating the efficacy of new TSCI treatments. Intramuscular EMG data were collected from an agonist/antagonist tail muscle pair in the pre- and post-spinal-cord-lesion periods from five Macaca fascicularis monkeys. The proposed classifier is based on a CNN using filtered, segmented EMG signals from the pre- and post-lesion periods as inputs, while the kNN is designed using four hand-crafted EMG features. The results suggest that the CNN provides a promising classification technique for TSCI compared to conventional machine learning classification. The kNN with hand-crafted EMG features classified the pre- and post-lesion EMG data with F-measures of 89.7% and 92.7% for the left- and right-side muscles, respectively, while the CNN with EMG segments classified the data with F-measures of 89.8% and 96.9% for the left- and right-side muscles, respectively. Finally, the proposed deep learning classification model (CNN), with its ability to learn high-level features from EMG segments as inputs, shows high potential and promising results for use as a TSCI classification system. Future studies can confirm this finding by considering more subjects.
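The baseline pipeline above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the paper does not name its four hand-crafted features, so four common EMG features (mean absolute value, root-mean-square, waveform length, zero crossings) are assumed here, and the EMG segments are replaced by synthetic noise with different amplitudes standing in for pre- and post-lesion recordings.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def emg_features(segment):
    # Four common hand-crafted EMG features (assumed, not the paper's exact set):
    mav = np.mean(np.abs(segment))                     # mean absolute value
    rms = np.sqrt(np.mean(segment ** 2))               # root-mean-square
    wl = np.sum(np.abs(np.diff(segment)))              # waveform length
    zc = np.sum(np.diff(np.sign(segment)) != 0)        # zero-crossing count
    return np.array([mav, rms, wl, zc])

rng = np.random.default_rng(0)
# Toy stand-ins for pre-lesion (label 0) and post-lesion (label 1) EMG segments.
pre = rng.normal(0.0, 1.0, size=(40, 200))
post = rng.normal(0.0, 2.0, size=(40, 200))
X = np.vstack([np.apply_along_axis(emg_features, 1, pre),
               np.apply_along_axis(emg_features, 1, post)])
y = np.array([0] * 40 + [1] * 40)

clf = KNeighborsClassifier(n_neighbors=5).fit(X, y)
print(clf.score(X, y))
```

The CNN alternative described in the abstract would skip `emg_features` entirely and feed the raw filtered segments to convolutional layers, which is the source of its "automatically learned features."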
In this paper, we derive and prove stability bounds for the momentum coefficient µ and the learning rate η of the backpropagation updating rule in artificial neural networks. The theoretical upper bound of the learning rate η is derived, and a practical approximation of it is obtained.
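A numerical sketch of the kind of bound involved: for the heavy-ball (momentum) update on a quadratic objective, a classical stability condition is 0 ≤ µ < 1 and 0 < η < 2(1 + µ)/λ_max, where λ_max is the largest eigenvalue of the Hessian. The paper's exact bounds may differ; the values below are illustrative only.

```python
import numpy as np

# Quadratic objective f(w) = 0.5 * w^T A w, so grad f(w) = A w.
A = np.diag([1.0, 10.0])
lam_max = 10.0

mu = 0.9
eta = 0.9 * 2 * (1 + mu) / lam_max   # just inside the classical stability bound

# Heavy-ball / momentum update: v <- mu*v - eta*grad, w <- w + v
w = np.array([5.0, 5.0])
v = np.zeros_like(w)
for _ in range(500):
    g = A @ w
    v = mu * v - eta * g
    w = w + v

print(np.linalg.norm(w))  # converges toward the minimizer w = 0
```

Choosing η above 2(1 + µ)/λ_max makes the iteration matrix have a spectral radius above one, and the same loop diverges, which is the behavior the stability bound rules out.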
Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization of the data for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms such a
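As a concrete illustration of entropy discretization, the sketch below finds a single cut point on a numeric attribute by minimizing the weighted class entropy of the two resulting bins. This is a one-split toy, working on raw values; the paper's algorithms operate on the multi-resolution summarization structure rather than the full data, which is not modeled here.

```python
import numpy as np
from collections import Counter

def entropy(labels):
    # Shannon entropy of a class-label sequence, in bits.
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def best_split(values, labels):
    # Pick the midpoint cut that minimizes the weighted entropy of the
    # two bins it creates (single-split entropy discretization).
    order = np.argsort(values)
    values = np.asarray(values, dtype=float)[order]
    labels = np.asarray(labels)[order]
    best_cut, best_w = None, float("inf")
    for i in range(1, len(values)):
        if values[i] == values[i - 1]:
            continue
        cut = (values[i] + values[i - 1]) / 2
        left, right = labels[:i], labels[i:]
        w = (len(left) * entropy(left) + len(right) * entropy(right)) / len(labels)
        if w < best_w:
            best_cut, best_w = cut, w
    return best_cut

vals = [1, 2, 3, 10, 11, 12]
labs = ["a", "a", "a", "b", "b", "b"]
print(best_split(vals, labs))  # cut at 6.5, separating the classes perfectly
```

Recursive application of the same criterion (with a stopping rule such as MDL) yields a multi-interval discretization suitable as a preprocessing step for Bayes classification.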
Project suspensions are among the most pressing challenges confronting the construction sector, owing to the sector's complexity and the interdependence of its essential delay risk sources. Machine learning provides an ideal set of techniques for attacking such complex systems. This study aimed to identify and develop a well-organized predictive data tool to examine and learn from delay sources based on historical data of construction projects, using decision trees and naïve Bayesian classification algorithms. An intensive review of the available data was conducted to explore the real reasons for and causes of construction project delays. The results show that the postpo
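The two classifier families named above can be compared on a toy version of the task. The features and labels below are entirely synthetic stand-ins for real project records (the paper's actual delay sources and data are not reproduced here); the sketch only shows the fit-and-score workflow.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Hypothetical project records: three numeric delay-risk indicators per
# project and a binary "delayed" label driven by a synthetic rule.
rng = np.random.default_rng(7)
n = 200
X = rng.normal(size=(n, 3))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

nb = GaussianNB().fit(X, y)                                   # naïve Bayes
dt = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)  # decision tree

print(nb.score(X, y), dt.score(X, y))
```

In practice the comparison would use held-out data (e.g., cross-validation) rather than training accuracy, and categorical delay causes would be encoded before fitting.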
Each phenomenon involves several variables. By studying these variables, we can find a mathematical formula for the joint distribution, and the copula is a useful tool for measuring the amount of correlation; here, the survival function was used to measure the relationship between age and the level of creatinine remaining in a person's blood. The SPSS program was used to extract the influential variables from a group of variables using factor analysis, and the Clayton copula function was then used to construct the shared bivariate distributions from the multivariate distributions: the bivariate distribution was calculated, and the survival function value was then calculated for a sample of size (50) drawn from Yarmouk Ho
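The Clayton construction described above can be sketched as follows. The exponential marginals are an illustrative assumption (the paper's fitted marginals are not given here); the copula couples the two marginal CDFs into a joint CDF, from which the joint survival function follows.

```python
import math

def clayton(u, v, theta):
    # Clayton copula C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta), theta > 0.
    return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

def joint_survival(x, y, lam1, lam2, theta):
    # Joint CDF via the copula: H(x, y) = C(F(x), G(y)).
    # Joint survival: S(x, y) = 1 - F(x) - G(y) + H(x, y).
    F = 1 - math.exp(-lam1 * x)   # assumed exponential marginal for variable 1
    G = 1 - math.exp(-lam2 * y)   # assumed exponential marginal for variable 2
    return 1 - F - G + clayton(F, G, theta)

print(joint_survival(1.0, 1.0, 0.5, 0.5, 2.0))
```

Larger values of theta give stronger lower-tail dependence, which is the characteristic feature of the Clayton family.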
The question of estimation has attracted great interest in engineering and statistical applications and in various applied and human sciences; the methods it provides help to identify many random processes accurately.
In this paper, methods were used to estimate the reliability function, the hazard function, and the distribution parameters, namely the method of moments and the maximum likelihood method. An experimental study was conducted using simulation to compare the methods and show which of them performs best in practical application, based on observations generated from the Rayleigh logarithmic distribution (RL) with sample sizes
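For the standard (one-parameter) Rayleigh distribution, both estimators named above have closed forms, sketched below. Note this is the plain Rayleigh case for illustration; the paper's Rayleigh-logarithmic compound distribution has its own estimating equations, which are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma_true = 2.0
x = rng.rayleigh(scale=sigma_true, size=5000)

# Maximum likelihood: sigma^2_hat = sum(x_i^2) / (2n).
sigma_mle = np.sqrt(np.sum(x ** 2) / (2 * len(x)))

# Method of moments: E[X] = sigma * sqrt(pi/2), so sigma_hat = mean(x) * sqrt(2/pi).
sigma_mom = np.mean(x) * np.sqrt(2 / np.pi)

# Plug-in reliability (survival) and hazard functions at a point t:
t = 2.0
R = np.exp(-t ** 2 / (2 * sigma_mle ** 2))   # R(t) = exp(-t^2 / (2 sigma^2))
h = t / sigma_mle ** 2                       # h(t) = t / sigma^2

print(sigma_mle, sigma_mom, R, h)
```

A simulation comparison like the paper's would repeat this over many replications and sample sizes and compare the estimators' mean squared errors.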
A new human-based heuristic optimization method, named the Snooker-Based Optimization Algorithm (SBOA), is introduced in this study. The inspiration for this method is drawn from the traits of sales elites: those qualities every salesperson aspires to possess. Typically, salespersons strive to enhance their skills through autonomous learning or by seeking guidance from others. Furthermore, they engage in regular communication with customers to gain approval for their products or services. Building upon this concept, SBOA aims to find the optimal solution within a given search space, traversing all positions to obtain all possible values. To assess the feasibility and effectiveness of SBOA in comparison to other algorithms, we conducte
Due to advancements in computer science and technology, impersonation has become more common. Today, biometrics technology is widely used in various aspects of people's lives. Iris recognition, known for its high accuracy and speed, is a significant and challenging field of study. As a result, iris recognition technology and biometric systems are utilized for security in numerous applications, including human-computer interaction and surveillance systems. It is crucial to develop advanced models to combat impersonation crimes. This study proposes sophisticated artificial intelligence models with high accuracy and speed to eliminate these crimes. The models use linear discriminant analysis (LDA) for feature extraction and mutual info
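The LDA feature-extraction step mentioned above can be sketched generically: LDA projects labeled samples onto at most (n_classes − 1) discriminant axes and can then classify in that reduced space. The iris data set below is scikit-learn's flower data set, used only as convenient multi-class data; it is unrelated to the eye-iris images the paper works with.

```python
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)

# Fit LDA both as a dimensionality reducer and as a classifier.
lda = LinearDiscriminantAnalysis(n_components=2)
Z = lda.fit_transform(X, y)   # projected features: shape (150, 2)

print(Z.shape)
print(lda.score(X, y))        # training accuracy of the LDA classifier
```

In an iris-recognition pipeline, `X` would instead hold image-derived features (e.g., from segmented and normalized iris textures), with LDA reducing them before the final classifier.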
To translate sustainable concepts into sustainable structures, collaborative work and innovative technology, such as BIM, are required to connect and organize the different levels of the industry, e.g., decision-makers, contractors, economists, architects, urban planners, and construction suppliers, along with a series of urban planning and strategic infrastructure activities to operate, manage, and maintain facilities. This paper investigates the benefits of BIM as a project management tool and its effectiveness in sustainable decision-making, as well as the benefit to local industry key stakeholders of encouraging the use of BIM as a project management tool to produce sustainable building projects. This p