Big data analysis has important applications in many areas such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms such as decision trees and nearest neighbor search. The proposed method can handle streaming data efficiently and, for entropy discretization, provides the optimal split value.
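For reference, the following is a minimal sketch of the exhaustive entropy-based split search that discretization methods of this kind build on; the function names and the brute-force scan are illustrative assumptions, not the paper's multi-resolution technique.

```python
import numpy as np

def entropy(labels):
    """Shannon entropy of a label array."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def best_entropy_split(values, labels):
    """Scan candidate cut points and return the split value that
    minimizes the weighted entropy of the two resulting partitions."""
    order = np.argsort(values)
    values, labels = values[order], labels[order]
    best_split, best_score = None, np.inf
    for i in range(1, len(values)):
        if values[i] == values[i - 1]:
            continue  # no class boundary between equal values
        cut = (values[i] + values[i - 1]) / 2
        left, right = labels[:i], labels[i:]
        score = (len(left) * entropy(left) + len(right) * entropy(right)) / len(labels)
        if score < best_score:
            best_split, best_score = cut, score
    return best_split

# Example: a clean class boundary around 3.0 should be recovered.
x = np.array([1.0, 2.0, 2.5, 3.5, 4.0, 5.0])
y = np.array([0, 0, 0, 1, 1, 1])
print(best_entropy_split(x, y))  # -> 3.0
```

A streaming-capable version would replace the full sort with incremental summary statistics, which is the role the multi-resolution summarization structure plays in the paper.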
Background: Unlike normal EEG patterns, epileptiform abnormal patterns are characterized by different morphologies such as high-frequency oscillations (HFOs) of ripples on spikes, spikes and waves, continuous and sporadic spikes, and poly-spikes. Several studies have reported that HFOs can be novel biomarkers in human epilepsy studies. Method: To regenerate and investigate these patterns, we have proposed three large-scale brain network models (BNMs) by linking the neural mass model (NMM) of Stefanescu-Jirsa 2D (S-J 2D) with our own structural connectivity derived from realistic biological data, the so-called large-scale connectome. These models include multiple network connectivities of brain regions at different …
Machine learning (ML) is a key component within the broader field of artificial intelligence (AI) that employs statistical methods to empower computers with the ability to learn and make decisions autonomously, without the need for explicit programming. It is founded on the concept that computers can acquire knowledge from data, identify patterns, and draw conclusions with minimal human intervention. The main categories of ML include supervised learning, unsupervised learning, semi-supervised learning, and reinforcement learning. Supervised learning involves training models using labelled datasets and comprises two primary forms: classification and regression. Regression is used for continuous output, while classification is employed for categorical output …
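To make the classification/regression distinction above concrete, here is a minimal scikit-learn sketch; the toy data and the choice of logistic and linear regression are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression, LinearRegression

X = np.array([[1.0], [2.0], [3.0], [4.0]])   # one input feature

# Classification: the supervised target is a discrete label.
y_class = np.array([0, 0, 1, 1])
clf = LogisticRegression().fit(X, y_class)
print(clf.predict([[2.5]]))                  # -> a class label, 0 or 1

# Regression: the supervised target is a continuous value.
y_reg = np.array([1.1, 1.9, 3.2, 3.9])
reg = LinearRegression().fit(X, y_reg)
print(reg.predict([[2.5]]))                  # -> a real-valued estimate
```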
Iris research is focused on developing techniques for identifying and locating relevant biometric features, accurate segmentation, and efficient computation, while lending itself to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which, in turn, reduces the effectiveness of the system being used as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method is based on a number of steps, starting from converting the grayscale eye image to a bit-plane representation and selecting the most significant bit planes, followed by a parameterization of the iris location, resulting in an accurate segmentation of the iris from the original image …
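As a sketch of the bit-plane step mentioned above, the following NumPy snippet decomposes an 8-bit grayscale image into its bit planes and keeps only the most significant ones; the array sizes and the number of planes kept are illustrative assumptions, not the paper's parameterization.

```python
import numpy as np

def bit_planes(gray):
    """Split an 8-bit grayscale image (H x W, dtype uint8) into 8 binary
    planes, from least significant (index 0) to most significant (index 7)."""
    return [(gray >> b) & 1 for b in range(8)]

def most_significant(gray, keep=2):
    """Reconstruct an image from only the `keep` most significant bit planes."""
    planes = bit_planes(gray)
    out = np.zeros_like(gray)
    for b in range(8 - keep, 8):
        out |= planes[b].astype(np.uint8) << b
    return out

# Example with a random stand-in for a grayscale eye image.
eye = np.random.randint(0, 256, size=(4, 4), dtype=np.uint8)
print(most_significant(eye, keep=2))  # coarse structure from planes 6 and 7
```

The most significant planes carry the coarse intensity structure of the image, which is why they are a useful starting point for locating the iris region cheaply.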
This study reports testing results of the transient response of T-shaped concrete deep beams with large openings under impact loading. Seven concrete deep beams with openings, including two ordinary reinforced, four partially prestressed, and one solid ordinary reinforced as a reference beam, were fabricated and tested. The effects of prestressing strand position and the intensity of the impact force were investigated. Two values of the opening's depth relative to the beam cross-section dimensions were inspected under the effect of an impacting mass repeatedly dropped from different heights. The study revealed that the beam's transient deflection increased by about 50% with greater …
Background: Open reduction and internal fixation (ORIF) using miniplates and screws is the treatment of choice for mandibular fractures. It is important to know both the region where the bone provides a firm anchorage and the topography of the dental apices and inferior alveolar nerve, to avoid damaging them when inserting the screws. The aim of this study is to determine the thickness of the buccal cortical plate and that of the buccal bone at the parasymphysis and mandibular body, thereby determining the area that provides a firm anchorage and the maximum length of mono-cortical screws that can be safely placed in these regions without injuring the tooth roots or the mandibular nerve. Materials and Methods: The sample of the present study was 110 Iraqi subjects …
Multipoint forming is an engineering concept in which the working surface of the punch and die is produced as the hemispherical ends of individual active elements (called pins), where each pin can be independently displaced vertically using a geometrically reconfigurable die. Several different products can be made without changing tools, saving precious production time. The manufacture of very expensive rigid dies is also reduced, saving considerable expense. But the most important aspect of using such equipment is the flexibility of the tooling. This paper presents an experimental investigation of the effect of three main parameters, namely the blank holder, rubber thickness, and forming speed, …
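To make the reconfigurable-die idea concrete, here is a minimal sketch of how a target surface could be discretized into per-pin vertical displacements; the grid size, pin pitch, and example surface are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

def pin_heights(surface, nx=10, ny=10, pitch=10.0):
    """Sample a target surface z = f(x, y) at the centre of each pin
    to obtain the vertical displacement of every pin in the matrix."""
    xs = (np.arange(nx) + 0.5) * pitch
    ys = (np.arange(ny) + 0.5) * pitch
    X, Y = np.meshgrid(xs, ys)
    return surface(X, Y)

# Example: a doubly curved, dish-like target surface over a 100 x 100 mm blank.
heights = pin_heights(lambda x, y: 0.002 * ((x - 50.0)**2 + (y - 50.0)**2))
print(heights.round(1))  # one vertical setting per pin
```

Changing the lambda reconfigures the same pin matrix for a different product, which is the flexibility the abstract highlights.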
Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important part of cryptography with secret key algorithms, is the key: it plays a central role in achieving a high level of secure communication. To increase the level of security in any communication, both parties must have a copy of the secret key, which, unfortunately, is not that easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong. An enhanced encryption key strengthens the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms to …
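For context on the weak-key issue noted above, here is a minimal sketch of generating a non-degenerate Triple DES key, assuming the PyCryptodome library; it illustrates standard key hygiene only, not the paper's proposed combination of algorithms.

```python
from Crypto.Cipher import DES3
from Crypto.Random import get_random_bytes

def generate_3des_key():
    """Draw random 24-byte keys until one does not degenerate to single DES."""
    while True:
        try:
            # adjust_key_parity fixes the DES parity bits and raises
            # ValueError if the three subkeys collapse (e.g. K1 == K2),
            # which would reduce 3DES to the much weaker single DES.
            return DES3.adjust_key_parity(get_random_bytes(24))
        except ValueError:
            continue

key = generate_3des_key()
cipher = DES3.new(key, DES3.MODE_CBC)
ciphertext = cipher.encrypt(b"16-byte aligned!")  # plaintext padded to 8-byte blocks
```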
Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.
In this research, the focus was on grouping and analyzing these data: cluster analysis plays an important role in identifying and grouping co-expressed subprofiles over time, which are then fitted with the nonparametric smoothing cubic B-spline model. This model provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope; it is also more flexible and can pick up more complex patterns and fluctuations in the data.
The longitudinal balanced data profiles were compiled into subgroups …
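As an illustration of the smoothing step described above, here is a minimal sketch that fits a cubic smoothing spline to one longitudinal profile, assuming SciPy; the noisy toy profile and the smoothing factor are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# One subject's longitudinal profile: noisy measurements over time.
t = np.linspace(0, 10, 25)
y = np.sin(t) + np.random.normal(scale=0.1, size=t.size)

# Cubic smoothing spline (k=3), so the fitted curve has continuous
# first and second derivatives; `s` trades fidelity for smoothness.
spline = UnivariateSpline(t, y, k=3, s=0.5)

y_smooth = spline(t)                  # smoothed profile
slope = spline.derivative(1)(t)       # continuous first derivative
curvature = spline.derivative(2)(t)   # continuous second derivative
```

Fitting one such spline per cluster mean gives the smooth, co-expressed group curves that the clustering step is meant to reveal.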
Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data to a graph concept frame (CFG). As is well known, the DBSCAN method groups data points of the same kind into clusters, while points that fall outside any cluster are treated as noise or anomalies. DBSCAN can thus detect abnormal points that lie beyond a set distance threshold from any cluster (extreme values). However, not all abnormalities are of this kind, far from a specific group; there are also data that do not recur, yet are considered abnormal with respect to the known groups. The analysis showed that DBSCAN using the …
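For reference, here is a minimal sketch of the baseline DBSCAN behavior described above (noise points flagged as anomalies), assuming scikit-learn; it does not implement the proposed graph-based strengthening.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Two dense groups plus one far-away point that should be flagged as noise.
X = np.array([[1.0, 1.0], [1.1, 0.9], [0.9, 1.1],
              [5.0, 5.0], [5.1, 4.9], [4.9, 5.1],
              [20.0, 20.0]])

labels = DBSCAN(eps=0.5, min_samples=2).fit_predict(X)

# DBSCAN marks points that belong to no cluster with the label -1.
anomalies = X[labels == -1]
print(labels)     # e.g. [ 0  0  0  1  1  1 -1]
print(anomalies)  # [[20. 20.]]
```

This is exactly the limitation the abstract points at: only points that are spatially far from every cluster get the -1 label, so anomalies that sit near a known group but do not recur are missed.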
Machine learning offers significant advantages for many difficult problems in the oil and gas industry, especially when it comes to resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. Clarifications of the workflow methodology are presented alongside comprehensive models in this study. The purpose of this study is to provide a more robust technique for predicting permeability; previous studies on the Bazirgan field have attempted to do so, but their estimates have been vague, and the methods they use are outdated and poorly suited to the realities of the permeability computation. To …