Big data analysis is essential for modern applications in areas such as healthcare, assistive technology, intelligent transportation, and environment and climate monitoring. Traditional data mining and machine learning algorithms do not scale well with data size. Mining and learning from big data require time- and memory-efficient techniques, albeit at the cost of some loss in accuracy. We have developed a data aggregation structure to summarize data with a large number of instances and data generated from multiple sources. Data are aggregated at multiple resolutions, and the resolution provides a trade-off between efficiency and accuracy. The structure is built once, updated incrementally, and serves as a common data input for multiple mining and learning algorithms. Data mining algorithms are modified to accept the aggregated data as input. Hierarchical data aggregation serves as a paradigm under which novel …
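A minimal sketch of the multi-resolution aggregation idea, assuming a simple count/sum/sum-of-squares summary per bucket (the class and method names below are hypothetical, not the authors' API):

```python
# Minimal sketch of a multi-resolution aggregation structure (illustrative only;
# names such as MultiResolutionAggregator are hypothetical, not the authors' API).
from collections import defaultdict

class MultiResolutionAggregator:
    """Keeps count/sum/sum-of-squares summaries at several bucket widths,
    so coarser resolutions trade accuracy for speed and memory."""

    def __init__(self, resolutions=(1.0, 10.0, 100.0)):
        # One dictionary of summaries per resolution (bucket width).
        self.resolutions = resolutions
        self.levels = {r: defaultdict(lambda: [0, 0.0, 0.0]) for r in resolutions}

    def insert(self, key, value):
        """Incremental update: O(number of resolutions) per instance."""
        for r in self.resolutions:
            bucket = int(key // r)
            s = self.levels[r][bucket]
            s[0] += 1              # count
            s[1] += value          # sum
            s[2] += value * value  # sum of squares (enables mean/variance)

    def mean(self, key, resolution):
        """Query the aggregated mean at the chosen resolution."""
        s = self.levels[resolution].get(int(key // resolution))
        return None if not s or s[0] == 0 else s[1] / s[0]

# Usage: aggregate once, then feed the summaries to several algorithms.
agg = MultiResolutionAggregator()
for t, v in [(3.2, 10.0), (3.9, 12.0), (57.0, 5.0)]:
    agg.insert(t, v)
print(agg.mean(3.5, 1.0), agg.mean(3.5, 100.0))
```

Coarser resolutions summarize more instances per bucket, which is one way to realize the efficiency/accuracy trade-off the abstract describes.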
Background: DVT is a very common problem with very serious complications, such as pulmonary embolism (PE), which carries a high mortality, and many other chronic and troublesome complications (such as chronic DVT, post-phlebitic syndrome, and chronic venous insufficiency). It has many risk factors that affect its course, severity, and response to treatment. Objectives: Most of these risk factors are modifiable, and a better understanding of the relationships between them can be beneficial for better assessment of susceptible patients, prevention of the disease, and the effectiveness of our treatment modalities. The male-to-female ratio was nearly equal, so we did not consider gender among the other risk factors. Type of study: a cross-sectional
Recent years have seen an explosion in graph data from a variety of scientific, social, and technological fields. Within these fields, emotion recognition is an interesting research area because it finds many real-life applications, such as social robotics (to increase the interactivity of robots with humans), driver safety monitoring, and pain monitoring during surgery. A novel facial emotion recognition approach based on graph mining is proposed in this paper to make a paradigm shift in the way the face region is represented: the face region is represented as a graph of nodes and edges, and the gSpan frequent sub-graph mining algorithm is used to find the frequent sub-structures in the graph database of each emotion.
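A hedged sketch of the representation step only (the landmark set, distance threshold, and helper names are assumptions; a real pipeline would run gSpan over each emotion's graph database, which the toy edge-frequency count below merely stands in for):

```python
# Illustrative sketch, not the paper's exact pipeline: facial landmarks become
# nodes, nearby landmarks become edges, and a simple labelled-edge frequency
# count stands in for gSpan frequent sub-graph mining.
import itertools
import math
from collections import Counter
import networkx as nx

def face_to_graph(landmarks, max_dist=0.6):
    """landmarks: dict name -> (x, y) in normalized image coordinates."""
    g = nx.Graph()
    for name, (x, y) in landmarks.items():
        g.add_node(name, x=x, y=y)
    for (a, pa), (b, pb) in itertools.combinations(landmarks.items(), 2):
        d = math.dist(pa, pb)
        if d <= max_dist:                 # connect only spatially close landmarks
            g.add_edge(a, b, weight=round(d, 2))
    return g

def frequent_edges(graphs, min_support=2):
    """Toy stand-in for sub-graph mining: edges appearing in >= min_support graphs."""
    counts = Counter()
    for g in graphs:
        counts.update(frozenset(e) for e in g.edges())
    return [set(e) for e, c in counts.items() if c >= min_support]

# Two tiny "happy" face graphs built from made-up landmark positions.
happy_faces = [
    face_to_graph({"eye_l": (0.30, 0.40), "eye_r": (0.70, 0.40), "mouth": (0.50, 0.80)}),
    face_to_graph({"eye_l": (0.32, 0.41), "eye_r": (0.69, 0.40), "mouth": (0.50, 0.79)}),
]
print(frequent_edges(happy_faces))
```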
The predilection for 5G telemedicine networks has piqued the interest of industry researchers and academics. The most significant barrier to global telemedicine adoption is achieving secure and efficient patient transport, which involves two critical responsibilities. The first is to get the patient to the nearest hospital as quickly as possible, and the second is to keep the connection secure while traveling to the hospital. As a result, a new network scheme has been suggested to expand the medical delivery system: an agile network scheme that securely redirects ambulance motorbikes to the nearest hospital in emergency cases. This research provides a secure and efficient telemedicine transport strategy compatible with the
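A small illustrative sketch of the nearest-hospital selection step only, assuming plain haversine distance over made-up coordinates (the paper's secure 5G redirection scheme itself is not modelled here):

```python
# Hedged sketch of "route to nearest hospital"; hospital names and coordinates
# are invented for illustration, and the secure transport layer is omitted.
import math

def haversine_km(p, q):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def nearest_hospital(ambulance_pos, hospitals):
    """hospitals: dict name -> (lat, lon); returns (name, distance_km)."""
    name = min(hospitals, key=lambda h: haversine_km(ambulance_pos, hospitals[h]))
    return name, haversine_km(ambulance_pos, hospitals[name])

hospitals = {"Hospital A": (33.312, 44.361), "Hospital B": (33.350, 44.420)}
print(nearest_hospital((33.320, 44.400), hospitals))
```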
DEMs are thus simply regular grids of elevation measurements over the land surface. The aim of the present work is to produce a high-resolution DEM for a specific investigated region (i.e., the Baghdad University campus / College of Science). The easting and northing of 90 locations, including the ground base and buildings of the studied area, have been obtained by field survey using the Global Positioning System (GPS). The image of the investigated area has been extracted from the QuickBird satellite sensor (with a spatial resolution of 0.6 m). It has been geo-referenced and rectified using a first-order polynomial transformation. Many interpolation methods have been used to estimate the elevation, such as ordinary Kriging and inverse distance weight
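For instance, a minimal inverse-distance-weighting sketch (one of the interpolation methods named above) might look as follows, with invented survey points rather than the actual GPS measurements of the study area:

```python
# A minimal inverse-distance-weighting (IDW) elevation interpolation sketch.
import numpy as np

def idw(known_xy, known_z, query_xy, power=2.0):
    """Estimate elevation at query points as a distance-weighted mean of known points."""
    known_xy, known_z, query_xy = map(np.asarray, (known_xy, known_z, query_xy))
    d = np.linalg.norm(query_xy[:, None, :] - known_xy[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)            # avoid division by zero at sample points
    w = 1.0 / d ** power
    return (w @ known_z) / w.sum(axis=1)

# Easting/northing (m) and elevation (m) of a few hypothetical survey points.
xy = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0), (100.0, 100.0)]
z = [30.0, 32.0, 31.0, 35.0]
print(idw(xy, z, [(50.0, 50.0), (10.0, 90.0)]))
```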
In many applications, such as production planning, the decision maker is interested in optimizing an objective function that is a fuzzy ratio of two functions, which can be handled using the fuzzy fractional programming technique. A special class of optimization technique, named the fuzzy fractional programming problem, is considered in this work when the coefficients of the objective function are fuzzy. A new ranking function is proposed and used to convert the data of the fuzzy fractional programming problem from fuzzy numbers to crisp numbers, so that the shortcomings of treating the original fuzzy problem directly can be avoided. Here, a novel ranking function approach for ordinary fuzzy numbers is adopted for ranking triangular fuzzy numbers with simpler an
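Since the truncated abstract does not spell out the proposed ranking function, the sketch below uses a standard centroid ranking of triangular fuzzy numbers, (a + b + c) / 3, purely to illustrate the fuzzy-to-crisp conversion step:

```python
# Illustrative fuzzy-to-crisp conversion using a standard centroid ranking;
# this is NOT the paper's proposed ranking function, only a common baseline.
def centroid_rank(tfn):
    """Crisp value of a triangular fuzzy number given as (lower, modal, upper)."""
    a, b, c = tfn
    return (a + b + c) / 3.0

def defuzzify_coefficients(fuzzy_coeffs):
    """Convert every fuzzy coefficient of the fractional objective to a crisp number."""
    return [centroid_rank(t) for t in fuzzy_coeffs]

# Fuzzy numerator and denominator coefficients of a toy fractional objective.
numerator = [(1.0, 2.0, 3.0), (0.5, 1.0, 1.5)]
denominator = [(2.0, 3.0, 4.0), (1.0, 1.0, 1.0)]
print(defuzzify_coefficients(numerator), defuzzify_coefficients(denominator))
```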
Wildfire risk has increased globally during the past few years due to several factors. An efficient and fast response to wildfires is extremely important to reduce their damaging effect on humans and wildlife. This work introduces a methodology for designing an efficient machine learning system to detect wildfires using satellite imagery. A convolutional neural network (CNN) model is optimized to reduce the required computational resources. Due to the limited availability of images containing fire and to seasonal variations, an image augmentation process is used to develop adequate training samples that capture changes in the forest's visual features and the seasonal wind direction at the study area during the fire season. The selected CNN model (Mob
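A hedged sketch of the augmentation-plus-lightweight-CNN idea (the backbone is assumed to be a MobileNet variant, and the augmentation settings are illustrative, not the paper's configuration):

```python
# Sketch only: augmentation layers approximating orientation/lighting variation,
# feeding an assumed MobileNetV2 backbone for binary fire / no-fire classification.
import tensorflow as tf

augment = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal"),   # orientation / wind-direction changes
    tf.keras.layers.RandomRotation(0.1),
    tf.keras.layers.RandomZoom(0.1),
    tf.keras.layers.RandomContrast(0.2),        # seasonal lighting variation
])

# Lightweight backbone to keep computational cost low (assumption: MobileNetV2).
backbone = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights=None, pooling="avg")

model = tf.keras.Sequential([
    augment,
    backbone,
    tf.keras.layers.Dense(1, activation="sigmoid"),  # fire / no-fire
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Smoke test on random tensors standing in for satellite image patches.
x = tf.random.uniform((4, 224, 224, 3))
y = tf.constant([0.0, 1.0, 0.0, 1.0])
model.fit(x, y, epochs=1, verbose=0)
```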
The primary objective of the current paper is to suggest and implement effective computational methods (DECMs) to calculate analytic and approximate solutions to the nonlocal one-dimensional parabolic equation, which is utilized to model specific real-world applications. Powerful and elegant methods that use orthogonal basis functions to describe the solution as a double power series have been developed, namely the Bernstein, Legendre, Chebyshev, Hermite, and Bernoulli polynomials. Hence, the specified partial differential equation is reduced to a system of linear algebraic equations that can be solved using Mathematica®12. The techniques of the effective computational methods (DECMs) have been applied to solve some s
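As an illustration of the general idea (the exact equation, nonlocal condition, and basis used in the paper are not shown in this truncated abstract), such methods expand the solution as a double series in an orthogonal basis phi_i, which may be any of the named polynomial families:

```latex
% Illustrative ansatz only; \phi_i may be Bernstein, Legendre, Chebyshev,
% Hermite, or Bernoulli polynomials.
u(x,t) \;\approx\; u_{MN}(x,t) \;=\; \sum_{i=0}^{M}\sum_{j=0}^{N} a_{ij}\,\phi_i(x)\,\phi_j(t)
```

Substituting u_{MN} into the equation together with its nonlocal, initial, and boundary conditions, and enforcing the result at a grid of collocation points (x_m, t_n), yields a linear algebraic system A a = b for the coefficients a_{ij}, which can then be solved symbolically or numerically (e.g., in Mathematica®12).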
Abstract
The classical normal linear regression model is based on several hypotheses, one of which is the absence of heteroscedasticity. It is known that, in the presence of this problem, the ordinary least squares (OLS) estimators lose their desirable properties and, in addition, statistical inference becomes unreliable. Accordingly, we consider two alternatives: the first is generalized least squares, denoted by (GLS), and the second is robust covariance matrix estimation for the parameters estimated by (OLS). The (GLS) method is suitable and reliable if the estimators are efficient and the statistical inference is conducted on the basis of an acceptable
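A short sketch of both alternatives using statsmodels on simulated heteroscedastic data (the data and variance structure are illustrative only, not the study's dataset):

```python
# Hedged sketch of the two alternatives named above:
# (1) GLS with an assumed-known error-covariance structure,
# (2) OLS point estimates with a heteroscedasticity-robust ("sandwich") covariance matrix.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(1.0, 10.0, n)
X = sm.add_constant(x)
sigma2 = 0.5 * x ** 2                      # error variance grows with x (heteroscedasticity)
y = 2.0 + 3.0 * x + rng.normal(0.0, np.sqrt(sigma2))

# Alternative 1: GLS, passing the (assumed known) diagonal error variances.
gls_fit = sm.GLS(y, X, sigma=sigma2).fit()

# Alternative 2: keep OLS estimates but use a robust covariance matrix (HC1).
ols_robust = sm.OLS(y, X).fit(cov_type="HC1")

print(gls_fit.bse)      # GLS standard errors
print(ols_robust.bse)   # heteroscedasticity-robust OLS standard errors
```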