Big data analysis is essential for modern applications in areas such as healthcare, assistive technology, intelligent transportation, and environment and climate monitoring. Traditional data mining and machine learning algorithms do not scale well with data size. Mining and learning from big data require time- and memory-efficient techniques, albeit at the cost of a possible loss in accuracy. We have developed a data aggregation structure to summarize data with a large number of instances and data generated from multiple sources. Data are aggregated at multiple resolutions, and the resolution provides a trade-off between efficiency and accuracy. The structure is built once, updated incrementally, and serves as a common data input for multiple mining and learning algorithms. Data mining algorithms are modified to accept the aggregated data as input. Hierarchical data aggregation serves as a paradigm under which novel …
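A minimal sketch of the multi-resolution idea, in Python: summaries are kept at several resolutions, with each coarser level merging pairs of buckets from the level below, so a caller trades accuracy for efficiency by choosing which level it reads. The class and its update policy are illustrative assumptions, not the structure described in the paper.

```python
# Sketch of a multi-resolution aggregation structure (hypothetical names;
# the paper's actual structure is not specified in the abstract preview).
import numpy as np

class HierarchicalAggregator:
    def __init__(self, data, levels=4):
        # Level 0 holds the finest summary; each coarser level merges pairs of buckets.
        self.levels = []
        counts, sums = np.ones(len(data)), np.asarray(data, dtype=float)
        for _ in range(levels):
            self.levels.append((counts, sums))
            if len(counts) < 2:
                break
            n = len(counts) // 2 * 2                 # drop a trailing odd bucket
            counts = counts[:n].reshape(-1, 2).sum(axis=1)
            sums = sums[:n].reshape(-1, 2).sum(axis=1)

    def update(self, value):
        # Incremental update: append to the finest level; coarser levels
        # could be rebuilt on demand (not shown here).
        counts, sums = self.levels[0]
        self.levels[0] = (np.append(counts, 1.0), np.append(sums, float(value)))

    def means(self, resolution):
        # Coarser resolution -> fewer, cheaper summaries; finer -> more accurate.
        counts, sums = self.levels[resolution]
        return sums / counts

agg = HierarchicalAggregator(np.random.rand(1000), levels=4)
print(len(agg.means(0)), len(agg.means(3)))   # 1000 fine summaries vs. 125 coarse ones
```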
The researcher focused on the importance of physical abilities in tennis, since this game is characterized by continuous movement and by dealing with different elements, and therefore requires the development of muscle strength, which plays an important role in skill performance in tennis. There are several methods for developing strength, including the flat hierarchical technique, which is one of the most common forms of training for developing muscle strength. As for the research problem, the researcher found a method that has an effect on the development of force; therefore, the researcher tried to diversify …
The purpose of this research is to define the main factors influencing the decisions of a management system on sensitive data in the cloud. A framework is proposed to enhance management information system decisions on sensitive information in a cloud environment. Structured interviews with several security experts working on cloud computing security were used to investigate the main objective of the framework and the suitability of the instrument, and a pilot study was conducted to test the instrument. The validity and reliability test results show that the study can be expanded and lead to final framework validation. The framework uses multiple levels, related to authorization, authentication, classification and identity anonymity, and save-and-verify, to enhance management …
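As a rough illustration of the multilevel idea named in the abstract (authorization, authentication, classification with identity anonymity, and save-and-verify), the Python sketch below chains those checks over a hypothetical request; every name and check here is an assumption for illustration, not the framework's actual design.

```python
# Hypothetical chain of the framework's levels; all checks are placeholders.
from dataclasses import dataclass
import hashlib

@dataclass
class Request:
    user: str
    token: str
    payload: str
    sensitivity: str   # e.g. "public", "internal", "sensitive"

def authenticate(req: Request) -> bool:
    return req.token == "valid-token"               # placeholder identity check

def authorize(req: Request) -> bool:
    return req.sensitivity != "sensitive" or req.user == "data-owner"

def anonymize(req: Request) -> Request:
    # Classification + identity anonymity: replace the identity with a hash.
    req.user = hashlib.sha256(req.user.encode()).hexdigest()[:12]
    return req

def save_and_verify(req: Request, store: dict) -> bool:
    digest = hashlib.sha256(req.payload.encode()).hexdigest()
    store[digest] = req.payload
    return hashlib.sha256(store[digest].encode()).hexdigest() == digest

store = {}
req = Request("data-owner", "valid-token", "patient record 42", "sensitive")
if authenticate(req) and authorize(req):
    print("stored and verified:", save_and_verify(anonymize(req), store))
```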
Survival analysis is a type of data analysis that describes the time period until the occurrence of an event of interest, such as death or another event important in determining what will happen to the phenomenon under study. There may be more than one endpoint for the event, in which case the setting is called competing risks. The purpose of this research is to apply a dynamic approach to the analysis of discrete survival time in order to estimate the effect of covariates over time, as well as to model the nonlinear relationship between the covariates and the discrete hazard function through the use of the multinomial logistic model and the multivariate Cox model. For the purpose of conducting the estimation process for both the discrete …
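The combination of a discrete hazard and a multinomial logistic model can be illustrated as follows: the data are expanded to person-period format and a multinomial logit of the event indicator (no event, cause 1, cause 2) is fitted on period dummies and covariates. The simulated data and the sklearn-based fit below are illustrative assumptions, not the authors' dynamic specification.

```python
# Sketch of a discrete-time competing-risks hazard via multinomial logit.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Simulated subjects: observed discrete time, cause (0 = censored, 1 or 2), one covariate.
n = 500
x = rng.normal(size=n)
time = rng.integers(1, 6, size=n)
cause = rng.choice([0, 1, 2], size=n, p=[0.3, 0.4, 0.3])

# Expand to person-period format: one row per subject per interval at risk.
rows = []
for i in range(n):
    for t in range(1, time[i] + 1):
        event = cause[i] if t == time[i] else 0      # 0 = survived this interval
        rows.append({"t": t, "x": x[i], "event": event})
pp = pd.DataFrame(rows)

# Discrete hazard: multinomial logit of the event indicator on period dummies + covariate.
dummies = pd.get_dummies(pp["t"], prefix="period")
X = dummies.assign(x=pp["x"]).to_numpy(dtype=float)
model = LogisticRegression(max_iter=1000).fit(X, pp["event"])

# Cause-specific discrete hazards for a new subject with x = 0.5 in period 3.
new = (pd.get_dummies(pd.Series([3]), prefix="period")
         .reindex(columns=dummies.columns, fill_value=0)
         .assign(x=0.5)
         .to_numpy(dtype=float))
print(model.predict_proba(new))  # P(no event), P(cause 1), P(cause 2) in that interval
```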
This study aims to introduce the concept of web-based information systems, one of the important topics that is usually overlooked by our organizations; to design a web-based information system for managing the customer data of Al-Rasheed Bank, as a unified information system specialized for the customers' banking dealings with the bank; and to provide a suggested model for applying a virtual private network as a tool to protect the data transmitted through the web-based information system.
This study is considered important because it deals with one of today's vital topics, namely how to make it possible to use a distributed information …
The estimation of the reliability function depends on the accuracy of the data used to estimate the parameters of the probability distribution. Because some data exhibit skewness, estimating the parameters and calculating the reliability function in the presence of skew requires a distribution flexible enough to deal with such data. This is the case for the data of Diyala Company for Electrical Industries, where positive skewness was observed in the data collected from the Power and Machinery Department; this required a distribution that handles those data and a search for estimation methods that accommodate this problem and lead to accurate estimates of the reliability function, …
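One way to handle positively skewed lifetime data, sketched below, is to fit a flexible right-skewed distribution by maximum likelihood and read the reliability function off the fitted survival function; the Weibull choice and the simulated data are assumptions, since the abstract does not name the distribution actually used.

```python
# Sketch: fit an assumed right-skewed distribution and compute R(t) = 1 - F(t).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
failure_times = rng.weibull(1.5, size=200) * 100.0   # simulated, positively skewed lifetimes

# Maximum-likelihood fit with location fixed at zero (lifetimes are non-negative).
shape, loc, scale = stats.weibull_min.fit(failure_times, floc=0)

def reliability(t):
    """R(t): probability the unit survives beyond time t under the fitted model."""
    return stats.weibull_min.sf(t, shape, loc=loc, scale=scale)

for t in (25, 50, 100):
    print(f"R({t}) = {reliability(t):.3f}")
```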
Regression analysis is used to study and predict the surface response, using design of experiments (DOE) as well as roughness calculation, through the development of a mathematical model. In this study, response surface methodology and the particular solution technique are used. Design of experiments applies a structured statistical approach to investigate the relationship between a set of parameters and their responses. Surface roughness is one of the important parameters and plays an important role. It was also found that cutting speed has only a small effect on surface roughness. This work focuses on all the considerations needed to capture the interaction between the parameters (position of influenc…
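A common way to realize such a DOE-based roughness model is a second-order response surface regression with interaction terms, as sketched below; the factors, data, and coefficients are illustrative assumptions, not the study's experiment.

```python
# Sketch of a quadratic response-surface regression for surface roughness.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)

# Coded factors from a hypothetical DOE: cutting speed, feed rate, depth of cut.
X = rng.uniform(-1, 1, size=(27, 3))
# Simulated roughness: weak speed effect, stronger feed effect, one interaction.
Ra = (2.0 + 0.1 * X[:, 0] + 0.8 * X[:, 1] + 0.4 * X[:, 2]
      + 0.3 * X[:, 1] * X[:, 2] + rng.normal(0, 0.05, 27))

# Full quadratic model: linear, interaction, and squared terms.
poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), Ra)

print(dict(zip(poly.get_feature_names_out(["speed", "feed", "depth"]),
               model.coef_.round(3))))
```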
Abstract. Hassan FM, Mahdi WM, Al-Haideri HH, Kamil DW. 2022. Identification of new species record of Cyanophyceae in Diyala River, Iraq based on 16S rRNA sequence data. Biodiversitas 23: 5239-5246. The biodiversity and water quality of the Diyala River require screening the water for biological contamination, because it is the only water source in Diyala City and is used for many purposes. This study aimed to identify new species records of Cyanophyceae and to emphasize the importance of using molecular methods alongside classical morphological approaches, particularly in the water-shrinkage aqua system. Five sites along the Diyala River were selected for Cyanophyceae identification. Morphological examination and 16S rRNA sequen…
This paper studies the motion control of a single-link flexible-joint robot using a hierarchical non-singular terminal sliding mode controller (HNTSMC). In comparison to the conventional sliding mode controller (CSMC), the proposed algorithm (NTSMC) not only preserves the characteristics of the conventional CSMC, such as easy implementation, guaranteed stability, and good robustness against system uncertainties and external disturbances, but also ensures faster, finite-time convergence of the system states to zero and freedom from singularities. The flexible-joint robot (FJR) is a two-degree-of-freedom (2DOF), nonlinear, underactuated system. The system is modeled here as a fourth-order system using the Lagrangian method. Based on t…
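For illustration, the sketch below applies a non-singular terminal sliding mode law to a simple second-order plant: the surface s = e + (1/β)·sign(ė)|ė|^(p/q) with 1 < p/q < 2 avoids the negative exponent that causes singularity in ordinary terminal sliding mode, and the switching gain dominates the disturbance bound. The plant, the gains, and the absence of the hierarchical surface are simplifications relative to the paper's fourth-order flexible-joint model.

```python
# Simplified NTSM controller on a second-order plant (illustrative gains).
import numpy as np

# Plant: x_ddot = f(x, x_dot) + u + d, with a bounded disturbance d.
f = lambda x, xd: -2.0 * x - 0.5 * xd
dist = lambda t: 0.2 * np.sin(3 * t)

# NTSM surface s = e + (1/beta) * sign(e_dot)|e_dot|^(p/q), with 1 < p/q < 2.
beta, p, q = 2.0, 5.0, 3.0
K = 1.5                                   # switching gain > disturbance bound

def ntsm_control(e, e_dot, x, x_dot):
    s = e + (1.0 / beta) * np.sign(e_dot) * abs(e_dot) ** (p / q)
    # Equivalent control + switching term (tanh softens chattering).
    u_eq = -f(x, x_dot) - beta * (q / p) * np.sign(e_dot) * abs(e_dot) ** (2.0 - p / q)
    return u_eq - K * np.tanh(10.0 * s)

# Regulate x to zero from x(0) = 1 with simple Euler integration.
dt, x, x_dot = 1e-3, 1.0, 0.0
for k in range(int(5.0 / dt)):
    t = k * dt
    u = ntsm_control(x, x_dot, x, x_dot)  # e = x - 0 for regulation to the origin
    x_ddot = f(x, x_dot) + u + dist(t)
    x_dot += dt * x_ddot
    x += dt * x_dot
print(f"x(5s) = {x:.4f}")                 # settles near zero in finite time
```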
In this research we analyze a number of indicators and their classifications related to the teaching process and the scientific level of graduate studies in the university, using analysis of variance for ranked data with repeated measurements instead of the ordinary analysis of variance. We reach several conclusions about the important classifications of each indicator that affect the teaching process.
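A standard rank-based analogue of repeated-measures ANOVA is the Friedman test; the short sketch below shows how it would be applied to one indicator measured repeatedly across subjects. The choice of the Friedman test and the data are assumptions, since the abstract does not name the exact procedure used.

```python
# Sketch of a rank-based ANOVA for repeated measurements (Friedman test).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Rows = subjects (e.g., departments), columns = repeated measurements of one indicator.
scores = rng.integers(1, 6, size=(12, 4)) + np.arange(4)   # mild trend across measurements

statistic, p_value = stats.friedmanchisquare(*scores.T)
print(f"Friedman chi-square = {statistic:.2f}, p = {p_value:.4f}")
```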
The aggregation capacity of human red blood cells lies between that of the non-aggregated erythrocyte and complete sedimentation. Since the ability to aggregate is attributed to many factors, such as the availability of macromolecules and plasma lipids, the role of the plasma lipid profile in RBC aggregation and sedimentation changes in normal and diabetic patients is studied. Serum lipid profile measurements (total cholesterol, triglycerides, HDL, LDL, VLDL) in normal and diabetic subjects were also made. The principle of measurement is to detect the laser light transmitted through a suspension of 10% diluted red blood cells in plasma. In all diabetics, rouleaux formation and the sedimentation rate are enhanced.