Proposition of New Ensemble Data-Intelligence Models for Surface Water Quality Prediction
Big data of different types, such as text and images, is rapidly generated from the internet and other applications. Dealing with such data using traditional methods is impractical because it varies widely in size, type, and processing-speed requirements. Data analytics has therefore become essential for big-data applications, since it extracts only the meaningful information from the raw data. This paper presents several innovative methods that use data analytics techniques to improve the analysis process and data management. Furthermore, this paper discusses how the revolution of data analytics based on artificial intelligence algorithms might provide
A descriptive (cross-sectional) study was conducted to assess the psychosocial domain of quality of life for (100) women who had undergone hysterectomy for non-malignant indications, 6-12 months post-operatively. The study was carried out in the consultation clinics of Al-Elwiya Maternity Hospital and Baghdad Teaching Hospital from January 5th, 2003 to July 10th, 2003. The results of the study show that hysterectomy achieved a highly successful outcome in terms of psychological and social adjustment for the women, with highly significant differences between quality of life (QoL) and some demographic characteristics
A database is an organized collection of data, stored and distributed in a way that allows clients to access it simply and conveniently. In the era of big data, however, traditional data-analytics methods may be unable to manage and process such large volumes. To develop an efficient way of handling big data, this work studies the use of the MapReduce technique to process big data distributed on the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
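The map-shuffle-reduce pattern the abstract refers to can be sketched in plain Python. This is an illustrative toy, not the paper's Hadoop pipeline: the channel names and readings are invented, and the reduce step (mean amplitude per channel) is a hypothetical EEG aggregation chosen for brevity.

```python
from collections import defaultdict

def map_phase(records):
    """Emit (channel, reading) key-value pairs from raw records."""
    for channel, reading in records:
        yield channel, reading

def shuffle(pairs):
    """Group intermediate pairs by key, as Hadoop's shuffle stage would."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce each channel's readings to a mean amplitude."""
    return {ch: sum(vals) / len(vals) for ch, vals in grouped.items()}

# Hypothetical EEG records: (channel, amplitude)
records = [("C3", 1.0), ("C4", 3.0), ("C3", 2.0), ("C4", 5.0)]
result = reduce_phase(shuffle(map_phase(records)))
print(result)  # {'C3': 1.5, 'C4': 4.0}
```

In a real Hadoop deployment the shuffle and the distribution of map/reduce tasks across nodes are handled by the framework; only the map and reduce functions are user code.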
Objective: To assess the quality assurance of maternal and child health care services in Baghdad City.
Methodology: A descriptive study was conducted throughout the period from November 28th, 2008 to October 10th, 2009. A simple random sample of (349) was selected using a probability sampling approach. The study sample was divided into four groups: (220) consumers, (35) medical staff, (72) nursing staff, and (22) organization structures (primary health care centers). Data were collected using assessment tools comprising four questionnaires with a total of (116) items. The study included assessment of organization structure. Data were colle
This research applies statistical methods to improve the quality of plastic cans produced at the State Company for Vegetable Oils (Al-Maamon factory), using the percentage-defective control chart (p-chart) with a fixed sample size. A sample of (450) cans per day for (30) days was inspected to determine the rejected product. The (p-chart) module of the WinQSB operations-research package was used to test whether the quality level required by the product specification is met and whether the process is statistically controlled.
The results show a high degree of accuracy of the program and of the mathematical operations (primary and secondary) used to draw the control-limit charts and to reject the statistically uncontrolled
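The p-chart construction described above follows a standard recipe: estimate the overall defective fraction, then set three-sigma limits around it. The sketch below uses the fixed sample size n = 450 from the study, but the daily defective counts are invented for illustration; they are not the factory's data.

```python
import math

n = 450                                     # fixed daily sample size (from the study)
defectives = [18, 22, 15, 20, 25, 17, 19, 21]  # hypothetical daily defective counts

# Center line: overall fraction defective across all samples
p_bar = sum(defectives) / (n * len(defectives))

# Three-sigma control limits for a proportion with fixed n
sigma = math.sqrt(p_bar * (1 - p_bar) / n)
ucl = p_bar + 3 * sigma
lcl = max(0.0, p_bar - 3 * sigma)           # LCL is floored at zero

for day, d in enumerate(defectives, 1):
    p = d / n
    status = "in control" if lcl <= p <= ucl else "out of control"
    print(f"day {day}: p = {p:.4f} ({status})")
```

Points falling outside [LCL, UCL] would be flagged as statistically uncontrolled and investigated, which is the role WinQSB's p-chart module plays in the study.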
In this study, we briefly review the ARIMA(p, d, q), EWMA, and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation structure of the data. We consider recursive estimation and prediction algorithms based on Bayes and Kalman-filtering (KF) techniques for correlated observations. We investigate the effect on the MSE of these procedures and compare them using generated data.
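A minimal sketch of the kind of comparison described above: one-step-ahead prediction on generated autocorrelated (AR(1)) data, comparing the recursive EWMA forecast against a naive running-mean forecast that ignores autocorrelation. The AR coefficient and smoothing constant are illustrative choices, not the values used in the study.

```python
import random

random.seed(0)
phi, n = 0.8, 500
y = [0.0]
for _ in range(n - 1):
    y.append(phi * y[-1] + random.gauss(0, 1))   # AR(1) series

lam = 0.3                     # EWMA smoothing constant (illustrative)
ewma = y[0]
errs_ewma, errs_mean = [], []
running_sum = 0.0
for t in range(1, n):
    running_sum += y[t - 1]
    mean_forecast = running_sum / t              # ignores autocorrelation
    errs_mean.append((y[t] - mean_forecast) ** 2)
    errs_ewma.append((y[t] - ewma) ** 2)         # EWMA one-step forecast
    ewma = lam * y[t] + (1 - lam) * ewma         # recursive EWMA update

mse_ewma = sum(errs_ewma) / len(errs_ewma)
mse_mean = sum(errs_mean) / len(errs_mean)
print(f"EWMA MSE: {mse_ewma:.3f}, mean-forecast MSE: {mse_mean:.3f}")
```

Because the series is positively autocorrelated, the recursive EWMA forecast tracks recent values and achieves a lower MSE than the running mean, which is the kind of effect the study measures across ARIMA, EWMA, and DLM/KF procedures.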
Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively, so it makes sense to pursue research on algorithms that use the network most effectively. The security aspect must also be considered, since the data being transmitted is vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same set of data, performing the two operations simultaneously. This is achieved by embedding encryption into compression algorithms, since both cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p
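The abstract describes embedding encryption inside the compression algorithm itself; the simplified sketch below instead runs the two stages back to back (zlib compression, then a toy repeating-key XOR cipher) to show the combined pipeline's shape. The keystream construction is illustrative only and is NOT cryptographically secure.

```python
import zlib

def keystream(key: bytes, length: int) -> bytes:
    """Toy repeating-key stream; a real design would use a proper stream cipher."""
    return bytes(key[i % len(key)] for i in range(length))

def compress_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Compress first (entropy coding), then XOR with the keystream."""
    compressed = zlib.compress(plaintext)
    ks = keystream(key, len(compressed))
    return bytes(c ^ k for c, k in zip(compressed, ks))

def decrypt_decompress(ciphertext: bytes, key: bytes) -> bytes:
    """Invert the pipeline: remove the keystream, then decompress."""
    ks = keystream(key, len(ciphertext))
    return zlib.decompress(bytes(c ^ k for c, k in zip(ciphertext, ks)))

msg = b"the quick brown fox jumps over the lazy dog " * 20
ct = compress_encrypt(msg, b"secret")
assert decrypt_decompress(ct, b"secret") == msg
print(len(msg), len(ct))
```

Compressing before encrypting is the standard ordering: ciphertext is statistically close to random, so encrypting first would destroy the redundancy that the entropy coder exploits.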
The research aims to identify ways of raising the quality of university education at the Middle Technical University in light of its application of the National Ranking project for the quality of Iraqi universities, in order to obtain an advanced position among Iraqi universities and thereby qualify for entry into worldwide university rankings. It does so by presenting the mechanism by which the National Ranking project for the quality of Iraqi universities was applied at the Middle Technical University and its formations, consisting of (5) technical colleges and (11) technical institutes.
The results of the application showed several observations: The most