Planetary nebulae (PNe) represent a short phase in the life of stars with masses of (0.89–7) M☉. Several physical processes take place during the red giant phase of low- and intermediate-mass stars, including (1) the regular (early) wind and the envelope ejection, and (2) the thermal pulses during the asymptotic giant branch (AGB) phase. This paper briefly discusses how such processes affect the mass range of planetary nebula nuclei (cores) and their evolution, as well as the PN lifetime and fading time for the adopted masses. A synthetic model is adopted. The envelope mass of the star (M_eN) and the transition time (t_tr) are calculated for the parameter values M_eR = 1.5, 2, and 3 × 10⁻³ M☉. Another time scale is o…
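As a hedged illustration of the quantities named above (not the specific relations of the adopted synthetic model), one common estimate ties the transition time to the rate at which the residual envelope M_eR is consumed by hydrogen burning and the post-AGB wind:

```latex
% Generic sketch: transition time as envelope depletion time, assuming the
% residual envelope M_eR is consumed by hydrogen burning plus the remnant wind.
t_{\mathrm{tr}} \simeq \frac{M_{e}^{R}}{\dot{M}_{\mathrm{nuc}} + \dot{M}_{\mathrm{wind}}},
\qquad
\dot{M}_{\mathrm{nuc}} = \frac{L_{H}}{X\,E_{H}}, \quad E_{H}\approx 6\times10^{18}\ \mathrm{erg\,g^{-1}},
```

where L_H is the hydrogen-burning luminosity and X the envelope hydrogen mass fraction.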
Productivity estimation for a ready-mixed concrete batch plant is an essential tool for the successful completion of the construction process. Productivity is defined as the output of the system per unit of time. Usually, the actual productivity values of construction equipment on site are not consistent with the nominal ones. Therefore, it is necessary to make a comprehensive evaluation of the nominal productivity of equipment with respect to the factors affecting it, and then to re-evaluate it according to the actual values.
In this paper, the forecasting system employed is an Artificial Intelligence (AI) technique, represented by an Artificial Neural Network (ANN), used to establish a predictive model for estimating wet ready-mixed …
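As a minimal sketch of such an ANN regression model (the inputs here, such as truck-mixer count, haul distance, temperature, and crew size, are hypothetical placeholders, not the paper's actual factors):

```python
# Minimal sketch of an ANN productivity model on synthetic site records;
# the four input factors and the data-generating rule are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Columns: mixer trucks, haul distance (km), temperature (C), crew size.
X = rng.uniform([2, 1, 10, 4], [10, 30, 45, 12], size=(200, 4))
y = 40 + 3 * X[:, 0] - 0.8 * X[:, 1] - 0.3 * X[:, 2] + rng.normal(0, 2, 200)  # m^3/hr

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_train)

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(scaler.transform(X_train), y_train)
print("R^2 on held-out data:", model.score(scaler.transform(X_test), y_test))
```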
This study extracted the spatial and morphological properties of the studied basins using the Soil and Water Assessment Tool (SWAT) model linked to GIS, to estimate the amount of sediment and the flow rates entering the Haditha Dam reservoir. The aim of this study is to determine the amount of sediment coming from the valleys and flowing into the Haditha Dam reservoir over the 25-year period 1985–2010, its impact on the design lifetime of the reservoir, and the best ways to reduce sediment transport. The results indicated that the total amount of sediment coming from all the valleys is about 2.56 × 10⁶ tons. The maximum annual total sediment load was about 488.22 × 10³ tons, in the year 1988.
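From the reported totals, the mean annual sediment load and the ratio of the 1988 peak to that mean follow directly:

```latex
% Worked check on the reported figures over the 25-year record:
\bar{Q}_s = \frac{2.56\times10^{6}\ \mathrm{ton}}{25\ \mathrm{yr}}
          \approx 1.02\times10^{5}\ \mathrm{ton\,yr^{-1}},
\qquad
\frac{488.22\times10^{3}\ \mathrm{ton}}{1.02\times10^{5}\ \mathrm{ton}} \approx 4.8,
```

so the peak year carried roughly 4.8 times the long-term average load.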
A database is an organized, distributed collection of data arranged so that the user can access the stored data simply and conveniently. However, in the era of big data, traditional data-analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
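As a minimal pure-Python sketch of the map-reduce pattern that a Hadoop job of this kind follows (the channel names and values are hypothetical, and this stands in for, rather than reproduces, the paper's EEG job):

```python
# Illustration of map -> shuffle -> reduce, computing per-channel mean amplitude
# over EEG segments; segment data and channel labels are made up for the example.
from collections import defaultdict
from functools import reduce

segments = [("C3", [12.1, 9.8, 11.4]), ("C4", [8.2, 7.9]), ("C3", [10.0, 10.6])]

# Map phase: emit (channel, (sum, count)) pairs for each segment.
mapped = [(ch, (sum(vals), len(vals))) for ch, vals in segments]

# Shuffle phase: group intermediate pairs by key, as Hadoop does between map and reduce.
groups = defaultdict(list)
for ch, pair in mapped:
    groups[ch].append(pair)

# Reduce phase: combine partial (sum, count) pairs, then finish with the mean.
means = {}
for ch, pairs in groups.items():
    total, count = reduce(lambda a, b: (a[0] + b[0], a[1] + b[1]), pairs)
    means[ch] = total / count

print(means)  # {'C3': 10.78, 'C4': 8.05}
```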
In this paper, the bowtie method was utilized by a multidisciplinary team in the Federal Board of Supreme Audit (FBSA) for the purpose of managing corruption risks threatening the Iraqi construction sector. Corruption in Iraq is a widespread phenomenon that threatens to degrade society and halt the wheel of economic development, so it must be reduced through appropriate strategies. A total of eleven corruption risks were identified with the parties involved, analyzed using a probability and impact matrix, and ranked by priority. Bowtie analysis was conducted on the four risk factors with the highest scores for causing corruption in the planning stage. The number and effectiveness of the existing proactive measures …
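As a minimal sketch of the probability-impact scoring and ranking step (the risk names and the 1–5 scales here are hypothetical, not the FBSA team's actual register):

```python
# Hypothetical risk register entries as (name, probability, impact) on 1-5 scales.
risks = [
    ("Bid rigging in tendering", 4, 5),
    ("Falsified progress reports", 3, 4),
    ("Bribery in permit approval", 5, 4),
    ("Nepotism in contractor selection", 4, 3),
]

# Score = probability x impact; sort descending so high-priority risks surface first.
ranked = sorted(risks, key=lambda r: r[1] * r[2], reverse=True)
for name, p, i in ranked:
    print(f"{name}: P={p}, I={i}, score={p * i}")
```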
A novel design and implementation of a cognitive methodology for on-line auto-tuning of a robust PID controller in a real heating system is presented in this paper. The aim of the proposed work is to construct a cognitive control methodology that gives an optimal control signal to the heating system and achieves the following objective: fast and precise search efficiency in finding the on-line optimal PID controller parameters, so as to obtain the optimal output temperature response of the heating system. The cognitive methodology (CM) consists of three engines: a breeding engine based on the Routh-Hurwitz stability criterion, a search engine based on particle swarm optimization (PSO), and an aggregation-knowledge engine based on a cultural algorithm (CA).
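As a hedged sketch of the PSO search engine alone (the first-order plant model, gain bounds, and PSO constants below are illustrative assumptions; the breeding and cultural-algorithm engines are omitted):

```python
# PSO search over PID gains for a simple first-order heating plant, minimizing
# integral squared error (ISE); a sketch, not the paper's cognitive methodology.
import numpy as np

DT, STEPS, SETPOINT = 0.1, 300, 50.0

def ise(gains):
    """Simulate y' = (-y + u)/tau under PID control and return the ISE cost."""
    kp, ki, kd = gains
    y, integ, prev_err, cost, tau = 20.0, 0.0, SETPOINT - 20.0, 0.0, 5.0
    for _ in range(STEPS):
        err = SETPOINT - y
        integ += err * DT
        u = np.clip(kp * err + ki * integ + kd * (err - prev_err) / DT, 0.0, 100.0)
        y += DT * (-y + u) / tau          # first-order plant step
        prev_err, cost = err, cost + err * err * DT
    return cost

rng = np.random.default_rng(1)
n, lo, hi = 20, np.zeros(3), np.array([10.0, 5.0, 2.0])   # assumed gain bounds
pos = rng.uniform(lo, hi, (n, 3))
vel = np.zeros((n, 3))
pbest, pcost = pos.copy(), np.array([ise(p) for p in pos])
gbest = pbest[pcost.argmin()]

for _ in range(40):  # standard PSO velocity/position update
    r1, r2 = rng.random((n, 3)), rng.random((n, 3))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    cost = np.array([ise(p) for p in pos])
    improved = cost < pcost
    pbest[improved], pcost[improved] = pos[improved], cost[improved]
    gbest = pbest[pcost.argmin()]

print("Tuned (Kp, Ki, Kd):", gbest.round(3), "ISE:", pcost.min().round(2))
```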
The phenomenon of poverty is one of the most important phenomena facing the world at large. Despite the tremendous technological progress witnessed by mankind, and despite unprecedented levels of world economic production, poverty remains the greatest challenge facing the world. Statistics and studies have shown that poverty is caused by several problems (health, social, economic, educational, etc.). These problems are obstacles to obtaining employment opportunities, which leads first to the growth of unemployment and ultimately to the growth of poverty.
The results of a range of research in the field of psychology have confirmed that children from poor homes suffer from a high level of …
In this study, we briefly review the ARIMA(p, d, q), EWMA, and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation (AC) structure of the data. We consider recursive estimation and prediction algorithms based on Bayes and Kalman filtering (KF) techniques for correlated observations. We investigate the effect of these procedures on the MSE and compare them using generated data.
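As a sketch of the recursive machinery referred to above, the standard Kalman filter recursions for a generic DLM (not the study's specific model) are:

```latex
% DLM:  y_t = F_t' \theta_t + v_t,  \theta_t = G_t \theta_{t-1} + w_t,
% with v_t ~ N(0, V_t) and w_t ~ N(0, W_t); m_t, C_t are the posterior moments.
\begin{aligned}
a_t &= G_t m_{t-1}, & R_t &= G_t C_{t-1} G_t' + W_t && \text{(state prediction)}\\
f_t &= F_t' a_t,    & Q_t &= F_t' R_t F_t + V_t     && \text{(one-step forecast)}\\
m_t &= a_t + R_t F_t Q_t^{-1}\,(y_t - f_t), & C_t &= R_t - R_t F_t Q_t^{-1} F_t' R_t && \text{(update)}
\end{aligned}
```

The prediction error y_t − f_t drives the Bayesian update, which is what allows these procedures to track correlated observations recursively.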
Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively, so it makes sense to pursue research on algorithms that use the available network most effectively. It is also important to consider security, since the data being transmitted is vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same set of data, performing the two operations simultaneously. This is achieved by embedding encryption into compression algorithms, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p…
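As a minimal sketch of a compress-then-encrypt pipeline (zlib followed by AES-GCM from the third-party cryptography package), illustrating the combined operation rather than the paper's scheme of embedding encryption inside the entropy coder itself:

```python
# Compress, then encrypt; a generic sketch, not the paper's secure compression module.
import os
import zlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def seal(plaintext: bytes, key: bytes) -> bytes:
    """Compress then encrypt; prepend the random nonce needed for decryption."""
    nonce = os.urandom(12)
    return nonce + AESGCM(key).encrypt(nonce, zlib.compress(plaintext), None)

def open_sealed(blob: bytes, key: bytes) -> bytes:
    """Decrypt then decompress, reversing seal()."""
    nonce, ciphertext = blob[:12], blob[12:]
    return zlib.decompress(AESGCM(key).decrypt(nonce, ciphertext, None))

key = AESGCM.generate_key(bit_length=128)
blob = seal(b"some text to transmit " * 40, key)
assert open_sealed(blob, key) == b"some text to transmit " * 40
print(f"compressed + encrypted size: {len(blob)} bytes")
```

Compressing before encrypting is the usual ordering, since ciphertext is incompressible; a truly simultaneous scheme, as the abstract describes, must instead build secrecy into the coder's internal state.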