Abstract. Planetary nebulae (PN) represent a short phase in the life of stars with masses of 0.89–7 M☉. Several physical processes take place during the red giant phase of low- and intermediate-mass stars, including: (1) the regular (early) wind and the envelope ejection, and (2) the thermal pulses during the asymptotic giant branch (AGB) phase. This paper briefly discusses how such processes affect the mass range of PN nuclei (cores), their evolution, and the PN lifetime and fading time for the adopted masses. A synthetic model is adopted. The envelope mass of the star (M_eN) and the transition time (t_tr) are calculated for the parameter values M_eR = 1.5, 2, and 3×10⁻³ M☉. Another time scale of capital importance for understanding PN and their nuclei is the fading time (t_f). The results indicate that t_tr < t_PN for each observed nebula, and that the fading time is sensitive to the core mass (M_H) of the star: a 1.2 M☉ core takes only about 25 yr to fade, while a 0.66 M☉ core takes about 4715 yr. The calculations show that t_tr increases with increasing final mass (M_f). The initial nebular radius also increases with M_f and thus correlates with the location of the nucleus on the HR diagram.
The objective of the research is to clarify grants and aid as a conceptual input, to identify the factors behind obtaining them and their objectives, and to state the need to produce accounting information that enhances financial reporting related to grants and assistance, in particular by presenting the accounting treatments provided by the unified accounting system and identifying the shortcomings of that system. The accounting requirements of IAS 20 are presented in order to limit the variation of treatments, with application to the economic unit (the research sample).
The study reached a set of conclusions, the most important of which is the absence of an accounting base in Iraq that determines the basi…
Obtaining computational models of the functioning of the brain gives us a chance to understand brain functionality thoroughly. This would help the development of better treatments for neurological illnesses and disorders. We created a cortical model in the Python language using the Brian simulator, which is specialized in simulating neuronal connections and synaptic interconnections. The dynamic connection model has multiple parameters in order to ensure an accurate simulation (Bowman, 2016). We concentrated on the connection weights and studied their effect on the interactivity and connectivity of the cortical neurons within the same cortical layer and across multiple layers. As synchronization helps us to mea…
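The role of connection weights described in this abstract can be illustrated with a toy network. The sketch below is a hypothetical, pure-Python leaky integrate-and-fire model (not the authors' Brian code); the single `weight` parameter, the drive constant, and the thresholds are all illustrative assumptions:

```python
import random

def simulate_lif(n=10, steps=200, weight=0.5, dt=1.0, seed=0):
    """Toy leaky integrate-and-fire network: every spike adds
    weight/n to each other neuron's membrane potential."""
    rng = random.Random(seed)
    tau, v_thresh, v_reset = 20.0, 1.0, 0.0
    v = [rng.uniform(0.0, 0.5) for _ in range(n)]
    spike_count = 0
    for _ in range(steps):
        spiked = []
        for i in range(n):
            # leak toward rest plus a small constant drive
            v[i] += dt * (-v[i] / tau + 0.06)
            if v[i] >= v_thresh:
                spiked.append(i)
        for i in spiked:
            v[i] = v_reset
            spike_count += 1
            for j in range(n):  # all-to-all excitatory coupling
                if j != i:
                    v[j] += weight / n
    return spike_count

# Stronger coupling yields more network spikes in this toy model.
low, high = simulate_lif(weight=0.1), simulate_lif(weight=0.9)
```

In this simplified setting, raising `weight` increases the total spike count, mirroring the kind of weight-dependent interactivity the study examines with Brian across cortical layers.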
Problem-solving methods and mechanisms contribute to facilitating human life by providing tools to solve simple and complex daily problems. These mechanisms have become essential tools for professional designers and design students in solving design problems.
This research dealt with one of those mechanisms, the substance-field model, as it has been noted that this mechanism is characterized by the difficulty of its application, which formed the main research problem. An analysis of this problem was applied to home gardens (the sub-problem of the research), and a solution was then found to address it. The researcher used the 3dsmax program to implement the proposed design.
The most important research res…
The city has a normal natural state, and man has a usual movement, change, and search for the new. The city likewise undergoes continual change and transformation in its temporal, spatial, and qualitative (size) structures. The city has a solid memory diving into the past and the future while reflecting the real present; this memory has temporal layers that turn into a real material place, making the city a set of accumulated, overlapping circles that are hard to break up. It broadcasts a locked-up temporal density for which there is no precise visual record; it is rather like a ((social record)) that evaluates the non-visual relationships between the components and parts of the city (community and form) in a quiet visual exhibition, with transformation and change inside.
This paper aims to plan the production of the electrical distribution converter (400 KV/11) for one month at Diyala Public Company, with more than one goal for the decision-maker in a fuzzy environment. The fuzzy demand was forecast using a fuzzy time series model. The fuzzy lead time for the raw materials involved in the production of the electrical distribution converter (400 KV/11) was addressed using a fuzzy inference matrix, implemented in Matlab. Since the decision-maker has more than one goal, a goal programming mathematical model was created, which aims to achieve two goals: the first is to reduce the total production costs of the electrical distribution converter (400 KV/11), and th…
The education sector suffers from many problems, including the scarcity of schools that can absorb the increasing number of students in light of the rising population growth rate; some regions suffer from a lack of new schools, or of expansion of existing schools to increase their capacity, so that attention is required. The research sought to identify the level of maturity of project management at the research site (the Building Department in Al-Karkh I / Ministry of Education), it being responsible for educational projects and their implementation. To that end, the ten knowledge areas of the PMBOK guide to project management were adopted according to the PM3 model (one of the models of maturity…
It is considered one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X), where the dependent variable is a binary response taking two values (one when a specific event occurred and zero when that event did not happen), such as (injured and uninjured, married and unmarried). A large number of explanatory variables leads to the emergence of the multicollinearity problem, which makes the estimates inaccurate. The maximum likelihood method and the ridge regression method were used in estimating a binary-response logistic regression model by adopting the Jackknife…
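The jackknife named at the end of this abstract is a leave-one-out resampling scheme. A minimal generic sketch (illustrative only, not the paper's ridge-logistic estimator; the `estimator` argument is any statistic of the sample) is:

```python
import math

def jackknife(data, estimator):
    """Leave-one-out jackknife: returns (plain estimate,
    bias-corrected estimate, jackknife standard error)."""
    n = len(data)
    theta_full = estimator(data)
    # n leave-one-out replicates of the statistic
    thetas = [estimator(data[:i] + data[i + 1:]) for i in range(n)]
    theta_dot = sum(thetas) / n
    bias = (n - 1) * (theta_dot - theta_full)
    var = (n - 1) / n * sum((t - theta_dot) ** 2 for t in thetas)
    return theta_full, theta_full - bias, math.sqrt(var)

mean = lambda xs: sum(xs) / len(xs)
est, corrected, se = jackknife([2.0, 4.0, 6.0, 8.0], mean)
```

For the sample mean, the jackknife standard error coincides with the classical s/√n, which makes the sketch easy to check by hand; in the paper's setting the same replication idea is applied to the regression coefficient estimates instead.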
This paper introduces a non-conventional approach with multi-dimensional random sampling to solve a cocaine abuse model with statistical probability. The mean Latin hypercube finite difference (MLHFD) method is proposed for the first time via hybrid integration of the classical numerical finite difference (FD) formula with the Latin hypercube sampling (LHS) technique, to create a random distribution for the model parameters, which are dependent on time t. The LHS technique gives the MLHFD method the advantage of producing fast variation of the parameter values over a number of multidimensional simulations (100, 1000, and 5000). The generated Latin hypercube sample, which is random or non-deterministic in nature, is further integrated with the FD method t…
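The LHS ingredient of the MLHFD method is straightforward to sketch on its own. The following hypothetical generator (not the authors' code; the unit hypercube and stratum count are illustrative) splits each parameter dimension into equal strata and draws exactly one point per stratum:

```python
import random

def latin_hypercube(n_samples, n_dims, seed=0):
    """Latin hypercube sample on [0,1]^d: each dimension is split
    into n_samples equal strata, and each stratum is hit once."""
    rng = random.Random(seed)
    cols = []
    for _ in range(n_dims):
        # one uniform point inside each stratum, then shuffle strata
        pts = [(k + rng.random()) / n_samples for k in range(n_samples)]
        rng.shuffle(pts)
        cols.append(pts)
    return [tuple(col[i] for col in cols) for i in range(n_samples)]

pts = latin_hypercube(5, 2)
# every dimension has exactly one point in each stratum [k/5, (k+1)/5)
```

In an MLHFD-style pipeline, each row of such a sample would supply one parameter set to a deterministic FD solve, and the solutions would then be averaged over the 100–5000 simulations the abstract mentions.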
In this paper, we provide a proposed method to estimate missing values of the explanatory variables for a non-parametric multiple regression model and compare it with the arithmetic-mean imputation method. The idea of this method is based on employing the causal relationship between the variables to find an efficient estimate of the missing value. We rely on the kernel estimate given by the Nadaraya–Watson estimator, and on least squares cross-validation (LSCV) to estimate the bandwidth, and we use a simulation study to compare the two methods.
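As a rough illustration of the two ingredients named here, the sketch below implements a Gaussian-kernel Nadaraya–Watson estimator and a leave-one-out LSCV score for bandwidth selection. It is a minimal stand-in under assumed toy data, not the paper's full imputation procedure:

```python
import math

def nadaraya_watson(x0, xs, ys, h):
    """Nadaraya-Watson estimate at x0: kernel-weighted average
    of the responses, Gaussian kernel with bandwidth h."""
    w = [math.exp(-0.5 * ((x0 - x) / h) ** 2) for x in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

def lscv(h, xs, ys):
    """Leave-one-out least-squares cross-validation score for h."""
    err = 0.0
    for i in range(len(xs)):
        xs_i = xs[:i] + xs[i + 1:]
        ys_i = ys[:i] + ys[i + 1:]
        err += (ys[i] - nadaraya_watson(xs[i], xs_i, ys_i, h)) ** 2
    return err / len(xs)

# toy data and a small candidate grid for the bandwidth
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.1, 0.9, 2.1, 2.9, 4.2]
best_h = min([0.2, 0.5, 1.0, 2.0], key=lambda h: lscv(h, xs, ys))
```

In the proposed imputation scheme, a missing explanatory value would be replaced by such a kernel-weighted estimate computed from the complete cases, with the bandwidth chosen by minimizing the LSCV score.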