Nuclear emission rates for nucleon-induced reactions are calculated theoretically on the basis of the one-component exciton model, using a state density that follows the non-Equidistant Spacing Model (non-ESM). A fair comparison is made between state-density values obtained from approximation formulae of various orders, in addition to the zeroth-order formula corresponding to the ESM. Calculations were made for the 96Mo nucleus subjected to an (N,N) reaction at Emax = 50 MeV. The results showed that the non-ESM treatment of the state density significantly improves the emission rates calculated for the various exciton configurations. Three terms might suffice for a proper calculation, although the results kept changing even at ten terms; five terms was found to give the most appropriate compromise between calculation time and accuracy.
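As context for the zeroth-order (ESM) baseline mentioned above, the sketch below implements the standard Williams particle-hole state density and the density ratio that enters exciton-model emission rates. The paper's non-ESM correction series is not reproduced here; the single-particle level density g and the configuration values are illustrative assumptions only.

```python
from math import factorial

def pauli_correction(p, h):
    # standard Pauli correction A(p,h) for a one-component exciton gas
    return (p * p + h * h + p - h) / 4.0

def williams_density(p, h, E, g):
    """Zeroth-order (ESM) particle-hole state density, in 1/MeV.
    p, h: particle and hole numbers; E: excitation energy (MeV);
    g: single-particle level density (MeV^-1)."""
    n = p + h
    U = g * E - pauli_correction(p, h)
    if U <= 0.0:
        return 0.0
    return g * U ** (n - 1) / (factorial(p) * factorial(h) * factorial(n - 1))

# Illustrative numbers only: g ~ A/13 for 96Mo, E = 50 MeV, (p,h) = (2,1),
# with 10 MeV carried away by the emitted nucleon plus its binding energy.
g, E = 96 / 13.0, 50.0
ratio = williams_density(1, 1, E - 10.0, g) / williams_density(2, 1, E, g)
print(f"density ratio entering the emission rate: {ratio:.3e}")
```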
Background: Patients with non-rheumatic atrial fibrillation have a high risk of thromboembolism, especially ischemic stroke, usually arising from the left atrial appendage. Transoesophageal echocardiography provides useful information for risk stratification in these patients, as it detects thrombus in the left atrium or left atrial appendage. Objective: This study was conducted at Al-Kadhimiya Teaching Hospital to assess the prevalence of left atrial chamber thrombi in patients with chronic non-rheumatic atrial fibrillation using transoesophageal echocardiography and its clinical significance, as well as to verify the superiority of transoesophageal over transthoracic echocardiography in the detection of these abnormalities. Type of
This study relates to the estimation of a simultaneous equations system for the Tobit model, in which the dependent variables are limited; this affects the choice of a good estimator. Classical methods, if used in such a case, produce biased and inconsistent estimators, so we use new estimation methods instead: the Nelson-Olson method and the two-stage limited dependent variable (2SLDV) method, in order to obtain estimators that possess the properties of a good estimator.
That is, the parameters will be estimated
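Since the abstract stops short of the estimation details, the following is only a minimal sketch of the two-stage logic shared by the Nelson-Olson and 2SLDV approaches: a reduced-form OLS first stage, whose fitted values replace the endogenous regressor in a censored (Tobit) maximum-likelihood second stage. The data layout, censoring at zero, and function names are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def tobit_negloglik(theta, X, y):
    # theta = [beta..., log_sigma]; y is censored from below at zero
    beta, sigma = theta[:-1], np.exp(theta[-1])
    xb = X @ beta
    ll = np.where(y > 0,
                  norm.logpdf((y - xb) / sigma) - np.log(sigma),  # observed
                  norm.logcdf(-xb / sigma))                       # censored
    return -ll.sum()

def two_stage_tobit(y1, y2, X):
    """Stage 1: OLS reduced form for the endogenous variable y2.
    Stage 2: Tobit MLE for y1 with y2 replaced by its fitted values.
    Assumes the first column of X is a constant term."""
    y2_hat = X @ np.linalg.lstsq(X, y2, rcond=None)[0]    # stage 1
    Z = np.column_stack([X[:, 0], y2_hat])                # const + fitted y2
    theta0 = np.zeros(Z.shape[1] + 1)
    return minimize(tobit_negloglik, theta0, args=(Z, y1),
                    method="BFGS").x
```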
Abstract:
Time-series models often suffer from outliers that accompany the data-collection process for many reasons, and their presence may have a significant impact on the estimation of the parameters of the studied model. Obtaining highly efficient estimators is one of the most important stages of statistical analysis, so it is important to choose appropriate methods that yield good estimators. The aim of this research is to compare the ordinary estimators and the robust estimators of the parameters of
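The abstract is cut off before naming the models and estimators compared, so the sketch below is only a generic illustration of the ordinary-versus-robust contrast it describes: an AR(1) coefficient estimated by OLS and by a Huber M-estimator on a series contaminated with additive outliers. The simulation settings are invented for the demonstration.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Simulate an AR(1) series y_t = 0.6 * y_{t-1} + e_t, then inject outliers
n, phi = 300, 0.6
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.normal()
y[rng.choice(n, 6, replace=False)] += 12.0   # six large additive outliers

X = sm.add_constant(y[:-1])                  # regress y_t on y_{t-1}
ols = sm.OLS(y[1:], X).fit()
rob = sm.RLM(y[1:], X, M=sm.robust.norms.HuberT()).fit()

print("OLS   phi-hat:", round(ols.params[1], 3))   # distorted by outliers
print("Huber phi-hat:", round(rob.params[1], 3))   # closer to 0.6
```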
The research aims to analyse the current financial crisis in Iraq by identifying its causes and then proposing some solutions, on the level of both expenditures and revenues, that would help remedy the crisis. The Federal General Budget Law of the Republic of Iraq for the fiscal year 2016 was relied upon to obtain the data on current expenditures and revenues necessary to achieve the objective of the research. The results led to a set of conclusions, the most important of which is that the causes of the current financial crisis in Iraq belong mainly to increased expenditures, especially current ones, and to a shortfall in revenues, especially non-oil ones
Abstract:
The grey system model GM(1,1) is a time-series prediction model and the basis of grey theory. This research presents four methods for estimating the parameters of the grey model GM(1,1): the accumulative method (ACC), the exponential method (EXP), the modified exponential method (Mod EXP), and the particle swarm optimization method (PSO). These methods were compared on the basis of the mean square error (MSE) and the mean absolute percentage error (MAPE), and simulation was adopted to identify the best of the four methods, which was then applied to real data. These data represent the consumption rate of two types of oils a he
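The four estimation methods named above are not detailed in the abstract, so the sketch below implements only the textbook least-squares GM(1,1) as a baseline, together with the MSE and MAPE criteria used for the comparison; the sample series is invented.

```python
import numpy as np

def gm11_fit(x0):
    """Least-squares estimate of (a, b) in the GM(1,1) whitening
    equation dx1/dt + a*x1 = b, where x1 is the 1-AGO of x0."""
    x1 = np.cumsum(x0)                              # accumulated series
    z1 = 0.5 * (x1[1:] + x1[:-1])                   # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    return a, b

def gm11_fitted(x0):
    a, b = gm11_fit(x0)
    k = np.arange(len(x0))
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    return np.diff(x1_hat, prepend=0.0)             # back to original scale

x0 = np.array([21.3, 22.1, 23.4, 24.0, 25.2, 26.1])  # invented data
pred = gm11_fitted(x0)
print("MSE :", np.mean((x0 - pred) ** 2))
print("MAPE:", np.mean(np.abs((x0 - pred) / x0)) * 100, "%")
```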
Among metaheuristic algorithms, population-based algorithms are explorative search algorithms superior to local search algorithms in exploring the search space for globally optimal solutions. However, their primary downside is a low exploitative capability, which prevents the search from being refined in the neighborhood of promising solutions. The firefly algorithm (FA) is a population-based algorithm that has been widely used in clustering problems, but it suffers from premature convergence when no neighborhood search strategies are employed to improve the quality of clustering solutions in the neighborhood region and to explore the global regions of the search space. On the
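The abstract is cut off before describing its proposed remedy, so what follows is only a minimal sketch of the standard firefly algorithm applied to clustering: each firefly encodes a set of k centroids, brightness is the negative within-cluster SSE, and fireflies move toward brighter ones with attractiveness beta = beta0 * exp(-gamma * r^2). All parameter values are illustrative.

```python
import numpy as np

def sse(centroids, X):
    # within-cluster sum of squared errors for one candidate solution
    d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    return d2.min(axis=1).sum()

def firefly_clustering(X, k, n_fireflies=20, iters=100,
                       beta0=1.0, gamma=1.0, alpha=0.2, seed=0):
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    # each firefly encodes k centroids, seeded at random data points
    pop = X[rng.choice(len(X), size=(n_fireflies, k))].copy()
    light = np.array([-sse(c, X) for c in pop])          # brightness
    for _ in range(iters):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if light[j] > light[i]:                  # move i toward j
                    r2 = ((pop[i] - pop[j]) ** 2).sum()
                    beta = beta0 * np.exp(-gamma * r2)
                    pop[i] += beta * (pop[j] - pop[i]) \
                              + alpha * rng.normal(size=pop[i].shape)
                    light[i] = -sse(pop[i], X)
        alpha *= 0.97                                    # shrink random walk
    return pop[light.argmax()]                           # best centroid set
```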
In this paper we propose a new method for selecting the smoothing parameter of a kernel estimator used to estimate a nonparametric regression function in the presence of missing values. The proposed method is based on the golden ratio and on Surah Al-E-Imran in the Qur'an. Simulation experiments were conducted to study small-sample behavior, and the results showed the superiority of the proposed method over the competing method for selecting the smoothing parameter.
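The paper's actual selection rule is not spelled out in the abstract; assuming the golden-ratio connection amounts to a golden-section line search over the bandwidth, here is a sketch that picks the smoothing parameter of a Gaussian Nadaraya-Watson estimator by minimizing a leave-one-out cross-validation error. The search bounds and the kernel are assumptions.

```python
import numpy as np

def nw_loocv_error(h, x, y):
    """Leave-one-out CV error of a Gaussian Nadaraya-Watson estimator."""
    K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    np.fill_diagonal(K, 0.0)                 # leave each point out
    yhat = (K @ y) / K.sum(axis=1)
    return np.mean((y - yhat) ** 2)

def golden_section_bandwidth(x, y, lo=0.01, hi=2.0, tol=1e-4):
    invphi = (np.sqrt(5) - 1) / 2            # 1/golden ratio ~ 0.618
    a, b = lo, hi
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while b - a > tol:
        if nw_loocv_error(c, x, y) < nw_loocv_error(d, x, y):
            b, d = d, c                      # minimum lies in [a, d]
            c = b - invphi * (b - a)
        else:
            a, c = c, d                      # minimum lies in [c, b]
            d = a + invphi * (b - a)
    return (a + b) / 2
```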
Great scientific progress has led to widespread information, as information accumulates in large databases; it is therefore important to revise and organize this vast amount of data, with the purpose of extracting hidden information, or of classifying the data according to their relations with each other, in order to exploit them for technical purposes.
Data mining (DM) is an appropriate approach in this area. This research examines the K-Means algorithm for clustering data in an applied setting, where the effect on the variables can be observed by changing the sample size (n) and the number of clusters (K), as the sketch below illustrates.
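As a minimal illustration of varying n and K together, with synthetic data and scikit-learn standing in for whatever dataset and implementation the study used:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

# Observe how the clustering objective (inertia) reacts to the
# sample size n and the number of clusters K, as in the study.
for n in (100, 500, 2000):
    X, _ = make_blobs(n_samples=n, centers=4, random_state=0)
    for k in (2, 4, 8):
        km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
        print(f"n={n:5d}  K={k}  inertia={km.inertia_:10.1f}")
```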