The Dagum Regression Model, introduced to address limitations in traditional econometric models, provides enhanced flexibility for analyzing data characterized by heavy tails and asymmetry, which is common in income and wealth distributions. This paper develops and applies the Dagum model, demonstrating its advantages over other distributions such as the Log-Normal and Gamma distributions. The model's parameters are estimated using Maximum Likelihood Estimation (MLE) and the Method of Moments (MoM). A simulation study evaluates both methods' performance across various sample sizes, showing that MoM tends to offer more robust and precise estimates, particularly in small samples. These findings provide valuable insights into the analysis of income inequality and wealth distribution using the Dagum model.
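As a point of reference for the abstract above, here is a minimal sketch of the three-parameter Dagum distribution (density, CDF, and inverse-CDF sampler). The parameterization (shape parameters `a`, `p`, scale `b`) is the standard textbook form, not necessarily the exact notation used in the paper, and the sample below is purely illustrative:

```python
import random

def dagum_pdf(x, a, b, p):
    """Dagum density: f(x) = (a*p/x) * (x/b)^(a*p) / ((x/b)^a + 1)^(p+1)."""
    return (a * p / x) * (x / b) ** (a * p) / (((x / b) ** a + 1) ** (p + 1))

def dagum_cdf(x, a, b, p):
    """Dagum CDF: F(x) = (1 + (x/b)^(-a))^(-p)."""
    return (1 + (x / b) ** (-a)) ** (-p)

def dagum_quantile(u, a, b, p):
    """Inverse CDF; feeding it uniform draws yields Dagum samples."""
    return b * (u ** (-1.0 / p) - 1.0) ** (-1.0 / a)

# Inverse-transform sampling: heavy-tailed "income-like" values
# (shape/scale values here are made up for the demo).
random.seed(0)
sample = [dagum_quantile(random.random(), 2.0, 1.0, 1.5) for _ in range(1000)]
```

A simulation study like the one described would draw many such samples at various sizes and compare the MLE and MoM estimates of (a, b, p) against the true values.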
This research presents an unusual approach to analyzing simple linear regression via linear programming, using the two-phase method known from Operations Research (“O.R.”). The estimates are obtained by solving an optimization problem after adding artificial variables Ri. A second method for analyzing simple linear regression is also introduced, in which the conditional median of y is considered by minimizing the sum of absolute residuals, instead of the conditional mean of y, which depends on minimizing the sum of squared residuals; this is called “median regression”. In addition, an iteratively reweighted least squares procedure based on the absolute residuals as weights is performed as another method to …
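The reweighting idea mentioned in the abstract, using absolute residuals to form the weights so that the fit approximates the L1 (median-regression) solution, can be sketched as follows. The data, iteration count, and tolerance are illustrative choices, not taken from the paper:

```python
def weighted_fit(x, y, w):
    """Closed-form weighted least squares for y = b0 + b1*x."""
    sw = sum(w)
    sx = sum(wi * xi for wi, xi in zip(w, x))
    sy = sum(wi * yi for wi, yi in zip(w, y))
    sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    b1 = (sw * sxy - sx * sy) / (sw * sxx - sx * sx)
    return (sy - b1 * sx) / sw, b1

def median_regression_irls(x, y, iters=50, eps=1e-6):
    """Approximate L1 (median) regression: reweight by 1/|residual| each pass."""
    w = [1.0] * len(x)              # first pass is ordinary least squares
    b0 = b1 = 0.0
    for _ in range(iters):
        b0, b1 = weighted_fit(x, y, w)
        w = [1.0 / max(abs(yi - b0 - b1 * xi), eps) for xi, yi in zip(x, y)]
    return b0, b1

# One gross outlier barely moves the median-regression line.
x = list(range(10))
y = [2 * xi + 1 for xi in x]
y[9] = 100                           # outlier
b0, b1 = median_regression_irls(x, y)
```

Because the outlier's large residual gives it a tiny weight, the fitted line stays close to y = 2x + 1, which is the robustness property median regression is chosen for.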
It has been shown in ionospheric research that calculation of the total electron content (TEC) is an important factor in global navigation systems. In this study, TEC calculation was performed over Baghdad city, Iraq, using a combination of two numerical methods, the composite Simpson and composite trapezoidal methods. TEC was calculated as the line integral of the electron density derived from the International Reference Ionosphere (IRI2012) and NeQuick2 models from 70 to 2000 km above the Earth's surface. The hour of the day, the day number of the year, and the sunspot number R12 were chosen as inputs for the calculation techniques to take into account the latitudinal, diurnal, and seasonal variation of TEC. The results of the latitudinal variation of TEC …
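The two quadrature rules named in the abstract can be sketched as follows. The Chapman-layer profile used in the demo is a generic stand-in electron-density model with made-up parameters, not the IRI2012 or NeQuick2 profiles the paper actually integrates:

```python
import math

def composite_trapezoid(f, a, b, n):
    """Composite trapezoidal rule with n subintervals."""
    h = (b - a) / n
    return h * (0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n)))

def composite_simpson(f, a, b, n):
    """Composite Simpson's rule; n must be even."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + i * h) for i in range(1, n, 2))
    s += 2 * sum(f(a + i * h) for i in range(2, n, 2))
    return s * h / 3

def chapman_density(h_km, nmax=1e12, hmax=350.0, scale=65.0):
    """Generic Chapman profile (electrons/m^3); parameters are illustrative only."""
    z = (h_km - hmax) / scale
    return nmax * math.exp(0.5 * (1 - z - math.exp(-z)))

# TEC = line integral of electron density from 70 km to 2000 km
# (altitudes converted from km to m for the final value in electrons/m^2).
tec = composite_simpson(chapman_density, 70.0, 2000.0, 1000) * 1e3
```

Simpson's rule is exact for cubic integrands on each pair of subintervals, which is why it typically needs far fewer nodes than the trapezoidal rule for smooth profiles like this one.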
The research aims to analyze the current financial crisis in Iraq by identifying its causes and then proposing some solutions that could help remedy the crisis, at the level of both expenditures and revenues. The study relies on the Federal General Budget Law of the Republic of Iraq for the fiscal year 2016 to obtain the data on current expenditures and revenues necessary to achieve the objective of the research. The results lead to a set of conclusions, the most important of which is that the causes of the current financial crisis in Iraq mainly stem from increased expenditures, especially current ones, and a lack of revenues, especially non-oil …
There is confusion between the concepts of honesty and credibility, with some arguing that their meaning is the same. ‘Credibility’ is derived from truth, which means evidence of honesty, while ‘honesty’ means not lying and matching reality. The study of credibility began globally at the end of the 1950s, prompted by the decline in newspaper readership, while in the Arab world it was first studied in 1987. Global studies find several meanings for the concept of ‘credibility’, such as accuracy, completeness, transfer of facts, impartiality, balance, justice, objectivity, trust, honesty, respect for the freedom of individuals and the community, and taking into account traditions and norms.
Credibility has two dimensions …
Often, especially in practical applications, it is difficult to obtain data untainted by problems such as non-constant error variance or other issues that impede the use of the usual method of ordinary least squares (OLS) for estimating the parameters of multiple linear models. This is why many statisticians resort to robust estimation methods, especially in the presence of outliers as well as the problem of error-variance instability. Two robust methods were adopted: the robust weighted least squares (RWLS) and the two-step robust weighted least squares (TSRWLS), and their performance was verified …
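The abstract does not spell out the weighting scheme behind the two-step robust fit, so the sketch below uses Huber weights computed from a MAD scale estimate, a common robust choice, purely as an illustration of the two-step pattern (OLS first, then one weighted refit that downweights large residuals):

```python
def ols(x, y):
    """Plain OLS for y = b0 + b1*x (step 1 of the two-step robust fit)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b1 = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
          / sum((xi - mx) ** 2 for xi in x))
    return my - b1 * mx, b1

def two_step_robust_wls(x, y, c=1.345):
    """Step 1: OLS. Step 2: Huber-weighted least squares on the OLS residuals."""
    b0, b1 = ols(x, y)
    r = [yi - b0 - b1 * xi for xi, yi in zip(x, y)]
    s = sorted(abs(ri) for ri in r)[len(r) // 2] / 0.6745 or 1.0  # MAD scale
    w = [1.0 if abs(ri) <= c * s else c * s / abs(ri) for ri in r]
    sw = sum(w)
    sx = sum(wi * xi for wi, xi in zip(w, x))
    sy = sum(wi * yi for wi, yi in zip(w, y))
    sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    b1 = (sw * sxy - sx * sy) / (sw * sxx - sx * sx)
    return (sy - b1 * sx) / sw, b1

# Demo: one gross outlier; the robust refit pulls the slope back toward 3.
x = list(range(10))
y = [3 * xi + 2 for xi in x]
y[0] = 50                            # outlier at x = 0
b0_r, b1_r = two_step_robust_wls(x, y)
```

A single reweighting pass already reduces the outlier's leverage; fully iterated schemes (as in RWLS) repeat the reweight-and-refit step until the coefficients stabilize.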
In this paper, deterministic and stochastic models are proposed to study the interaction of the Coronavirus (COVID-19) with host cells inside the human body. In the deterministic model, the value of the basic reproduction number R0 determines the persistence or extinction of COVID-19. If R0 < 1, one infected cell will transmit the virus to less than one cell; as a result, the person carrying the Coronavirus will get rid of the disease. If R0 > 1, the infected cells will be able to infect all cells that contain ACE receptors. The stochastic model proves that if the noise intensities are sufficiently large, then ultimate disease extinction may occur even though R0 > 1; these facts are also confirmed by computer simulation.
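The paper's specific equations are not reproduced in the abstract, so as a hedged illustration the sketch below uses a standard target-cell-limited within-host model (target cells T, infected cells I, free virus V) to show how the basic reproduction number governs extinction; all parameter values are made up for the demo:

```python
def simulate(beta, lam=10.0, d=0.1, delta=0.5, p=10.0, c=3.0,
             dt=0.01, steps=20_000):
    """Euler integration of:
       dT/dt = lam - d*T - beta*T*V
       dI/dt = beta*T*V - delta*I
       dV/dt = p*I - c*V
    Returns the final virus load V."""
    T, I, V = lam / d, 0.0, 10.0     # start at the uninfected equilibrium
    for _ in range(steps):
        dT = lam - d * T - beta * T * V
        dI = beta * T * V - delta * I
        dV = p * I - c * V
        T, I, V = T + dt * dT, I + dt * dI, V + dt * dV
    return V

def r0(beta, lam=10.0, d=0.1, delta=0.5, p=10.0, c=3.0):
    """Basic reproduction number of the model above: beta*(lam/d)*p / (delta*c)."""
    return beta * (lam / d) * p / (delta * c)
```

With a small infection rate (R0 < 1) the virus load decays toward zero; increasing `beta` until R0 > 1 instead lets the infection persist, matching the threshold behaviour described in the abstract.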
Currently, breast cancer is one of the most common cancers and a main cause of women's deaths worldwide, particularly in developing countries such as Iraq. Our work aims to predict whether a tumor is benign or malignant through models built using logistic regression and neural networks, and we hope it will help doctors detect the type of breast tumor. Four models were built using binary logistic regression and two different types of artificial neural networks, namely the multilayer perceptron (MLP) and the radial basis function (RBF) network. Evaluation of the validated and trained models was done using several performance metrics, such as accuracy, sensitivity, specificity, and the area under the receiver operating characteristic curve (AUC) …
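As a minimal sketch of the binary logistic regression component, here is a from-scratch fit on a toy one-feature dataset (a made-up "tumor size" feature standing in for the paper's measurements; nothing below reflects the actual dataset or tuning):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=1.0, epochs=3000):
    """Full-batch gradient descent on the log-loss for P(y=1|x) = sigmoid(w*x + b)."""
    w = b = 0.0
    n = len(xs)
    for _ in range(epochs):
        ps = [sigmoid(w * x + b) for x in xs]
        gw = sum((p - y) * x for p, x, y in zip(ps, xs, ys)) / n
        gb = sum(p - y for p, y in zip(ps, ys)) / n
        w, b = w - lr * gw, b - lr * gb
    return w, b

def accuracy(w, b, xs, ys):
    """Fraction of cases classified correctly at the 0.5 threshold."""
    preds = [1 if sigmoid(w * x + b) > 0.5 else 0 for x in xs]
    return sum(p == y for p, y in zip(preds, ys)) / len(ys)

# Toy "tumor size" feature: larger values labelled malignant (1).
xs = [i / 10 for i in range(10)]
ys = [0] * 5 + [1] * 5
w, b = fit_logistic(xs, ys)
```

Accuracy, sensitivity, and specificity as used in the abstract are all computed from the same thresholded predictions; the AUC additionally sweeps the threshold over [0, 1].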