Abstract The wavelet shrinkage estimator is an attractive technique for estimating nonparametric regression functions, but it is very sensitive to correlation in the errors. In this research, a low-degree polynomial model was used to address the boundary problem in wavelet shrinkage, together with level-dependent (flexible) threshold values for the case of correlated errors, which treat the coefficients at each resolution level separately, unlike global threshold values that treat all levels simultaneously, such as the VisuShrink, False Discovery Rate (FDR), Improved Thresholding, and SureShrink methods. The study was conducted on real monthly data on juvenile theft crime rates in Iraq, specifically the Baghdad governorate, and the risk ratios for those crimes for the years 2008-2018, with a sample size of 128. The study also showed an increase in the rate of juvenile theft crimes in recent years.
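The global (VisuShrink) thresholding that the abstract contrasts with level-dependent rules can be sketched as follows. This is a minimal illustration, not the paper's method: it uses a hand-written one-level Haar transform, simulated data of the same size as the study's sample (n = 128), the usual MAD noise estimate, and the universal threshold sigma * sqrt(2 log n) with soft thresholding; all signal and noise parameters are assumptions.

```python
import numpy as np

def haar_level(x):
    # one level of the orthonormal Haar wavelet transform
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    return a, d

def soft(d, lam):
    # soft-thresholding rule: shrink toward zero, kill small coefficients
    return np.sign(d) * np.maximum(np.abs(d) - lam, 0.0)

rng = np.random.default_rng(0)
n = 128
t = np.arange(n) / n
y = np.sin(2 * np.pi * t) + rng.normal(0, 0.3, n)   # toy signal + i.i.d. noise

a, d = haar_level(y)
sigma = np.median(np.abs(d)) / 0.6745        # robust (MAD) noise-level estimate
lam = sigma * np.sqrt(2 * np.log(n))          # universal (VisuShrink) threshold
d_shrunk = soft(d, lam)

# invert the one-level Haar transform to get the denoised signal
y_hat = np.empty(n)
y_hat[0::2] = (a + d_shrunk) / np.sqrt(2)
y_hat[1::2] = (a - d_shrunk) / np.sqrt(2)
```

With correlated errors the single threshold `lam` is misleading, which is why level-dependent thresholds, computed from the coefficients of each level separately, are preferred in that setting.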
Bootstrap is an important re-sampling technique that has received researchers' attention recently. The presence of outliers in the original data set may cause a serious problem for the classical bootstrap when the percentage of outliers in a resample is higher than in the original data. Many methods have been proposed to overcome this problem, such as Dynamic Robust Bootstrap for LTS (DRBLTS) and Weighted Bootstrap with Probability (WBP). This paper tries to show the accuracy of parameter estimation by comparing the results of both methods. The bias, MSE, and RMSE are considered. The criterion of accuracy is based on the RMSE value, since the method that provides a smaller RMSE value than the other is con…
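The classical bootstrap accuracy measures compared above (bias, MSE, RMSE) can be sketched for a simple OLS slope; this is an illustrative example with assumed data, not the DRBLTS or WBP procedures themselves.

```python
import numpy as np

rng = np.random.default_rng(1)
n, B = 50, 500                               # sample size and bootstrap replications
x = rng.normal(0, 1, n)
y = 2.0 * x + rng.normal(0, 1, n)            # assumed true slope = 2

def slope(x, y):
    # ordinary least-squares slope estimate
    xc, yc = x - x.mean(), y - y.mean()
    return np.sum(xc * yc) / np.sum(xc * xc)

theta_hat = slope(x, y)
boot = np.empty(B)
for b in range(B):
    idx = rng.integers(0, n, n)              # resample (x, y) pairs with replacement
    boot[b] = slope(x[idx], y[idx])

bias = boot.mean() - theta_hat               # bootstrap bias estimate
mse = np.mean((boot - theta_hat) ** 2)       # bootstrap MSE estimate
rmse = np.sqrt(mse)                          # the comparison criterion used above
```

A robust variant (e.g. DRBLTS) would replace the OLS fit with an outlier-resistant estimator and control which observations enter each resample.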
We study the Bayesian method in this paper using the modified exponential growth model, which is widely used to represent growth phenomena. We focus on three prior functions (informative, natural conjugate, and a prior that depends on previous experiments) for use in the Bayesian method. Most observations of growth phenomena depend on one another, which in turn leads to correlation between those observations; this problem, called autocorrelation, must be treated, and the Bayesian method was used to address it.
The goal of this study is to assess the effect of autocorrelation on estimation using the Bayesian method. F…
The Dagum regression model, introduced to address limitations in traditional econometric models, provides enhanced flexibility for analyzing data characterized by heavy tails and asymmetry, which is common in income and wealth distributions. This paper develops and applies the Dagum model, demonstrating its advantages over other distributions such as the Log-Normal and Gamma distributions. The model's parameters are estimated using Maximum Likelihood Estimation (MLE) and the Method of Moments (MoM). A simulation study evaluates both methods' performance across various sample sizes, showing that MoM tends to offer more robust and precise estimates, particularly in small samples. These findings provide valuable insights into the ana…
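The MLE step for the Dagum distribution can be sketched as follows, assuming SciPy is available. The Dagum CDF is F(x) = (1 + (x/b)^(-a))^(-p), so samples are drawn by inverting it, and the log-likelihood is maximized numerically with a log-parametrisation to keep all three parameters positive. The true parameter values and sample size here are assumptions for illustration, not the paper's simulation design.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
a, b, p = 3.0, 1.0, 2.0                          # assumed shape, scale, shape parameters
u = rng.uniform(size=500)
x = b * (u ** (-1.0 / p) - 1.0) ** (-1.0 / a)    # inverse-CDF sampling from Dagum(a, b, p)

def nll(theta):
    # negative log-likelihood; theta holds log(a), log(b), log(p)
    a_, b_, p_ = np.exp(theta)
    z = (x / b_) ** a_
    return -np.sum(np.log(a_ * p_) + (a_ * p_ - 1) * np.log(x)
                   - a_ * p_ * np.log(b_) - (p_ + 1) * np.log1p(z))

theta0 = np.log([2.0, 1.0, 1.0])                 # rough starting values
res = minimize(nll, x0=theta0, method="Nelder-Mead")
a_hat, b_hat, p_hat = np.exp(res.x)              # MLE on the original scale
```

A Method-of-Moments fit would instead match sample moments to the Dagum moment expressions, which exist only for orders below the product a*p.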
This research dealt with the analysis of murder crime data in Iraq in its temporal and spatial dimensions. It then focused on building a new model, with an algorithm that combines the characteristics of time series and spatial series, so that this model can predict more accurately than other models. We call this model the Combined Regression (CR) model; it merges the time series regression model with the spatial regression model into a single model that can analyze data in its temporal and spatial dimensions. Several models were used for comparison with the combined model, namely Multiple Linear Regression (MLR), Decision Tree Regression (DTR), and Random Forest Reg…
In this paper, we discuss the performance of Bayesian computational approaches for estimating the parameters of a logistic regression model. Markov Chain Monte Carlo (MCMC) algorithms were the base estimation procedure. We present two algorithms: Random Walk Metropolis (RWM) and Hamiltonian Monte Carlo (HMC). We also applied these approaches to a real data set.
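The Random Walk Metropolis sampler for a Bayesian logistic regression can be sketched as below. This is a minimal illustration on simulated data with an assumed N(0, 10^2) prior on each coefficient and an assumed proposal step size; it is not the paper's data set or tuning.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept + one covariate
beta_true = np.array([-0.5, 1.5])                       # assumed true coefficients
y = rng.binomial(1, 1 / (1 + np.exp(-X @ beta_true)))

def log_post(beta):
    # logistic log-likelihood plus independent N(0, 10^2) priors
    eta = X @ beta
    ll = np.sum(y * eta - np.log1p(np.exp(eta)))
    return ll - np.sum(beta ** 2) / (2 * 10.0 ** 2)

beta = np.zeros(2)
lp = log_post(beta)
draws = []
for _ in range(5000):
    prop = beta + rng.normal(0, 0.2, size=2)   # symmetric random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
        beta, lp = prop, lp_prop
    draws.append(beta.copy())
post = np.array(draws)[1000:]                  # discard burn-in draws
```

HMC replaces the blind random-walk proposal with gradient-guided trajectories, which typically mixes much better when the number of coefficients grows.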
In this paper, we establish the conditions for the occurrence of local bifurcations, such as saddle-node, transcritical, and pitchfork, at all equilibrium points of an eco-epidemiological model consisting of a prey-predator model with an SI (susceptible-infected) epidemic disease in the prey population only and a refuge-stage structure in the predators. It is observed that there is a transcritical bifurcation near the axial and predator-free equilibrium points; near the disease-free equilibrium point there is a saddle-node bifurcation; and near the positive (coexistence) equilibrium point there are a saddle-node bifurcation, a transcritical bifurcation, and a pitchfork bifurcation. Further investigation of the Hopf bifurcation near the coexistence equilibrium point…
In this research, kernel estimators (nonparametric density estimators) were relied upon to estimate the binary-response logistic regression, comparing the Nadaraya-Watson method with the local scoring algorithm. The optimal smoothing parameter (bandwidth) λ was estimated by cross-validation and generalized cross-validation; the optimal bandwidth λ has a clear effect on the estimation process and plays a key role in smoothing the curve so that it approaches the true curve. The goal of using the kernel estimator is to modify the observations so that we obtain estimators whose characteristics are close to the properties of the real parameters. Based on medical data for patients with chro…
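The Nadaraya-Watson estimator with a cross-validated bandwidth can be sketched as below; this uses simulated data, a Gaussian kernel, and leave-one-out cross-validation over an assumed bandwidth grid, standing in for the paper's medical data and its generalized cross-validation variant.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100
x = np.sort(rng.uniform(0, 1, n))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, n)   # assumed true curve + noise

def nw(x0, x, y, h):
    # Nadaraya-Watson estimator: kernel-weighted local average of y
    w = np.exp(-0.5 * ((x0[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

def loocv(h):
    # leave-one-out cross-validation score for bandwidth h
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    np.fill_diagonal(w, 0.0)                 # drop each point's own weight
    y_hat = (w @ y) / w.sum(axis=1)
    return np.mean((y - y_hat) ** 2)

grid = np.linspace(0.01, 0.3, 30)            # assumed candidate bandwidths
h_opt = grid[np.argmin([loocv(h) for h in grid])]
fit = nw(x, x, y, h_opt)                      # smoothed curve at the data points
```

Too small a bandwidth reproduces the noise, too large a bandwidth flattens the curve; cross-validation picks the compromise, which is the "clear effect" of λ noted above.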
Logistic regression is considered one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X), in which the dependent variable is a binary response taking two values (one when a specific event occurred and zero when that event did not happen), such as injured and uninjured, or married and unmarried. A large number of explanatory variables leads to the problem of multicollinearity, which makes the estimates inaccurate; the maximum likelihood method and the ridge regression method were used to estimate the binary-response logistic regression model by adopting the Jackkn…
The logistic regression model is one of the oldest and most common regression models, and it is a statistical method used to describe and estimate the relationship between a dependent random variable and explanatory random variables. Several methods are used to estimate this model, including the bootstrap method, an estimation method based on the principle of sampling with replacement: a resample of (n) elements is drawn randomly with replacement from the (N) original data points. It is a computational method used to determine the accuracy of statistical estimates, and for this reason it was used to find more accurate estimates. The ma…
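The pair-resampling bootstrap for logistic regression coefficients can be sketched as follows; the data, coefficient values, and number of replications are assumptions for illustration, and the fit uses a plain Newton-Raphson (IRLS) step with a small ridge term and clipped linear predictor purely for numerical safety.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 150
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept + one covariate
beta_true = np.array([0.5, -1.0])                       # assumed true coefficients
y = rng.binomial(1, 1 / (1 + np.exp(-X @ beta_true)))

def fit_logit(X, y, iters=25):
    # maximum-likelihood fit via Newton-Raphson (IRLS)
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        eta = np.clip(X @ beta, -30, 30)                # guard against overflow
        p = 1 / (1 + np.exp(-eta))
        W = p * (1 - p)
        H = X.T @ (X * W[:, None]) + 1e-6 * np.eye(X.shape[1])
        beta = beta + np.linalg.solve(H, X.T @ (y - p))
    return beta

B = 200
boot = np.empty((B, 2))
for b in range(B):
    idx = rng.integers(0, n, n)              # draw n rows with replacement
    boot[b] = fit_logit(X[idx], y[idx])      # refit on each bootstrap sample

se = boot.std(axis=0)                        # bootstrap standard errors
```

The spread of the bootstrap replications (`se`, or percentile intervals built from `boot`) is the accuracy measure the abstract refers to.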