Abstract: The wavelet shrinkage estimator is an attractive technique for estimating nonparametric regression functions, but it is very sensitive to correlation in the errors. In this research, a low-degree polynomial model was used to address the boundary problem in wavelet shrinkage, together with flexible (level-dependent) threshold values for the case of correlated errors: these treat the coefficients at each resolution level separately, unlike global threshold rules, such as the VisuShrink, False Discovery Rate, Improved Thresholding, and SureShrink methods, which apply one threshold to all levels simultaneously. The study was conducted on real monthly data: the rates of juvenile theft crimes in Iraq, specifically the Baghdad governorate, and the risk ratios for those crimes for the years 2008-2018, with a sample size of 128. The study also showed an increase in the rate of juvenile theft crimes in recent years.
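The contrast between level-dependent and global thresholding can be sketched as follows. This is a minimal illustration using a hand-rolled Haar transform and soft thresholding in NumPy, not the paper's implementation; the function names and the MAD-based noise estimate per level are illustrative choices.

```python
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal Haar wavelet transform."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def soft_threshold(c, t):
    """Soft-threshold coefficients c at threshold t."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

def level_dependent_shrink(x, levels=3):
    """Shrink detail coefficients with a per-level universal threshold
    sigma_j * sqrt(2 log n_j), where sigma_j is estimated separately at
    each level from the median absolute deviation; this per-level scale
    is what makes the rule robust to correlated errors, unlike a single
    global (VisuShrink-style) threshold applied to all levels at once."""
    details = []
    approx = np.asarray(x, dtype=float)
    for _ in range(levels):
        approx, d = haar_dwt(approx)
        sigma_j = np.median(np.abs(d)) / 0.6745   # level-wise noise scale
        t_j = sigma_j * np.sqrt(2 * np.log(len(d)))
        details.append(soft_threshold(d, t_j))
    return approx, details
```

A global rule would instead compute one sigma (usually from the finest level only) and reuse the same threshold for every level.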
In this paper, we investigate the connection between hierarchical models and the power prior distribution in quantile regression (QReg). For a specific quantile, we develop an expression for the power parameter that calibrates the power prior distribution for quantile regression to a corresponding hierarchical model. In addition, we estimate the relation between the power parameter and the quantile level via the hierarchical model. Our proposed methodology is illustrated with a real data example.
The research aims to determine the optimal production mix when several conflicting objectives must be achieved simultaneously. The discussion therefore covers the concept of goal programming and approaches to solving it, the general formulation of the goal programming model, and finally the determination of the optimal production mix by applying a goal programming model to a hypothetical case.
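A weighted goal program for a production mix can be sketched as below. This is a toy, self-contained illustration solved by enumerating feasible integer mixes; all products, goals, weights, and resource figures are hypothetical, and a real model would use an LP solver rather than enumeration.

```python
# Weighted goal programming by enumerating feasible integer production mixes.
# All goals, weights, and resource figures below are hypothetical.

# Products A and B: profit per unit, labour hours per unit, machine hours per unit
profit  = {"A": 30, "B": 20}
labour  = {"A": 2,  "B": 1}
machine = {"A": 1,  "B": 3}

LABOUR_CAP, MACHINE_CAP = 100, 120          # hard resource constraints
GOALS = {"profit": 1500, "output": 60}      # soft (conflicting) goals
WEIGHTS = {"profit": 1.0, "output": 5.0}    # priority weights on shortfalls

def weighted_deviation(a, b):
    """Weighted sum of under-achievement deviations for a mix
    (a units of product A, b units of product B)."""
    p = profit["A"] * a + profit["B"] * b
    q = a + b
    d_profit = max(GOALS["profit"] - p, 0)   # only shortfalls are penalised
    d_output = max(GOALS["output"] - q, 0)
    return WEIGHTS["profit"] * d_profit + WEIGHTS["output"] * d_output

# Pick the feasible mix minimising the weighted deviations from the goals.
best = min(
    ((a, b) for a in range(0, 51) for b in range(0, 41)
     if labour["A"] * a + labour["B"] * b <= LABOUR_CAP
     and machine["A"] * a + machine["B"] * b <= MACHINE_CAP),
    key=lambda ab: weighted_deviation(*ab),
)
```

The deviation variables (here the `max(..., 0)` shortfalls) are the defining feature of goal programming: hard constraints stay binding while conflicting targets are traded off through the weights.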
In this paper we used frequentist and Bayesian approaches to the linear regression model to predict future observations of unemployment rates in Iraq. Parameters were estimated using the ordinary least squares method for the frequentist approach and the Markov Chain Monte Carlo (MCMC) method for the Bayesian approach; calculations were done in R. Two criteria, the root mean square error (RMSE) and the median absolute deviation (MAD), were used to compare the performance of the estimates. The analysis showed that the linear regression model under the Bayesian approach is better and can be used as an alternative to the frequentist approach. The results obtained indicate that unemployment rates will continue to increase over the next two decades.
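The MCMC step for a Bayesian linear regression can be sketched with a small Gibbs sampler. This is a minimal NumPy illustration with simulated data and deliberately simple (flat / inverse-gamma) priors, not the paper's exact model or its R implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_linreg(X, y, n_iter=2000, burn=500):
    """Gibbs sampler for y = X beta + e, e ~ N(0, sigma2), with a flat
    prior on beta and an inverse-gamma(1, 1) prior on sigma2
    (illustrative prior choices, not the paper's)."""
    n, p = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    beta_hat = XtX_inv @ X.T @ y            # OLS estimate, centre of beta draws
    betas, sig2 = [], 1.0
    for _ in range(n_iter):
        # beta | sigma2, y  ~  N(beta_hat, sigma2 (X'X)^-1)
        beta = rng.multivariate_normal(beta_hat, sig2 * XtX_inv)
        resid = y - X @ beta
        # sigma2 | beta, y  ~  Inv-Gamma(1 + n/2, 1 + ||resid||^2 / 2)
        sig2 = 1.0 / rng.gamma(1.0 + n / 2, 1.0 / (1.0 + resid @ resid / 2))
        betas.append(beta)
    return np.array(betas[burn:])

# Simulated trend data (hypothetical, for illustration only)
t = np.arange(40.0)
X = np.column_stack([np.ones_like(t), t])
y = 5.0 + 0.3 * t + rng.normal(0, 1.0, t.size)
draws = gibbs_linreg(X, y)
beta_post = draws.mean(axis=0)               # posterior mean (intercept, slope)
forecast = beta_post[0] + beta_post[1] * 45  # plug-in forecast beyond the sample
```

In practice the full posterior predictive distribution (drawing a new error for each sampled beta and sigma2) would be used for forecasting, rather than the plug-in posterior mean shown here.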
In this research we discuss parameter estimation and variable selection in the Tobit quantile regression model in the presence of multicollinearity. We use the elastic net as an important technique for dealing with both multicollinearity and variable selection. Based on the data, we propose a Bayesian Tobit hierarchical model with four levels of prior distributions. We assume both tuning parameters are random variables and estimate them together with the other unknown parameters in the model. A simulation study is used to demonstrate the efficiency of the proposed method, and we compare our approach with Alhamzwi (2014) and standard QReg. The results illustrate that our approach …
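The elastic net penalty referred to above combines the lasso (L1) and ridge (L2) penalties. A minimal frequentist sketch via proximal gradient descent is given below; it illustrates the penalty itself, not the paper's Bayesian hierarchical sampler, and the data and tuning values are synthetic.

```python
import numpy as np

def elastic_net(X, y, lam1=0.1, lam2=0.1, n_iter=500):
    """Elastic net by proximal gradient (ISTA):
    minimise ||y - X b||^2 / (2n) + lam1 ||b||_1 + (lam2/2) ||b||^2.
    lam1 drives variable selection (exact zeros); lam2 stabilises the
    fit under multicollinearity."""
    n, p = X.shape
    b = np.zeros(p)
    # step size from the Lipschitz constant of the smooth part
    L = np.linalg.norm(X, 2) ** 2 / n + lam2
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / n + lam2 * b
        z = b - grad / L
        b = np.sign(z) * np.maximum(np.abs(z) - lam1 / L, 0.0)  # soft threshold
    return b

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 6))
true_b = np.array([2.0, 0.0, -1.5, 0.0, 0.0, 1.0])   # sparse truth
y = X @ true_b + rng.normal(0, 0.5, 200)
b_hat = elastic_net(X, y, lam1=0.05, lam2=0.01)
```

In the Bayesian formulation of the paper, the same penalty arises from a scale-mixture prior on the coefficients, with the two tuning parameters treated as random and sampled alongside the rest.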
The current research aims to identify the coding mistakes contained in the first-grade reading textbook, where a coding mistake is defined as a failure to retrieve or identify information. The researcher diagnosed the mistakes and presented them to a group of first-grade teachers, who made the appropriate adjustments; percentages were used to show the agreement ratio on the mistakes and their adjustment. The researcher recommended a set of proposals and recommendations for future work to advance the scientific level.
In this paper, an efficient method for compressing color images is presented. It allows progressive transmission and zooming of the image without the need for extra storage. The proposed method is accomplished using a cubic Bezier surface (CBI) representation over wide areas of the image in order to prune the image component that shows large-scale variation. The produced cubic Bezier surface is then subtracted from the image signal to obtain the residue component, and a bi-orthogonal wavelet transform is applied to decompose the residue. Both scalar quantization and quad-tree coding are applied to the produced wavelet sub-bands. Finally, adaptive shift coding is applied to handle the remaining statistical redundancy and attain e…
Long memory analysis is one of the most active areas in econometrics and time series analysis, where various methods have been introduced to identify and estimate the long memory parameter in partially integrated time series. One of the most common models used to represent time series with long memory is the ARFIMA (Autoregressive Fractionally Integrated Moving Average) model, whose differencing order is a fractional number called the fractional parameter. To analyze and fit the ARFIMA model, the fractional parameter must be estimated. There are many methods for fractional parameter estimation. In this research, the estimation methods were divided into indirect methods, where the Hurst parameter is estimated first …
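One standard indirect route is to estimate the Hurst parameter H and then set the fractional parameter d = H - 1/2. The aggregated-variance estimator below is a minimal NumPy sketch of one such method, under illustrative choices of block sizes; it is not the specific estimator set studied in the paper.

```python
import numpy as np

def hurst_aggvar(x, scales=(2, 4, 8, 16, 32)):
    """Aggregated-variance estimate of the Hurst parameter H.
    For a long-memory series, Var(mean of blocks of size m) ~ m^(2H - 2),
    so a log-log regression of the block-mean variances on m has
    slope 2H - 2."""
    x = np.asarray(x, dtype=float)
    log_m, log_v = [], []
    for m in scales:
        k = len(x) // m
        blocks = x[: k * m].reshape(k, m).mean(axis=1)
        log_m.append(np.log(m))
        log_v.append(np.log(blocks.var()))
    slope = np.polyfit(log_m, log_v, 1)[0]
    return 1.0 + slope / 2.0

rng = np.random.default_rng(2)
white = rng.normal(size=4096)   # short-memory series: H should be near 0.5
H = hurst_aggvar(white)
d = H - 0.5                     # ARFIMA fractional differencing parameter
```

For a genuinely long-memory series, H would exceed 0.5 and the implied d would fall in (0, 0.5), the stationary long-memory range of the ARFIMA model.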
The last few years have witnessed a great and increasing use of medical image analysis tools, which help radiologists and doctors consult while making a particular diagnosis. In this study, we used the relationship between statistical measurements, computer vision, and medical images, along with a logistic regression model, to extract breast cancer imaging features. These features were used to discriminate the shape of a mass (fibroid vs. fatty) from the regions of interest (ROI) of the mass. The final fit of the logistic regression model showed that the most important variables clearly affecting breast cancer shape images are skewness, kurtosis, center of mass, and angle, with an AUC-ROC of …
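Features of this kind can be computed directly from an ROI with image moments. The sketch below is one plausible reading of those feature names (intensity skewness/kurtosis, intensity-weighted centroid, and orientation angle from second-order central moments); the paper's exact definitions may differ.

```python
import numpy as np

def roi_shape_features(roi):
    """Candidate features of a mass ROI (2-D grayscale array):
    skewness and kurtosis of the intensities, the intensity-weighted
    centre of mass, and the orientation angle derived from the
    second-order central image moments."""
    v = roi.ravel().astype(float)
    mu, sd = v.mean(), v.std()
    skewness = ((v - mu) ** 3).mean() / sd ** 3
    kurtosis = ((v - mu) ** 4).mean() / sd ** 4
    ys, xs = np.indices(roi.shape)
    total = roi.sum()
    cy, cx = (ys * roi).sum() / total, (xs * roi).sum() / total
    # second-order central moments, normalised by total intensity
    mu20 = ((xs - cx) ** 2 * roi).sum() / total
    mu02 = ((ys - cy) ** 2 * roi).sum() / total
    mu11 = ((xs - cx) * (ys - cy) * roi).sum() / total
    angle = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)   # orientation in radians
    return {"skewness": skewness, "kurtosis": kurtosis,
            "center": (cy, cx), "angle": angle}
```

Feature vectors of this form would then be fed to the logistic regression model as predictors of mass shape.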
The current research presents a comparative analysis of the estimation of Meixner process parameters via the wavelet packet transform. Of noteworthy relevance, it compares the moment method and the wavelet packet estimator for the four parameters of the Meixner process. The research focuses on finding the best threshold value using the square-root-log and modified square-root-log methods with wavelet packets in the presence of noise, to enhance the efficiency and effectiveness of denoising the financial asset market signal. In this regard, a simulation study compares the performance of moment estimation and wavelet packets for different sample sizes. The results show that wavelet p…