Estimating the unknown parameters of the 2-D sinusoidal signal model is an important and difficult problem. Because it is difficult to estimate all of the parameters of this type of model at the same time, we propose a sequential nonlinear least squares method and a sequential robust M method, developed by applying the sequential approach suggested by Prasad et al. to estimate the unknown frequencies and amplitudes of the 2-D sinusoidal components, relying on the Downhill Simplex algorithm to solve the nonlinear equations and obtain the estimates of the nonlinear parameters, which represent the frequencies, and then on the least squares formula to estimate the linear parameters, which represent the amplitudes. In the sequential nonlinear least squares method the nonlinear equations are solved with the Newton-Raphson method, and the estimates of the frequencies and of the linear amplitude parameters are obtained at the same time. This method is compared with the sequential robust M method when the signal is affected by different types of noise, including normally distributed errors and heavy-tailed error distributions. Numerical simulations are performed to observe the performance of the estimation methods for different sample sizes and various levels of noise variance, using the mean square error (MSE) as the statistical measure. We conclude that, in general, the sequential nonlinear least squares method is more efficient than the others when the noise follows the normal or logistic distribution, but when the noise follows the Cauchy distribution, the sequential robust M method based on the bisquare weight function gives the best estimates.
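As an illustration of the sequential idea described above, the sketch below fits a single 2-D sinusoidal component y(m, n) = A cos(λm + µn) + B sin(λm + µn) + noise: the nonlinear parameters (the frequencies λ, µ) are estimated by minimizing the profiled residual sum of squares with SciPy's Nelder-Mead (Downhill Simplex) routine, and the linear parameters (the amplitudes A, B) are then recovered by ordinary least squares. The single-component model, the grid size, and all function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

def design(lam, mu, M, N):
    """Regressors cos(lam*m + mu*n) and sin(lam*m + mu*n) on an M x N grid."""
    m, n = np.meshgrid(np.arange(M), np.arange(N), indexing="ij")
    phase = lam * m + mu * n
    return np.column_stack([np.cos(phase).ravel(), np.sin(phase).ravel()])

def rss(freqs, y, M, N):
    """Residual sum of squares after profiling out the linear amplitudes."""
    X = design(freqs[0], freqs[1], M, N)
    coef, *_ = np.linalg.lstsq(X, y.ravel(), rcond=None)
    r = y.ravel() - X @ coef
    return r @ r

# Illustrative data: one 2-D sinusoid plus Gaussian noise
rng = np.random.default_rng(0)
M = N = 30
lam0, mu0, A0, B0 = 1.1, 0.6, 2.0, 1.0
m, n = np.meshgrid(np.arange(M), np.arange(N), indexing="ij")
y = A0 * np.cos(lam0 * m + mu0 * n) + B0 * np.sin(lam0 * m + mu0 * n)
y = y + rng.normal(scale=0.5, size=(M, N))

# Step 1: nonlinear parameters (frequencies) via Downhill Simplex (Nelder-Mead)
res = minimize(rss, x0=[1.0, 0.5], args=(y, M, N), method="Nelder-Mead")
lam_hat, mu_hat = res.x

# Step 2: linear parameters (amplitudes) via least squares at the fitted frequencies
X = design(lam_hat, mu_hat, M, N)
(A_hat, B_hat), *_ = np.linalg.lstsq(X, y.ravel(), rcond=None)
print(lam_hat, mu_hat, A_hat, B_hat)
```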
Optical Character Recognition (OCR) is the process of converting an image of text into a machine-readable text format. The classification of Arabic manuscripts in general is part of this field. In recent years, the processing of Arabic image databases by deep learning architectures has seen remarkable development; however, this remains insufficient to cover the enormous wealth of Arabic manuscripts. In this research, a deep learning architecture is used to address the problem of classifying handwritten Arabic letters. The method is based on a convolutional neural network (CNN) architecture acting as both automatic feature extractor and classifier. Considering the nature of the dataset images (binary images), the contours of the alphabet ...
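A minimal sketch of a CNN of the kind described, acting as both automatic feature extractor (convolution and pooling blocks) and classifier (dense softmax head). The input size (32x32 binarized images) and the number of classes (28 Arabic letters) are assumptions for illustration and are not the paper's exact architecture.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 28           # assumed: 28 base Arabic letters
INPUT_SHAPE = (32, 32, 1)  # assumed: binarized grayscale images

model = models.Sequential([
    layers.Input(shape=INPUT_SHAPE),
    # convolution + pooling blocks act as the automatic feature extractor
    layers.Conv2D(32, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    # dense head acts as the classifier
    layers.Flatten(),
    layers.Dropout(0.5),
    layers.Dense(128, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```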
In this paper, two methods for finding the optimal value of the adaptive smoothing constant are compared. This constant is used in adaptive single exponential smoothing (ASES).
The comparison is between a method that works in the time domain and another that works in the frequency domain when the data contain outliers, for a first-order autoregressive model AR(1) (Markov model), with stationary and non-stationary time series and different sample sizes.
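For concreteness, the sketch below implements one widely used adaptive single exponential smoothing scheme, the Trigg-Leach adaptive response rate, in which the smoothing constant at each step is the ratio of the smoothed error to the smoothed absolute error. It only illustrates how an adaptive smoothing constant is updated; it is not necessarily the time-domain or frequency-domain procedure compared in the paper.

```python
import numpy as np

def adaptive_ses(y, phi=0.2):
    """Adaptive single exponential smoothing (Trigg-Leach adaptive response rate).

    y   : 1-D array of observations
    phi : smoothing constant for the error-tracking signals (assumed value)
    """
    y = np.asarray(y, dtype=float)
    level = y[0]             # initial level
    e_s, ae_s = 0.0, 1e-8    # smoothed error and smoothed absolute error
    fitted = np.empty_like(y)
    for t, obs in enumerate(y):
        fitted[t] = level                     # one-step-ahead forecast
        err = obs - level
        e_s = phi * err + (1 - phi) * e_s
        ae_s = phi * abs(err) + (1 - phi) * ae_s
        alpha = min(1.0, abs(e_s) / ae_s)     # adaptive smoothing constant
        level = level + alpha * err
    return fitted

# Example on a simulated AR(1) series with one injected outlier
rng = np.random.default_rng(1)
x = np.zeros(100)
for t in range(1, 100):
    x[t] = 0.7 * x[t - 1] + rng.normal()
x[50] += 10.0
print(adaptive_ses(x)[:5])
```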
A seemingly unrelated regression (SUR) model is a special case of multivariate models in which the error terms of the equations are contemporaneously correlated. The generalized least squares (GLS) estimator is efficient because it takes the covariance structure of the errors into account, but it is also very sensitive to outliers. Robust SUR estimators can deal with outliers. We propose two robust methods for computing the estimator, S-estimation and FastSUR, and find that they significantly improve the quality of the SUR model estimates. In addition, the results gave the FastSUR method superiority over the S method in dealing with outliers contained in the data set, as it has lower MSE and RMSE and higher R-squared and adjusted R-squared ...
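For reference, here is a compact sketch of the classical feasible GLS estimator for a two-equation SUR system, the non-robust baseline that the S and FastSUR estimators harden against outliers: equation-by-equation OLS gives residuals, the residuals give an estimate of the contemporaneous error covariance Sigma, and the stacked system is re-estimated with weight inv(Sigma) kron I. The two-equation setup and all variable names are illustrative.

```python
import numpy as np

def sur_fgls(X_list, y_list):
    """Feasible GLS for a SUR system given per-equation design matrices and responses."""
    n_eq = len(X_list)
    n = X_list[0].shape[0]
    # Step 1: equation-by-equation OLS residuals
    resid = np.column_stack([
        y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
        for X, y in zip(X_list, y_list)
    ])
    # Step 2: contemporaneous error covariance estimate (this step is sensitive to outliers)
    sigma = resid.T @ resid / n
    # Step 3: GLS on the stacked system with weight inv(sigma) kron I_n
    X_big = np.zeros((n_eq * n, sum(X.shape[1] for X in X_list)))
    col = 0
    for j, X in enumerate(X_list):
        X_big[j * n:(j + 1) * n, col:col + X.shape[1]] = X
        col += X.shape[1]
    y_big = np.concatenate(y_list)
    W = np.kron(np.linalg.inv(sigma), np.eye(n))
    return np.linalg.solve(X_big.T @ W @ X_big, X_big.T @ W @ y_big)

# Illustrative two-equation system with contemporaneously correlated errors
rng = np.random.default_rng(2)
n = 200
e = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=n)
X1 = np.column_stack([np.ones(n), rng.normal(size=n)])
X2 = np.column_stack([np.ones(n), rng.normal(size=n)])
y1 = X1 @ [1.0, 2.0] + e[:, 0]
y2 = X2 @ [0.5, -1.0] + e[:, 1]
print(sur_fgls([X1, X2], [y1, y2]))
```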
Simulation Study
Abstract:
Robust statistics is known for its resistance to the errors that result from deviations from the assumed statistical properties (approximate unbiasedness and efficiency) for data drawn from a wide range of probability distributions, whether a normal distribution or a mixture of distributions with different standard deviations.
The power spectrum function plays a principal role in the analysis of stationary random processes ordered in time, whose random variables may be discrete or continuous; it measures the total power of the process as a function of frequency.
Estimation methods share with ...
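For reference, a standard form of the power spectrum of a discrete-time stationary process with autocovariance function γ(k), which is the quantity the estimation methods in question target:

```latex
f(\omega) \;=\; \frac{1}{2\pi} \sum_{k=-\infty}^{\infty} \gamma(k)\, e^{-i\omega k},
\qquad -\pi \le \omega \le \pi,
\qquad \gamma(0) \;=\; \int_{-\pi}^{\pi} f(\omega)\, d\omega ,
```

so that γ(0), the total variance (power) of the process, is recovered by integrating the spectrum over frequency.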
In this paper, the Monte Carlo simulation method was used to compare the robust circular S estimator with the circular least squares method, both when the data contain no outliers and when outliers are present, in two directions of contamination: the first with contamination at high-leverage points, representing contamination in the circular independent variable, and the second with contamination in the vertical variable, representing the circular dependent variable. Three comparison criteria were used: the median standard error (Median SE), the median of the mean squared errors (Median MSE), and the median of the mean cosines of the circular residuals (Median A(k)). It was concluded that the method of least squares is better than the ...
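The third criterion above is commonly based on the mean cosine of the circular residuals; one standard form is shown below (the exact expression used in the paper may differ):

```latex
A(k) \;=\; \frac{1}{n}\sum_{i=1}^{n} \cos\!\left(y_i - \hat{y}_i\right),
```

where y_i and ŷ_i are the observed and fitted circular responses in radians; values close to 1 indicate small circular residuals, and the median of A(k) over the simulation replications gives Median A(k).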
The aim of this paper is to evaluate the rate of contamination in soils by using an accurate numerical method as a suitable tool to evaluate the concentration of heavy metals in soil. In particular, 2-D interpolation methods are applied in models of the spread of the metals in different directions. The paper illustrates the importance of numerical methods in different applications, especially environmental contamination. Basically, there are many roles for approximating functions; the approximating function, namely its analytical expression, may take several forms, the most common being polynomials, which are the simplest and most easily implemented method of approximation. In this paper the divided difference formula is used and extended ...
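A short sketch of the Newton divided-difference construction mentioned above, written for one variable (2-D versions are typically built by applying it along each coordinate direction in turn); the sampling points and values are made up for illustration.

```python
import numpy as np

def divided_differences(x, y):
    """Return the Newton divided-difference coefficients f[x0], f[x0,x1], ..."""
    x = np.asarray(x, dtype=float)
    coef = np.array(y, dtype=float)
    n = len(x)
    for j in range(1, n):
        # after this update, coef[i] equals f[x_{i-j}, ..., x_i] for i >= j
        coef[j:] = (coef[j:] - coef[j - 1:-1]) / (x[j:] - x[:-j])
    return coef

def newton_eval(coef, x_nodes, t):
    """Evaluate the Newton-form polynomial at t using Horner-like nesting."""
    result = coef[-1]
    for c, xn in zip(coef[-2::-1], x_nodes[-2::-1]):
        result = result * (t - xn) + c
    return result

# Illustrative nodes: measured concentrations at four sampling distances
x_nodes = np.array([0.0, 1.0, 2.0, 3.0])
y_vals = [2.1, 2.9, 4.2, 7.5]
coef = divided_differences(x_nodes, y_vals)
print(newton_eval(coef, x_nodes, 1.5))   # interpolated value at distance 1.5
```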
Some nonlinear differential equations of fractional order are solved using a novel approach, the Sumudu transform and Adomian decomposition technique (STADM). To obtain the results of the given model, the Sumudu transformation and an iterative technique are employed. The suggested method has an advantage over alternative strategies in that it does not require additional resources or calculations. This approach works well, is easy to use, and yields good results. The solution graphs are plotted using MATLAB software, and the true solution of the fractional Newell-Whitehead equation is shown together with the approximate solutions of STADM. The results showed that our approach is a reliable and easy method for dealing with specific problems in ...
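For context, the Sumudu transform that STADM combines with the Adomian decomposition is defined, for admissible values of u, by

```latex
S\big[f(t)\big](u) \;=\; \int_{0}^{\infty} f(ut)\, e^{-t}\, dt .
```

In the decomposition step the nonlinear term of the equation is expanded in Adomian polynomials, and the solution is built iteratively as a series of components; the precise iteration used for the fractional Newell-Whitehead equation is the one given in the paper.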
In this research, we studied the non-homogeneous Poisson process, one of the most important statistical topics with a role in scientific development, as it is related to events that occur in reality and are modeled as Poisson processes because their occurrence depends on time, whether time changes or stays fixed. Our research clarifies the non-homogeneous Poisson process and uses one of its models, the exponentiated Weibull model with three parameters (α, β, σ), as a function to estimate the time rate of occurrence of earthquakes in Erbil Governorate, as the governorate is adjacent to two countries ...
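A small sketch of how a non-homogeneous Poisson process with a time-varying rate λ(t) can be simulated by thinning, using an illustrative exponentiated-Weibull-shaped intensity built from the distribution function F(t) = [1 - exp(-(t/σ)^β)]^α. The parameter values, the scaling, and the numerical derivative are assumptions for demonstration, not the fitted earthquake model from the paper.

```python
import numpy as np

def exp_weibull_cdf(t, alpha, beta, sigma):
    """Exponentiated-Weibull distribution function F(t) = [1 - exp(-(t/sigma)^beta)]^alpha."""
    return (1.0 - np.exp(-(t / sigma) ** beta)) ** alpha

def rate(t, alpha=1.5, beta=0.8, sigma=10.0, scale=50.0):
    """Illustrative NHPP intensity: numerical derivative of scale * F(t)."""
    h = 1e-5
    return scale * (exp_weibull_cdf(t + h, alpha, beta, sigma)
                    - exp_weibull_cdf(t, alpha, beta, sigma)) / h

def simulate_nhpp(rate_fn, t_end, lam_max, rng):
    """Simulate event times on (0, t_end] by thinning a homogeneous process of rate lam_max."""
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / lam_max)       # candidate from the dominating process
        if t > t_end:
            return np.array(times)
        if rng.uniform() < rate_fn(t) / lam_max:  # accept with probability lambda(t)/lam_max
            times.append(t)

rng = np.random.default_rng(3)
grid = np.linspace(1e-3, 30.0, 500)
lam_max = rate(grid).max() * 1.1                  # crude upper bound on the intensity
events = simulate_nhpp(rate, t_end=30.0, lam_max=lam_max, rng=rng)
print(len(events), events[:5])
```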
... Show MoreCanonical correlation analysis is one of the common methods for analyzing data and know the relationship between two sets of variables under study, as it depends on the process of analyzing the variance matrix or the correlation matrix. Researchers resort to the use of many methods to estimate canonical correlation (CC); some are biased for outliers, and others are resistant to those values; in addition, there are standards that check the efficiency of estimation methods.
In our research, we dealt with robust estimation methods that depend on the correlation matrix in the analysis process to obtain a robust canonical correlation coefficient, which is the method of Biwe
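A brief sketch of how canonical correlations are computed from a joint correlation matrix; this is exactly the place where a robust correlation estimate (for example a biweight-based one) can be substituted for the classical Pearson matrix, which is used here purely for illustration.

```python
import numpy as np

def canonical_correlations(R, p):
    """Canonical correlations from a joint (p+q) x (p+q) correlation matrix R.

    The first p rows/columns correspond to the X set, the rest to the Y set.
    Passing a robust correlation matrix gives robust canonical correlations.
    """
    Rxx, Rxy = R[:p, :p], R[:p, p:]
    Ryx, Ryy = R[p:, :p], R[p:, p:]
    M = np.linalg.solve(Rxx, Rxy) @ np.linalg.solve(Ryy, Ryx)
    eigvals = np.linalg.eigvals(M)
    # squared canonical correlations are the (real, nonnegative) eigenvalues of M
    return np.sqrt(np.sort(np.clip(eigvals.real, 0.0, 1.0))[::-1])

# Illustrative data: two sets of variables sharing one latent factor
rng = np.random.default_rng(4)
n, p = 300, 2
z = rng.normal(size=n)
X = np.column_stack([z + rng.normal(size=n), rng.normal(size=n)])
Y = np.column_stack([z + rng.normal(size=n),
                     rng.normal(size=n), rng.normal(size=n)])
R = np.corrcoef(np.column_stack([X, Y]), rowvar=False)   # classical correlation matrix
print(canonical_correlations(R, p))
```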
This research includes the study of paired (panel) data models with mixed random parameters, which contain two types of parameters, one random and the other fixed. The random parameter arises from differences in the marginal tendencies (slopes) of the cross sections, and the fixed parameter arises from differences in the fixed intercepts, with the random errors of each cross section exhibiting heteroscedasticity as well as first-order serial correlation. The main objective of this research is to use efficient methods suited to paired data in the case of small samples, and to achieve this goal, the feasible generalized least squares ...
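One common way to write a model of this kind, with a random slope component, cross-section-specific intercepts, heteroscedastic variances, and AR(1) errors (the notation is assumed for illustration, not taken from the paper):

```latex
y_{it} \;=\; \alpha_i + (\beta + \mu_i)\, x_{it} + \varepsilon_{it},
\qquad \varepsilon_{it} \;=\; \rho\, \varepsilon_{i,t-1} + u_{it},
\qquad \operatorname{Var}(u_{it}) \;=\; \sigma_i^{2},
```

where μ_i is the random deviation of the slope of cross section i, α_i is its fixed intercept, ρ is the first-order serial correlation coefficient, and σ_i² is the cross-section-specific error variance; feasible GLS then weights the observations by the estimated covariance structure.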