The Weibull distribution is closely related to the Type III extreme value distribution within the Generalized Extreme Value (GEV) family, and it plays a crucial role in modeling extreme events in fields such as hydrology, finance, and the environmental sciences. Bayesian methods are well suited to estimating the parameters of this distribution because they can incorporate prior knowledge and handle small sample sizes effectively. In this research, we compare several shrinkage Bayesian estimation methods based on the squared error and linear exponential loss functions, evaluated by Monte Carlo simulation. The performance of these methods is assessed in terms of accuracy and computational efficiency in estimating the scale parameter of the Weibull distribution. To evaluate their performance, we generate simulated datasets with different sample sizes and varying parameter values. A pre-estimation shrinkage technique is also proposed to enhance the precision of estimation. The simulation experiments show that the Bayesian shrinkage estimator and the shrinkage pre-estimation method under the squared error loss function outperform the other methods, giving the smallest mean squared error. Overall, our findings highlight the advantages of shrinkage Bayesian estimation methods for the proposed distribution. Researchers and practitioners in fields that rely on extreme value analysis can benefit from these findings when selecting appropriate Bayesian estimation techniques for modeling extreme events accurately and efficiently.
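The kind of Monte Carlo comparison this abstract describes can be sketched in a few lines. The estimator below is a generic linear shrinkage of the closed-form scale MLE (shape assumed known) toward a prior guess; the weight `k`, the prior guess, and all parameter values are illustrative assumptions, not the paper's actual estimators or priors:

```python
import numpy as np

rng = np.random.default_rng(0)

def mse_comparison(n, shape=2.0, scale=1.5, prior_guess=1.4, k=0.5, reps=2000):
    """Monte Carlo MSE of a plain estimator vs. a linear shrinkage
    estimator of the Weibull scale parameter (shape assumed known)."""
    plain_err, shrink_err = 0.0, 0.0
    for _ in range(reps):
        x = scale * rng.weibull(shape, size=n)
        # closed-form MLE of the scale when the shape is known:
        # X**shape is exponential with mean scale**shape
        mle = np.mean(x ** shape) ** (1.0 / shape)
        # shrink toward the prior guess with weight k (illustrative)
        shrink = k * mle + (1.0 - k) * prior_guess
        plain_err += (mle - scale) ** 2
        shrink_err += (shrink - scale) ** 2
    return plain_err / reps, shrink_err / reps

mse_mle, mse_shrink = mse_comparison(n=10)
print(f"MSE(MLE)={mse_mle:.4f}  MSE(shrinkage)={mse_shrink:.4f}")
```

As the abstract's conclusion suggests, the shrinkage estimator wins (smaller MSE) when the prior guess is close to the true scale; with a poor prior guess the bias term can dominate and reverse the ranking.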
This paper presents a novel method for solving nonlinear optimal control problems of the regular type via their equivalent two-point boundary value problems using the non-classical
The study examined three types of orange juice of different origins, with replicate measurements per sample. The results showed that the highest acidity (pH) value, recorded in juice A, was 4. Calcium salts reached 120 ppm in juice C and magnesium 86 ppm in juice B. For heavy metals, the highest levels recorded were 0.18 ppm of lead in juice B, 1.32 ppm of copper in juice A, 5 ppm of iron in juice B, 1.3 ppm of zinc in juice B, 0.05 ppm of aluminum in each of juices B and A, 0.02 ppm of cobalt in juice B, 0.3 ppm of nickel in juice B, and 170.6 ppm of sodium in juice C. As for the organic acids, the highest levels were 3.2 ppm of acid in juice A and 260 ppm of ascorbic acid in the juice
The aim of this study was to develop a sensor based on carbon paste electrodes (CPEs) modified with a molecularly imprinted polymer (MIP) for the determination of organophosphorus pesticides (OPPs). The modified electrode exhibited significantly increased sensitivity and selectivity toward OPPs. The MIP was prepared by a thermo-polymerization method using N,N-diethylaminoethyl methacrylate (NNDAA) as the functional monomer and N,N-1,4-phenylenediacrylamide (NNPDA) as the cross-linker, with acetonitrile as the solvent and the OPPs as the template molecules. Three OPPs (diazinon, quinalphos and chlorpyrifos), widely used in the agricultural sector, were chosen as the template base analytes. The extraction efficiency of the imprinted polymers has been evaluated
The logistic regression model is regarded as one of the important regression models and has been among the most interesting subjects of recent studies, owing to its increasingly advanced role in statistical analysis.
Ordinary estimation methods fail when dealing with data that contain outlier values, which have an undesirable effect on the results.
Many dynamic processes in different sciences are described by models of differential equations. These models explain the change in the behavior of the studied process over time by linking that behavior to its derivatives. They often contain constant and time-varying parameters that vary according to the nature of the process under study. In this work we estimate the constant and time-varying parameters sequentially, in several stages. In the first stage, the state variables and their derivatives are estimated with penalized splines (P-splines). In the second stage, we use pseudo least squares to estimate the constant parameters. For the third stage, the rem
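The first-stage smoothing step can be illustrated with a minimal penalized-spline fit. This sketch uses a cubic truncated-power basis with a ridge penalty on the knot coefficients, which is one simple P-spline variant; the basis size, penalty weight, and test signal are all illustrative assumptions, not the paper's actual setup:

```python
import numpy as np

def pspline_fit(x, y, n_knots=10, lam=0.01):
    """Penalized-spline smoother: cubic truncated-power basis with a
    ridge penalty applied only to the knot (truncated-power) coefficients."""
    knots = np.linspace(x.min(), x.max(), n_knots + 2)[1:-1]
    B = np.column_stack([np.ones_like(x), x, x**2, x**3]
                        + [np.clip(x - k, 0, None) ** 3 for k in knots])
    # leave the global cubic polynomial unpenalized; penalize knot terms
    D = np.diag([0.0] * 4 + [1.0] * len(knots))
    coef = np.linalg.solve(B.T @ B + lam * D, B.T @ y)
    return B @ coef, coef

# hypothetical noisy state trajectory to smooth
rng = np.random.default_rng(1)
x = np.linspace(0, 1, 200)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, x.size)
fitted, _ = pspline_fit(x, y)
print("residual SD:", np.std(y - fitted))
```

Once the basis coefficients are in hand, derivatives of the state (needed in the later stages) follow by differentiating the basis functions analytically rather than differencing the noisy data.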
The Generalized Additive Model (GAM) has been considered a multivariate smoother that appeared recently in nonparametric regression analysis. This research is devoted to the mixed situation, i.e. phenomena whose behaviour changes from linear (with known functional form), represented in the parametric part, to nonlinear (with unknown functional form; here, a smoothing spline), represented in the nonparametric part of the model. Furthermore, we propose a robust semiparametric GAM estimator, which is compared with two other existing techniques.
The Digital Elevation Model (DEM) is one of the developed techniques for relief representation. A DEM is constructed by modeling the earth's surface from existing data. DEMs serve as one of the fundamental information requirements and are widely utilized in GIS data structures. The main aim of this research is to present a methodology for assessing DEM generation methods. The DEM data will be extracted from open-source data, e.g. Google Earth, and the tested data will be compared with data produced by formal institutions such as the General Directorate of Surveying. The study area was chosen in the south of Iraq (Al-Gharraf, Dhi Qar governorate). The DEM creation methods are kriging, IDW (inverse distance weighting
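Of the interpolation methods named above, IDW is the simplest to state: each unknown elevation is a weighted average of the known points, with weights decaying as an inverse power of distance. A minimal sketch (the sample coordinates and elevations are hypothetical, and the power exponent 2 is just the common default):

```python
import numpy as np

def idw_interpolate(xy_known, z_known, xy_query, power=2.0):
    """Inverse Distance Weighting: each query elevation is a weighted
    average of known elevations, weights proportional to 1/distance**power."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    d = np.where(d == 0, 1e-12, d)          # guard against exact hits
    w = 1.0 / d ** power
    return (w * z_known).sum(axis=1) / w.sum(axis=1)

# hypothetical spot elevations (metres) at four grid corners
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
z = np.array([10.0, 12.0, 11.0, 15.0])
query = np.array([[0.5, 0.5]])
print(idw_interpolate(pts, z, query))  # centre is equidistant -> prints [12.]
```

Unlike kriging, IDW ignores the spatial covariance structure of the data, which is one reason DEM assessment studies compare the two against independent ground truth.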
To obtain good estimates with more accurate results, we must choose the appropriate method of estimation. Most of the equations in classical methods are nonlinear, and finding analytical solutions to such equations is very difficult; some estimators are inefficient because of problems in solving these equations. In this paper, we estimate the survival function of censored data using one of the most important artificial intelligence algorithms, the genetic algorithm, to obtain optimal estimates of the parameters of the two-parameter Weibull distribution. This leads to optimal estimates of the survival function. The genetic algorithm is employed in the method of moments, the least squares method and the weighted
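A toy version of the least-squares variant can show the idea: evolve candidate (shape, scale) pairs to minimize the squared distance between the empirical CDF and the Weibull CDF, then read the survival function off the winner. This is a deliberately simplified GA (elitist selection plus Gaussian mutation, no crossover, no censoring), with all population sizes and rates chosen arbitrarily; it is not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(2)

def weibull_cdf(x, shape, scale):
    return 1.0 - np.exp(-(x / scale) ** shape)

def ga_fit_weibull(x, pop_size=60, gens=120):
    """Toy genetic algorithm: evolve (shape, scale) pairs to minimize the
    least-squares distance between empirical and Weibull CDFs."""
    xs = np.sort(x)
    ecdf = (np.arange(1, xs.size + 1) - 0.5) / xs.size   # median-rank style
    def fitness(p):
        return np.sum((weibull_cdf(xs, p[0], p[1]) - ecdf) ** 2)
    pop = rng.uniform(0.1, 5.0, size=(pop_size, 2))      # random initial pairs
    for _ in range(gens):
        scores = np.array([fitness(p) for p in pop])
        parents = pop[np.argsort(scores)[: pop_size // 2]]        # keep best half
        kids = parents[rng.integers(0, len(parents), pop_size - len(parents))]
        kids = np.clip(kids + rng.normal(0, 0.05, kids.shape), 0.01, 10.0)
        pop = np.vstack([parents, kids])                 # elitist replacement
    return pop[np.argmin([fitness(p) for p in pop])]

x = 2.0 * rng.weibull(1.5, size=300)   # synthetic data: shape 1.5, scale 2.0
shape_hat, scale_hat = ga_fit_weibull(x)
print(f"shape≈{shape_hat:.2f}  scale≈{scale_hat:.2f}")
# survival function estimate: S(t) = exp(-(t/scale_hat)**shape_hat)
```

Handling censored observations, as the abstract describes, would change the fitness function (e.g. fitting a Kaplan-Meier estimate instead of the plain ECDF) but not the evolutionary loop.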
Goodness-of-fit tests are intended to verify the null hypothesis that the observations of a sample under study conform to a specified probability distribution. Such situations arise frequently in practical applications, in all fields and particularly in genetics research, medical research and the life sciences. In 1965, Shapiro and Wilk proposed the intuitive goodness-of-fit test with scale parameters
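Since the Shapiro-Wilk test is available in standard libraries, its use is a one-liner. A quick illustration on simulated data (SciPy's `stats.shapiro` is assumed available; the samples and seed are arbitrary):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Shapiro-Wilk tests H0: the sample was drawn from a normal distribution
normal_sample = rng.normal(loc=5.0, scale=2.0, size=50)
skewed_sample = rng.exponential(scale=2.0, size=50)   # clearly non-normal

w1, p1 = stats.shapiro(normal_sample)
w2, p2 = stats.shapiro(skewed_sample)
print(f"normal sample: W={w1:.3f}, p={p1:.3f}")
print(f"skewed sample: W={w2:.3f}, p={p2:.3f}")
```

The W statistic lies in (0, 1], with values near 1 consistent with normality; for the exponential sample the small p-value leads to rejecting the null hypothesis.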