This paper defines the Burr-XII distribution and shows how to obtain its p.d.f. and CDF. Burr-XII is a failure distribution that arises as a compound of two failure models, the Gamma model and the Weibull model. Some equipment has many important parts whose lifetimes may follow probability distributions of different types, so the Burr family, through its various compound forms, is a suitable model to study, and its parameters are estimated in order to compute the mean time to failure. Burr-XII is considered here rather than other models because it is used to model a wide variety of phenomena, including crop prices, household income, option market price distributions, risk, and travel time. It has two shape parameters (α, r) and one scale parameter (λ), which is considered known. The paper defines the p.d.f. and CDF, derives the moments about the origin, and derives the moment estimators of the two shape parameters (α, r) in addition to the maximum likelihood estimators and the percentile estimators; the scale parameter (λ) is not estimated, since it is assumed known. The three methods are compared through a simulation procedure using different sample sizes (n = 30, 60, 90) and different sets of initial values for (α, r, λ). It is observed that the moment estimators are the best estimators, with percentages of (46%) and (42%) respectively, compared with the other estimators.
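For reference, a standard parameterisation of the Burr-XII quantities mentioned above is sketched below, using the abstract's notation (shape parameters α and r, known scale λ); the exact forms derived in the paper may differ.

f(x; \alpha, r, \lambda) = \frac{\alpha r}{\lambda}\left(\frac{x}{\lambda}\right)^{\alpha-1}\left[1+\left(\frac{x}{\lambda}\right)^{\alpha}\right]^{-(r+1)}, \qquad x > 0,

F(x; \alpha, r, \lambda) = 1 - \left[1+\left(\frac{x}{\lambda}\right)^{\alpha}\right]^{-r},

E[X^{m}] = \lambda^{m}\,\frac{\Gamma\!\left(1+\tfrac{m}{\alpha}\right)\Gamma\!\left(r-\tfrac{m}{\alpha}\right)}{\Gamma(r)}, \qquad \alpha r > m.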
In this paper, we derive estimators of the parameters, the reliability function, and the hazard function of a new mixture distribution (Rayleigh-Logarithmic) with two parameters and an increasing failure rate, using the Bayes method with a squared error loss function, a Jeffreys prior, and the conditional probability of the observed random variable. The main objective of this study is to assess the efficiency of the derived Bayesian estimator compared to the maximum likelihood estimator of these functions, using Monte Carlo simulation under different Rayleigh-Logarithmic parameter values and sample sizes. The results show that the Bayes estimator is more efficient than the maximum likelihood estimator for all sample sizes, with an application.
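The Rayleigh-Logarithmic mixture itself is specific to the paper, so the sketch below only illustrates the general comparison procedure on the plain Rayleigh component: the MLE of the Rayleigh scale parameter θ = σ² versus the posterior mean under a Jeffreys prior and squared error loss, compared by Monte Carlo MSE. The function names, sample sizes, and the simplification to a single Rayleigh parameter are all assumptions, not the paper's setup.

```python
import numpy as np

def simulate_mse(sigma=1.0, n=30, reps=5000, seed=0):
    """Monte Carlo MSE of the MLE vs. the Jeffreys-prior Bayes estimator
    (posterior mean, i.e. the squared-error-loss rule) of theta = sigma^2
    for a plain Rayleigh sample."""
    rng = np.random.default_rng(seed)
    theta = sigma ** 2
    mle_err, bayes_err = [], []
    for _ in range(reps):
        x = rng.rayleigh(scale=sigma, size=n)
        s = np.sum(x ** 2)
        theta_mle = s / (2 * n)            # maximum likelihood estimator
        theta_bayes = (s / 2) / (n - 1)    # posterior mean under the Jeffreys prior 1/theta
        mle_err.append((theta_mle - theta) ** 2)
        bayes_err.append((theta_bayes - theta) ** 2)
    return np.mean(mle_err), np.mean(bayes_err)

for n in (30, 60, 90):
    print(n, simulate_mse(n=n))
```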
The penalized least squares method is a popular method for dealing with high-dimensional data, where the number of explanatory variables is larger than the sample size. The advantages of the penalized least squares method are high prediction accuracy and performing estimation and variable selection at once. The penalized least squares method gives a sparse model, meaning a model with few variables, which can therefore be interpreted easily. However, penalized least squares is not robust: it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain a robust penalized least squares method and a robust penalized estimator.
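As an illustration of the idea (not the estimator derived in the paper), the sketch below fits a penalized regression in which the squared-error loss is replaced by the Huber loss, with an L1 penalty to keep the model sparse; the objective is minimized numerically with SciPy. The penalty level, the Huber threshold, and the simulated data are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def huber(r, delta=1.345):
    """Huber loss: quadratic for small residuals, linear for large (outlying) ones."""
    a = np.abs(r)
    return np.where(a <= delta, 0.5 * r ** 2, delta * (a - 0.5 * delta))

def robust_penalized_fit(X, y, lam=0.1, delta=1.345):
    """Minimize sum(huber(y - X @ beta)) + lam * ||beta||_1 (a robust lasso-type fit)."""
    def objective(beta):
        return np.sum(huber(y - X @ beta, delta)) + lam * np.sum(np.abs(beta))
    beta0 = np.zeros(X.shape[1])
    # derivative-free search, since the L1 penalty is not differentiable at zero
    res = minimize(objective, beta0, method="Powell")
    return res.x

# toy data with a few outliers in the response
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 8))
beta_true = np.array([2.0, -1.5, 0, 0, 1.0, 0, 0, 0])
y = X @ beta_true + rng.normal(scale=0.5, size=50)
y[:3] += 15                       # outlying observations
print(np.round(robust_penalized_fit(X, y), 2))
```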
In this paper, a comparison is made between the tree regression model and the negative binomial regression model. These models represent two types of statistical methods: the first is a nonparametric method, tree regression, which aims to divide the data set into subgroups; the second is a parametric method, negative binomial regression, which is usually used when dealing with medical data, especially with large sample sizes. The methods are compared according to the mean squared error (MSE), using a simulation experiment and taking different sample sizes.
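A minimal sketch of such a comparison, assuming simulated negative binomial counts, scikit-learn's DecisionTreeRegressor as the tree model, a statsmodels negative binomial GLM as the parametric model, and held-out MSE as the criterion; none of these specific choices is taken from the paper.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n = 300
X = rng.normal(size=(n, 3))
mu = np.exp(0.5 + 0.8 * X[:, 0] - 0.4 * X[:, 1])          # true mean of the counts
alpha = 0.7                                               # NB dispersion (assumed)
y = rng.negative_binomial(n=1 / alpha, p=1 / (1 + alpha * mu))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# parametric model: negative binomial GLM
nb = sm.GLM(y_tr, sm.add_constant(X_tr),
            family=sm.families.NegativeBinomial(alpha=alpha)).fit()
pred_nb = nb.predict(sm.add_constant(X_te))

# nonparametric model: regression tree
tree = DecisionTreeRegressor(max_depth=4, random_state=0).fit(X_tr, y_tr)
pred_tree = tree.predict(X_te)

print("MSE (neg. binomial GLM):", mean_squared_error(y_te, pred_nb))
print("MSE (regression tree):  ", mean_squared_error(y_te, pred_tree))
```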
The problem of multicollinearity is one of the most common problems; it concerns, to a large extent, the internal correlation between the explanatory variables. This problem appears especially in economics and applied research. Multicollinearity has a negative effect on the regression model, such as inflated variances and unstable parameter estimates when the ordinary least squares (OLS) method is used. Therefore, other methods were used to estimate the parameters of the negative binomial model, including the ridge regression estimator and the Liu-type estimator. The negative binomial regression model is a nonlinear model.
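For reference, the classical ridge and Liu estimators for a linear regression model are shown below; the paper adapts these ideas to the negative binomial setting, and the exact forms used there may differ.

\hat{\beta}_{OLS} = (X'X)^{-1}X'y, \qquad
\hat{\beta}_{Ridge}(k) = (X'X + kI)^{-1}X'y, \; k > 0, \qquad
\hat{\beta}_{Liu}(d) = (X'X + I)^{-1}(X'X + dI)\,\hat{\beta}_{OLS}, \; 0 < d < 1.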
Encryption of data is translating data into another shape or symbols, so that only people with access to the secret key or a password can read it. Encrypted data are generally referred to as cipher text, while unencrypted data are known as plain text. Entropy can be used as a measure of the number of bits needed to code the data of an image. As the pixel values within an image are distributed over more gray levels, the entropy increases. The aim of this research is to compare the CAST-128 method with a proposed adaptive key and the RSA encryption method for video frames, to determine the more accurate method with the highest entropy. The first method is achieved by applying CAST-128 with the proposed adaptive key.
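A minimal sketch of the entropy measure described above: the Shannon entropy of a frame, computed from its gray-level histogram, in bits per pixel. The frames here are random data purely for illustration, not the video frames used in the paper.

```python
import numpy as np

def shannon_entropy(frame):
    """Shannon entropy (bits per pixel) of an 8-bit gray-scale frame."""
    hist = np.bincount(frame.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]                      # ignore empty gray levels
    return -np.sum(p * np.log2(p))

# illustration: a nearly uniform (cipher-like) frame vs. a low-contrast frame
rng = np.random.default_rng(0)
cipher_like = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
low_contrast = rng.integers(100, 110, size=(256, 256), dtype=np.uint8)
print(shannon_entropy(cipher_like))   # close to the maximum of 8 bits
print(shannon_entropy(low_contrast))  # much lower
```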
Most of the Weibull models studied in the literature are appropriate for modelling a continuous random variable, where the variable takes real values over the interval [0, ∞). One of the newer topics in statistics concerns variables that take discrete values. The idea was first introduced by Nakagawa and Osaki, who proposed the discrete Weibull distribution with two shape parameters q and β, where 0 < q < 1 and β > 0. Weibull models for discrete random variables assume only non-negative integer values. Such models are useful, for example, for modelling the number of cycles to failure when components are subjected to cyclical loading. Discrete Weibull models can be obtained, for example, by discretising the continuous Weibull distribution.
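For reference, the Nakagawa-Osaki discrete Weibull distribution mentioned above has survival function and probability mass function

P(X \ge x) = q^{x^{\beta}}, \qquad P(X = x) = q^{x^{\beta}} - q^{(x+1)^{\beta}}, \qquad x = 0, 1, 2, \ldots, \; 0 < q < 1, \; \beta > 0.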
In this research, the focus was on estimating the parameters of the min-Gumbel distribution using the maximum likelihood method and the Bayes method. A genetic algorithm was employed in estimating the parameters for both the maximum likelihood method and the Bayes method. The comparison was made using the mean squared error (MSE), where the best estimator is the one with the smallest mean squared error. It was noted that the best estimator was (BLG_GE).
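As an illustration of fitting the min-Gumbel distribution by numerically maximizing the likelihood with an evolutionary search, the sketch below uses SciPy's differential evolution (a related population-based optimizer, not necessarily the genetic algorithm used in the paper) and scipy.stats.gumbel_l for the density of minima; the parameter bounds and the simulated data are assumptions.

```python
import numpy as np
from scipy.stats import gumbel_l
from scipy.optimize import differential_evolution

# simulated min-Gumbel sample with location 5 and scale 2 (illustrative values)
x = gumbel_l.rvs(loc=5.0, scale=2.0, size=200, random_state=0)

def neg_log_likelihood(params):
    """Negative log-likelihood of the min-Gumbel (gumbel_l) sample."""
    mu, beta = params
    if beta <= 0:
        return np.inf
    return -np.sum(gumbel_l.logpdf(x, loc=mu, scale=beta))

# evolutionary search over a plausible parameter box (an assumption)
result = differential_evolution(neg_log_likelihood,
                                bounds=[(-10, 20), (0.01, 10)], seed=0)
print("estimated (mu, beta):", np.round(result.x, 3))
```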
The phenomenon of extreme values (maximum or rare values) is an important phenomenon. Two sampling techniques are used to deal with it: the peaks-over-threshold (POT) technique and the annual maximum (AM) technique, with the extreme value (Gumbel) distributions for the AM sample and the (generalized Pareto, exponential) distributions for the POT sample. The cross-entropy algorithm was applied in two of its methods: the first estimate uses order statistics, and the second uses order statistics and the likelihood ratio; a third method is proposed by the researcher. The estimated parameters and the probability density function of each distribution were compared using the MSE criterion.
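For reference, common parameterisations of the two families named above are the Gumbel CDF for annual maxima and the generalized Pareto CDF for threshold exceedances (the paper's parameterisation may differ):

F(x) = \exp\!\left(-e^{-(x-\mu)/\sigma}\right), \qquad x \in \mathbb{R}, \; \sigma > 0,

H(y) = 1 - \left(1 + \frac{\xi y}{\sigma}\right)^{-1/\xi}, \qquad y > 0, \; \sigma > 0,

with H(y) = 1 - e^{-y/\sigma} as the exponential limit when \xi \to 0.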
Picasso's ceramics represent a luminous landmark in ceramic art; they gave ceramic art an aesthetic dimension and placed it in new prospects, as forms that are simple in appearance turn into magical images with multiple interpretations.
The research deliberately chose a purposive sample of (37) works, divided into four groups, as follows:
Flat shapes / plates or saucers / vases / modified vases.
Benefiting from indicators drawn from the literature, the samples were analysed within these groups to identify the systems governing the formation of the artworks:
(1) Picasso's ceramic work is the product of a deliberate process representing a capacity of technical and formal experience.
(2) The system of configuration in the ceramic artworks
Control charts are one of the scientific statistical tools used to control production. They always consist of three lines, a central line and upper and lower control lines, used to control the quality of production; based on the actual observations, they indicate whether the production process is under control or not. Sometimes the calculated control charts are not accurate or conclusive, so fuzzy control charts are used instead of the ordinary process control charts; this method is more sensitive, accurate, and economical, assisting the decision maker in controlling the operating system at an early stage. In this project, a data set will be used.
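As an illustration of the classical (crisp) chart that fuzzy control charts generalize, the sketch below computes a Shewhart X-bar chart's centre line and three-sigma control limits from subgroup data; the data, the subgroup size, and the simplified sigma estimate are assumptions, and the fuzzification step used in the paper is not shown.

```python
import numpy as np

def xbar_chart_limits(data):
    """Centre line and 3-sigma control limits of a Shewhart X-bar chart.
    `data` has one row per subgroup (samples taken at one time point)."""
    subgroup_means = data.mean(axis=1)
    center = subgroup_means.mean()
    # simplified estimate of the standard error of a subgroup mean
    # (the textbook chart would use s-bar/c4 or R-bar/d2 instead)
    sigma_within = data.std(axis=1, ddof=1).mean()
    se = sigma_within / np.sqrt(data.shape[1])
    return center, center + 3 * se, center - 3 * se

# illustrative data: 20 subgroups of 5 measurements each
rng = np.random.default_rng(0)
data = rng.normal(loc=10.0, scale=0.2, size=(20, 5))
center, ucl, lcl = xbar_chart_limits(data)
signals = (data.mean(axis=1) > ucl) | (data.mean(axis=1) < lcl)
print(f"CL={center:.3f}, UCL={ucl:.3f}, LCL={lcl:.3f}, out-of-control points={signals.sum()}")
```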