This paper defines the Burr-XII distribution and shows how to obtain its p.d.f. and CDF. Burr-XII is a failure distribution that arises as a compound of two failure models, the Gamma model and the Weibull model. Some equipment has many important parts whose probability distributions may be of different types, so the Burr distribution, through its different compound forms, is found to be the best model to study, with its parameters estimated in order to compute the mean time to failure. Burr-XII is considered here rather than other models because it is used to model a wide variety of phenomena, including crop prices, household income, option market price distributions, risk, and travel time. It has two shape parameters (α, r) and one scale parameter (λ), which is considered known. The paper therefore defines the p.d.f. and CDF, derives the formula for the moments about the origin, and derives the moment estimators of the two shape parameters (α, r), in addition to the maximum likelihood estimators and the percentile estimators; the scale parameter (λ) is not estimated, as it is considered known. The three methods are compared through a simulation procedure using different sample sizes (n = 30, 60, 90) and different sets of initial values for (α, r, λ). It is observed that the moment estimators are the best estimators, with percentages of (46%) and (42%) respectively, compared with the other estimators.
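For reference, under a standard Burr-XII parameterization with shape parameters (α, r) and known scale parameter λ (an assumption about notation; the paper's exact form may differ), the density, distribution function, and moments about the origin are

\[ f(x;\alpha,r,\lambda)=\frac{\alpha r}{\lambda}\left(\frac{x}{\lambda}\right)^{\alpha-1}\left[1+\left(\frac{x}{\lambda}\right)^{\alpha}\right]^{-(r+1)},\qquad x>0, \]
\[ F(x;\alpha,r,\lambda)=1-\left[1+\left(\frac{x}{\lambda}\right)^{\alpha}\right]^{-r},\qquad E\left[X^{m}\right]=\lambda^{m}\,\frac{\Gamma\!\left(1+\frac{m}{\alpha}\right)\Gamma\!\left(r-\frac{m}{\alpha}\right)}{\Gamma(r)},\quad \alpha r>m. \]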
This research discusses two important variables, privatization options and strategic analysis of the external environment. Its purpose is to weigh the privatization options and choose the alternative most appropriate to the external environment, and it aims to determine the privatization option most suitable for public contracting companies. The importance of the study lies in presenting the privatization of public companies as a strategy through which public sector organizations can transfer work practices and mechanisms from private sector organizations, contributing to improving the level of skills and developing the current and future level of performance.
Most companies use social media data for business. Sentiment analysis automatically gathers, analyses, and summarizes this type of data. Managing unstructured social media data is difficult, and noisy data is a challenge to sentiment analysis. Since over 50% of the sentiment analysis process is data pre-processing, processing big social media data is challenging too. If pre-processing is carried out correctly, data accuracy may improve. The sentiment analysis workflow is also highly dependent on it. Because no pre-processing technique works well in all situations or with all data sources, choosing the most important ones is crucial, and prioritization is an excellent technique for doing so. As one of many Multi-Criteria Decision Making
The acceptance sampling plans for the generalized exponential distribution, when the life test is truncated at a pre-determined time, are provided in this article. The two parameters (α, λ) (the shape and scale parameters) are estimated by LSE and WLSE, and the best estimators for various sample sizes are used to find the ratio of the true mean time to the pre-determined time and to find the smallest possible sample size required to ensure the producer's risk with a pre-fixed probability (1 - P*). The results of the estimation and of the sampling plans are provided in tables.
Keywords: Generalized Exponential Distribution, Acceptance Sampling Plan, Consumer's and Producer's Risks
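For context, the generalized exponential distribution of Gupta and Kundu has distribution function

\[ F(x;\alpha,\lambda)=\left(1-e^{-\lambda x}\right)^{\alpha},\qquad x>0, \]

and in a life test truncated at a pre-determined time t_0, a single sampling plan (n, c) accepts the lot with probability

\[ P_a=\sum_{i=0}^{c}\binom{n}{i}\,p^{i}(1-p)^{n-i},\qquad p=F(t_0;\alpha,\lambda). \]

This is the standard formulation for such plans; the article's exact design criteria and notation are assumptions here.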
The geotechnical characterization of the sites has been investigated by collecting borehole data from different sources. Using these data, grain size distribution curves have been developed to understand the particle size distribution of the alluvium present. These curves were further used for a preliminary assessment of liquefiable areas. From the geotechnical characterization, it has been observed that the soil profile at the two sites is dominated by sand and silty sand. The Seed and Idriss (1971) approach has been used to evaluate the liquefaction potential by determining the relation between the maximum ground acceleration (a_max/g) values due to an earthquake and the relative density of a sand deposit in the field. The results reveal that
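For reference, the simplified procedure of Seed and Idriss (1971) estimates the earthquake-induced cyclic stress ratio as

\[ \mathrm{CSR}=\frac{\tau_{av}}{\sigma'_{v}}=0.65\,\frac{a_{\max}}{g}\,\frac{\sigma_{v}}{\sigma'_{v}}\,r_{d}, \]

where σ_v and σ'_v are the total and effective vertical stresses and r_d is a depth-dependent stress reduction factor; whether the study applies exactly this form is an assumption based on the cited approach.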
In this paper, two meshless methods have been introduced to solve some nonlinear problems arising in engineering and applied sciences. These two methods are the operational matrix method with Bernstein polynomials and the operational matrix method with Chebyshev polynomials. They provide an approximate solution by converting the nonlinear differential equation into a system of nonlinear algebraic equations, which is solved by using
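As a rough illustration of the operational-matrix idea (not the authors' code), the sketch below expands the unknown in a truncated Chebyshev series, enforces a simple nonlinear ODE at collocation points, and solves the resulting nonlinear algebraic system; the test problem u' = -u^2, u(0) = 1 on [0, 1] (exact solution 1/(1+x)) is assumed purely for demonstration.

```python
# Minimal Chebyshev collocation sketch for a nonlinear ODE (illustrative only).
import numpy as np
from numpy.polynomial import chebyshev as C
from scipy.optimize import fsolve

N = 8                                        # number of Chebyshev coefficients
t = np.cos(np.pi * np.arange(N) / (N - 1))   # Chebyshev-Lobatto points on [-1, 1]

def residual(c):
    u  = C.chebval(t, c)                     # u at the collocation points
    du = 2.0 * C.chebval(t, C.chebder(c))    # u'; factor 2 from the map x = (t + 1)/2
    r = du + u**2                            # residual of u' + u^2 = 0
    r[-1] = C.chebval(-1.0, c) - 1.0         # replace one equation by u(0) = 1
    return r

c0 = np.zeros(N); c0[0] = 1.0                # initial guess u ≈ 1
coef = fsolve(residual, c0)

xs = np.linspace(0.0, 1.0, 5)
print(np.abs(C.chebval(2.0 * xs - 1.0, coef) - 1.0 / (1.0 + xs)).max())
```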
This paper presents a novel method for solving nonlinear optimal control problems of regular type via their equivalent two-point boundary value problems using the non-classical
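For context (a standard formulation that may differ from the paper's exact setup), minimizing \(\int_{t_0}^{t_f} L(x,u,t)\,dt\) subject to \(\dot{x}=f(x,u,t)\) with Hamiltonian \(H=L+\lambda^{\mathsf T}f\) leads, in the regular case where \(\partial H/\partial u=0\) determines the control, to the two-point boundary value problem

\[ \dot{x}=\frac{\partial H}{\partial \lambda},\qquad \dot{\lambda}=-\frac{\partial H}{\partial x},\qquad \frac{\partial H}{\partial u}=0, \]

with \(x(t_0)\) specified at the initial time and \(\lambda(t_f)\) fixed by the terminal conditions.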
In light of the developments in computer science and modern technologies, the impersonation crime rate has increased. Consequently, face recognition technology and biometric systems have been employed for security purposes in a variety of applications, including human-computer interaction, surveillance systems, etc. Building an advanced, sophisticated model to tackle impersonation-related crimes is essential. This study proposes classification Machine Learning (ML) and Deep Learning (DL) models, utilizing the Viola-Jones, Linear Discriminant Analysis (LDA), Mutual Information (MI), and Analysis of Variance (ANOVA) techniques. The two proposed facial classification systems are J48 with the LDA feature extraction method as input, and a one-dimensional
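A minimal sketch of the kind of pipeline described, with several stand-ins that are assumptions rather than the study's actual setup: scikit-learn's DecisionTreeClassifier (CART) replaces J48/C4.5, a synthetic dataset replaces the face images, and the ANOVA F-test (f_classif) is used for feature selection before LDA feature extraction.

```python
# Illustrative pipeline: ANOVA feature selection -> LDA extraction -> decision tree.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for extracted face features (400 samples, 200 features, 4 classes).
X, y = make_classification(n_samples=400, n_features=200, n_informative=20,
                           n_classes=4, random_state=0)

pipe = make_pipeline(
    SelectKBest(f_classif, k=50),            # ANOVA-based feature selection
    LinearDiscriminantAnalysis(),            # LDA feature extraction (at most n_classes-1 dims)
    DecisionTreeClassifier(random_state=0),  # CART as a stand-in for J48
)
print(cross_val_score(pipe, X, y, cv=5).mean())
```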
The design of sampling plans was, and still is, one of the most important subjects because it gives the lowest cost compared with other approaches; the statistical distribution of the lifetime should be known in order to obtain the best estimators of the sampling plan parameters and hence the best sampling plan.
The research deals with designing a sampling plan when the lifetime distribution follows the Logistic distribution with location and shape parameters; this information helps in obtaining the number of groups and the sample size associated with rejecting or accepting the lot.
Experimental results for simulated data show the least number of groups and the sample size needed to reject or accept the lot with a certain probability of
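A rough sketch (an assumed plan structure, not necessarily the paper's) of how the smallest number of groups can be searched for: each group holds r items, the test is truncated at time t_0, the lot is accepted when at most c items fail, and the failure probability of a single item comes from the logistic CDF.

```python
# Illustrative search for the minimum number of groups in a group sampling plan.
from math import comb, exp

def logistic_cdf(t, mu, s):
    """CDF of the logistic distribution with location mu and scale s."""
    return 1.0 / (1.0 + exp(-(t - mu) / s))

def accept_prob(p, g, r, c):
    """Probability of at most c failures among g*r items, each failing w.p. p."""
    n = g * r
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(c + 1))

def min_groups(p, r, c, beta, g_max=200):
    """Smallest number of groups g whose acceptance probability is <= beta."""
    for g in range(1, g_max + 1):
        if accept_prob(p, g, r, c) <= beta:
            return g
    return None

# Illustrative numbers only: truncation time, logistic parameters, group size, etc.
p0 = logistic_cdf(t=1.0, mu=1.5, s=0.5)
print(min_groups(p0, r=5, c=2, beta=0.10))
```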