In this paper, some Bayes estimators of the reliability function of the Gompertz distribution are derived based on a generalized weighted loss function. To gain a better understanding of the behaviour of the Bayesian estimators, a non-informative prior as well as an informative prior represented by an exponential distribution is considered. Monte Carlo simulation has been employed to compare the performance of the different estimates of the reliability function of the Gompertz distribution based on the integrated mean squared error. It was found that Bayes estimators with exponential prior information under the generalized weighted loss function were generally better than the estimators based on Jeffreys prior information.
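As a minimal sketch of the Bayesian setup described above, the snippet below computes a closed-form posterior-mean estimate of the Gompertz reliability R(t) = exp(-(b/c)(e^{ct} - 1)). It simplifies the paper's setting in two labelled ways: the shape parameter c is assumed known, and squared-error loss is used instead of the generalized weighted loss, which makes the exponential prior on the scale b conjugate (Gamma posterior). The failure times, c, and prior rate lam are illustrative values, not data from the paper.

```python
import math

def bayes_reliability(t, data, c, lam):
    """Posterior-mean Bayes estimate of Gompertz reliability
    R(t) = exp(-(b/c)(e^{ct} - 1)) with shape c known and an
    Exponential(lam) prior on the scale b.

    The likelihood in b is b^n * exp(-b * S) with
    S = sum((e^{c t_i} - 1)/c), so the posterior of b is
    Gamma(n + 1, lam + S), and E[exp(-b A)] is the Gamma moment
    generating function evaluated at -A: (beta/(beta + A))^alpha.
    """
    n = len(data)
    alpha = n + 1                                              # posterior shape
    beta = lam + sum((math.exp(c * ti) - 1.0) / c for ti in data)  # posterior rate
    A = (math.exp(c * t) - 1.0) / c
    return (beta / (beta + A)) ** alpha

# Hypothetical failure times and hyperparameters (illustration only).
times = [0.4, 0.9, 1.3, 2.1, 2.8]
r1 = bayes_reliability(1.0, times, c=0.5, lam=1.0)
r2 = bayes_reliability(2.0, times, c=0.5, lam=1.0)
```

Under the weighted loss family studied in the paper, the Bayes estimator would instead be a weighted posterior expectation, but the conjugate Gamma posterior above is the common starting point.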
This research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the downhill simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes and initial parameter values, and under different levels of contamination. The downhill simplex algorithm was found to be the best method for estimating the parameters, the probability function and the reliability function of the compound distribution in the cases of both natural and contaminated data.
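To make the downhill simplex (Nelder-Mead) idea concrete, here is a compact generic implementation with the standard reflection/expansion/contraction/shrink moves. It is a sketch of the algorithm itself, not the paper's code, and it is demonstrated on a toy quadratic rather than the compound exponential Weibull-Poisson likelihood; in practice one would pass the negative log-likelihood as `f`.

```python
def nelder_mead(f, x0, step=0.5, tol=1e-10, max_iter=1000):
    """Minimal downhill simplex (Nelder-Mead) minimiser in n dimensions,
    with standard coefficients: reflection 1, expansion 2,
    contraction 0.5, shrink 0.5."""
    n = len(x0)
    # Initial simplex: x0 plus one perturbed vertex per coordinate.
    simplex = [list(x0)]
    for i in range(n):
        v = list(x0)
        v[i] += step
        simplex.append(v)
    for _ in range(max_iter):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        if abs(f(worst) - f(best)) < tol:
            break
        # Centroid of all vertices except the worst.
        centroid = [sum(v[i] for v in simplex[:-1]) / n for i in range(n)]
        reflect = [centroid[i] + (centroid[i] - worst[i]) for i in range(n)]
        if f(reflect) < f(best):
            expand = [centroid[i] + 2 * (centroid[i] - worst[i]) for i in range(n)]
            simplex[-1] = expand if f(expand) < f(reflect) else reflect
        elif f(reflect) < f(simplex[-2]):
            simplex[-1] = reflect
        else:
            contract = [centroid[i] + 0.5 * (worst[i] - centroid[i]) for i in range(n)]
            if f(contract) < f(worst):
                simplex[-1] = contract
            else:  # Shrink every vertex halfway toward the best one.
                simplex = [best] + [[(b + u) / 2 for b, u in zip(best, v)]
                                    for v in simplex[1:]]
    simplex.sort(key=f)
    return simplex[0]

# Toy objective: minimum at (3, -1).
best = nelder_mead(lambda p: (p[0] - 3.0) ** 2 + (p[1] + 1.0) ** 2, [0.0, 0.0])
```

Because the method uses only function values, no derivatives, it is robust to the kind of likelihood-surface roughness that contaminated data produce, which is consistent with the comparison reported above.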
In this paper, the researcher suggests using the genetic algorithm method to estimate the parameters of the Wiener degradation model, which is based on the Wiener process, in order to estimate the reliability of high-efficiency products, since estimating their reliability with traditional techniques that depend only on product failure times is difficult. Monte Carlo simulation has been applied to demonstrate the efficiency of the proposed method in estimating the parameters, and it was compared with the maximum likelihood estimation method. The results showed that the genetic algorithm method is the best based on the AMSE comparison criterion, then the reliab
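The Wiener degradation model referred to above has a simple structure: over a step dt the degradation increment is Gaussian with mean mu*dt (drift) and variance sigma^2*dt (diffusion). The sketch below simulates one path and recovers the parameters with the closed-form maximum likelihood estimates, i.e. the baseline the genetic algorithm is compared against; the parameter values and seed are illustrative.

```python
import math
import random

def simulate_wiener(mu, sigma, dt, n_steps, seed=1):
    """Simulate one degradation path X_t with drift mu and diffusion sigma:
    independent increments distributed N(mu*dt, sigma^2*dt)."""
    rng = random.Random(seed)
    x, path = 0.0, [0.0]
    for _ in range(n_steps):
        x += mu * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        path.append(x)
    return path

def mle_wiener(path, dt):
    """Closed-form MLEs from one equally spaced path:
    mu_hat = (X_T - X_0) / T,
    sigma2_hat = sum((dX_i - mu_hat*dt)^2) / (n*dt)."""
    inc = [b - a for a, b in zip(path, path[1:])]
    n = len(inc)
    mu_hat = (path[-1] - path[0]) / (n * dt)
    s2_hat = sum((d - mu_hat * dt) ** 2 for d in inc) / (n * dt)
    return mu_hat, math.sqrt(s2_hat)

path = simulate_wiener(mu=2.0, sigma=0.5, dt=0.01, n_steps=5000)
mu_hat, sigma_hat = mle_wiener(path, 0.01)
```

A genetic algorithm would instead search the (mu, sigma) space by minimising the same negative log-likelihood, which can help when the likelihood has no closed-form maximiser, e.g. with random effects in the drift.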
Polarization is an important property of light, referring to the direction of the electric field oscillations. Polarization modulation plays an essential role in polarization-encoding quantum key distribution (QKD), where polarization is used to encode photons. In this work, visible-range polarizers with optimal dimensions based on resonant grating waveguides have been numerically designed and investigated using the COMSOL Multiphysics software. Two structures have been designed, namely a single-layer metasurface grating (SLMG) polarizer and an interlayer metasurface grating (ILMG) polarizer. Both structures demonstrated high extinction ratios, ~1.8·10³ and 8.68·10⁴, and bandwidths equal to 45 and 55 nm for th
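For readers more used to seeing polarizer performance quoted in decibels, the linear extinction ratios reported above convert with the standard 10·log10 rule; the conversion below uses the two values from the abstract.

```python
import math

def extinction_ratio_db(er_linear):
    """Convert a linear polarization extinction ratio
    (transmitted pass-state power / transmitted block-state power) to dB."""
    return 10.0 * math.log10(er_linear)

# Ratios reported for the two designed gratings (SLMG and ILMG).
slmg_db = round(extinction_ratio_db(1.8e3), 1)   # about 32.6 dB
ilmg_db = round(extinction_ratio_db(8.68e4), 1)  # about 49.4 dB
```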
A theoretical spectroscopic study of beryllium oxide (BeO) has been carried out on the Boltzmann distribution of the P, Q and R branches in the range 0 < J < 13 at a temperature of 4200 K for the (0-0) band of the electronic transitions B¹Σ⁺-A¹Π and B¹Σ⁺-X¹Σ⁺. The Boltzmann distributions of these branches reach their maxima at approximately equal J, while the values of the relative population differ. For the B¹Σ⁺-X¹Σ⁺ transition the branch lines extend towards lower wavenumbers because the (Bv′-Bv″) value is negative, i.e. Bv′ < Bv″. For B¹Σ⁺-A¹Π
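The Boltzmann distribution over rotational levels used above has the familiar form N_J ∝ (2J+1)·exp(-B·J(J+1)·hc/kT), with a maximum near J ≈ sqrt(kT/2hcB) - 1/2. The sketch below evaluates it at 4200 K using the second radiation constant hc/k = 1.4388 cm·K; the rotational constant B = 1.6 cm⁻¹ is a hypothetical round value, not the actual constant of the BeO states studied, which determines where the maximum falls.

```python
import math

def rel_population(J, B, T):
    """Relative rotational population ~ (2J+1) * exp(-B*J*(J+1)*hc/kT),
    with B in cm^-1, T in K, and hc/k = 1.4388 cm*K."""
    return (2 * J + 1) * math.exp(-B * J * (J + 1) * 1.4388 / T)

B, T = 1.6, 4200.0                      # B is an assumed illustrative value
pops = [rel_population(J, B, T) for J in range(60)]
J_peak = max(range(60), key=lambda J: pops[J])
```

The population first grows with J through the (2J+1) degeneracy factor and then is cut off by the exponential, producing the single maximum the abstract refers to.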
The preparation of identical independent photons is at the core of many quantum applications such as entanglement swapping and entangling processes. In this work, a Hong-Ou-Mandel experiment was performed to evaluate the degree of indistinguishability between independent photons generated from two independent weak coherent sources operating at 640 nm. The visibility was 46%, close to the theoretical limit of 50%. The implemented setup can be adopted in quantum key distribution experiments carried out with free space as the channel link, as all the devices and components used operate in the visible range of the electromagnetic spectrum.
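The visibility figure quoted above comes from comparing coincidence counts far from the Hong-Ou-Mandel dip (photons fully distinguishable) with counts at the dip; one common definition is V = (C_away - C_dip)/C_away. The count values below are illustrative numbers chosen to reproduce the reported 46%, not the experiment's data.

```python
def hom_visibility(c_away, c_dip):
    """Hong-Ou-Mandel dip visibility: relative drop in coincidence counts
    between the distinguishable case (c_away) and the dip minimum (c_dip)."""
    return (c_away - c_dip) / c_away

# Illustrative counts reproducing the 46% visibility reported above.
v = hom_visibility(1000, 540)  # 0.46
```

For two independent weak coherent (laser) sources the multi-photon components limit V to 50%, which is why 46% is described as close to the theoretical bound.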
Inventory, or inventories, are stocks of goods held for future use or sale. The demand for a product is the number of units that will need to be removed from inventory for use or sale during a specific period. If the demand in future periods can be predicted with considerable precision, it is reasonable to use an inventory rule that assumes all predictions are completely accurate. This is the case in which we say that demand is deterministic.
The timing of an order can be periodic (placing an order every fixed number of days) or perpetual (placing an order whenever the inventory declines to a fixed reorder level).
In this research we discuss how to formulate inv
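For the deterministic-demand case described above, the classical economic order quantity (EOQ) model gives the cost-minimising order size in closed form. The abstract is truncated before naming its model, so this is an illustrative instance, not necessarily the formulation the research develops; the parameter values are hypothetical.

```python
import math

def eoq(demand_rate, order_cost, holding_cost):
    """Economic order quantity Q* = sqrt(2*D*K/h) for deterministic demand:
    D units per period, fixed cost K per order, holding cost h
    per unit per period."""
    return math.sqrt(2.0 * demand_rate * order_cost / holding_cost)

def total_relevant_cost(demand_rate, order_cost, holding_cost):
    """Combined ordering + holding cost per period at Q*: sqrt(2*D*K*h)."""
    return math.sqrt(2.0 * demand_rate * order_cost * holding_cost)

# Hypothetical inputs: D = 1000 units/period, K = 50 per order, h = 4.
q_star = eoq(1000, 50, 4)
cost = total_relevant_cost(1000, 50, 4)
```

The EOQ balances the two costs the text mentions implicitly: ordering more often raises fixed ordering cost, ordering in larger batches raises holding cost, and Q* is where the two marginal costs are equal.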
In this study, we investigate the run-length properties of the cumulative sum (CUSUM) and exponentially weighted moving average (EWMA) control charts for detecting positive shifts in the process mean for the Poisson distribution with unknown mean. We used the Markov chain approach to compute the average run length and its standard deviation for the CUSUM and EWMA control charts when the variable under control follows a Poisson distribution. We also used the CUSUM and EWMA control charts to monitor a process mean when the observations (products selected from the Al_Mamun Factory) are identically and independently distributed (iid) from a Poisson distribution i
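The two chart statistics referred to above are simple recursions: the one-sided upper CUSUM accumulates S_i = max(0, S_{i-1} + x_i - k) and signals when S_i ≥ h, while the EWMA smooths Z_i = λ·x_i + (1-λ)·Z_{i-1}. The sketch below implements both on a short illustrative count sequence with an upward shift; the reference value k, decision limit h, and smoothing constant λ are hypothetical design constants, and the Markov chain ARL computation itself is not reproduced.

```python
def cusum_run_length(xs, k, h):
    """One-sided upper CUSUM S_i = max(0, S_{i-1} + x_i - k).
    Returns the 1-based index of the first signal (S_i >= h), or None."""
    s = 0.0
    for i, x in enumerate(xs, start=1):
        s = max(0.0, s + x - k)
        if s >= h:
            return i
    return None

def ewma(xs, lam, z0):
    """EWMA statistic Z_i = lam*x_i + (1 - lam)*Z_{i-1}, started at z0."""
    z, out = z0, []
    for x in xs:
        z = lam * x + (1.0 - lam) * z
        out.append(z)
    return out

# Illustrative Poisson-like counts: in control around mean 2, then shifted up.
counts = [2, 1, 2, 2, 5, 6, 5, 6]
rl = cusum_run_length(counts, k=3.0, h=5.0)   # signals shortly after the shift
zs = ewma(counts, lam=0.2, z0=2.0)
```

The Markov chain approach mentioned in the abstract discretises the possible values of S_i (or Z_i) into states and reads the average run length off the fundamental matrix of the resulting absorbing chain.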
Discriminant analysis is a technique used to distinguish and classify an individual into one of a number of groups based on a linear combination of a set of relevant variables known as the discriminant function. In this research, discriminant analysis is used to analyze data from a repeated measurements design. We deal with the problem of discrimination and classification in the case of two groups by assuming a compound symmetry covariance structure under the assumption of normality for univariate repeated measures data.
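A minimal sketch of the two-group linear discriminant rule described above, for two-dimensional observations: w = S_pooled⁻¹(m₁ - m₂), classifying x to group 1 when w·(x - midpoint) > 0. This uses a generic pooled covariance estimate rather than the paper's compound symmetry structure for repeated measures, and the sample points are invented for illustration.

```python
def fisher_discriminant(g1, g2):
    """Build a two-group linear classifier for 2-D points:
    w = S_pooled^-1 (m1 - m2); assign group 1 if w . (x - midpoint) > 0."""
    def mean(g):
        n = len(g)
        return [sum(p[0] for p in g) / n, sum(p[1] for p in g) / n]
    m1, m2 = mean(g1), mean(g2)
    # Pooled within-group covariance matrix (2x2).
    s = [[0.0, 0.0], [0.0, 0.0]]
    for g, m in ((g1, m1), (g2, m2)):
        for p in g:
            d = [p[0] - m[0], p[1] - m[1]]
            for i in range(2):
                for j in range(2):
                    s[i][j] += d[i] * d[j]
    dof = len(g1) + len(g2) - 2
    s = [[v / dof for v in row] for row in s]
    # Invert the 2x2 matrix directly.
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    inv = [[s[1][1] / det, -s[0][1] / det],
           [-s[1][0] / det, s[0][0] / det]]
    dm = [m1[0] - m2[0], m1[1] - m2[1]]
    w = [inv[0][0] * dm[0] + inv[0][1] * dm[1],
         inv[1][0] * dm[0] + inv[1][1] * dm[1]]
    mid = [(m1[0] + m2[0]) / 2, (m1[1] + m2[1]) / 2]
    def classify(x):
        return 1 if w[0] * (x[0] - mid[0]) + w[1] * (x[1] - mid[1]) > 0 else 2
    return classify

# Invented sample points for two well-separated groups.
g1 = [(4.0, 5.0), (5.0, 6.0), (6.0, 5.5), (5.5, 4.5)]
g2 = [(1.0, 1.5), (2.0, 1.0), (1.5, 2.5), (2.5, 2.0)]
classify = fisher_discriminant(g1, g2)
```

Under compound symmetry the pooled covariance would be constrained to equal variances and a common correlation across the repeated measures, which reduces the number of parameters to estimate.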
This study aims to investigate the backscattering electron coefficient of a SiₓGe₁₋ₓ/Si heterostructure sample as a function of the primary electron beam energy (0.25-20 keV) and the Ge concentration in the alloy. The results obtained show the following characteristics. First, the intensity of the backscattered signal above the alloy is mainly related to the average atomic number of the SiₓGe₁₋ₓ alloy. Second, the backscattering electron coefficient line scan shows a constant value above each layer at low primary electron energies below 5 keV. However, at 5 keV and above, a peak and a dip appear on the line scan above the Si-Ge alloy and Si, respectively, close to the interfacing line
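The dependence of backscattering on the alloy's average atomic number noted above can be illustrated with Reuter's empirical Z-only fit for the backscattering coefficient, η(Z) = -0.0254 + 0.016Z - 1.86·10⁻⁴Z² + 8.3·10⁻⁷Z³. This fit is an assumption brought in for illustration, and it ignores the beam-energy dependence that the study itself examines below 5 keV; the mean-Z rule for the alloy is a simple composition-weighted average.

```python
def backscatter_coefficient(z):
    """Reuter's empirical fit (assumed here) for the electron backscattering
    coefficient at normal incidence as a function of atomic number Z:
    eta = -0.0254 + 0.016*Z - 1.86e-4*Z^2 + 8.3e-7*Z^3."""
    return -0.0254 + 0.016 * z - 1.86e-4 * z ** 2 + 8.3e-7 * z ** 3

def mean_z_sige(x):
    """Composition-weighted mean atomic number of a SixGe1-x alloy
    (Z_Si = 14, Z_Ge = 32)."""
    return x * 14 + (1 - x) * 32

# Backscattering rises monotonically from pure Si (x = 1) to pure Ge (x = 0).
etas = [backscatter_coefficient(mean_z_sige(x)) for x in (1.0, 0.5, 0.0)]
```

The fit reproduces the familiar values η ≈ 0.16 for Si and η ≈ 0.32 for Ge, so increasing Ge content brightens the backscattered-electron signal above the alloy, consistent with the first finding of the study.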