A non-parametric kernel method with bootstrap resampling was used to estimate confidence intervals for the system failure function of log-normally distributed data: the failure times of the machines in the spinning department of a weaving company in Wasit Governorate. The failure function was also estimated parametrically by the maximum likelihood estimator (MLE). The parametric and non-parametric methods were compared using the mean squared error (MSE) criterion. The bootstrap-based non-parametric methods proved more efficient than the parametric method, and their estimated curve was more realistic and better suited to the real data.
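The percentile-bootstrap idea behind the abstract can be sketched as follows. This is a minimal illustration on synthetic log-normal failure times (not the Wasit data), and it uses the plain empirical CDF as the non-parametric failure-function estimate in place of the kernel-smoothed version the study describes:

```python
import numpy as np

def bootstrap_failure_ci(times, t, n_boot=1000, alpha=0.05, seed=0):
    """Percentile-bootstrap CI for the failure function F(t) = P(T <= t),
    estimated non-parametrically (here: empirical CDF, a simplification
    of the kernel estimate used in the study)."""
    rng = np.random.default_rng(seed)
    times = np.asarray(times, dtype=float)
    point = np.mean(times <= t)                       # empirical CDF at t
    reps = np.empty(n_boot)
    for b in range(n_boot):
        sample = rng.choice(times, size=times.size, replace=True)
        reps[b] = np.mean(sample <= t)                # re-estimate on resample
    lo, hi = np.quantile(reps, [alpha / 2, 1 - alpha / 2])
    return point, lo, hi

# Illustrative log-normal failure times (synthetic, not the real data set)
rng = np.random.default_rng(1)
times = rng.lognormal(mean=2.0, sigma=0.5, size=200)
point, lo, hi = bootstrap_failure_ci(times, t=np.exp(2.0))
```

The percentile interval simply reads off the empirical quantiles of the bootstrap replicates, which is why no distributional assumption about the failure times is needed.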
This research studies the repeated failure of Annunciation No. 10 of the fourth stage of the carbon-dioxide compression unit at the Southern Fertilizers Company. A detailed study of the unit included a series of mechanical and structural property tests, in addition to optical and scanning electron microscope examinations. The study identified the failure parameters and revealed the existence of old internal cracks in the metal of the unit.
Permeability data are of major importance and should be handled carefully in all reservoir simulation studies. Their importance increases in mature oil and gas fields because of their sensitivity to the requirements of certain improved-recovery methods. However, the industry holds a huge stock of air permeability measurements against only a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation to convert the air permeability data that are conventionally measured during laboratory core analysis into liquid permeability. This correlation provides a feasible estimate in cases of data loss, for loose and poorly consolidated formations, or in cas
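The study's own correlation is not given in this excerpt, but the classic starting point for air-to-liquid permeability conversion is the Klinkenberg gas-slippage correction, sketched below; the slip factor `b_atm` and the input values are purely illustrative, not from the study:

```python
def klinkenberg_liquid_perm(k_air_md, p_mean_atm, b_atm):
    """Classic Klinkenberg correction: measured gas permeability exceeds
    liquid permeability because of gas slippage, k_air = k_liq * (1 + b/p_mean),
    so the equivalent liquid permeability is k_air / (1 + b/p_mean).
    The slip factor b is rock- and gas-specific; the value used here
    is illustrative only."""
    return k_air_md / (1.0 + b_atm / p_mean_atm)

# Hypothetical plug: 150 mD air permeability at 2 atm mean pressure
k_liq = klinkerberg = klinkenberg_liquid_perm(k_air_md=150.0, p_mean_atm=2.0, b_atm=0.8)
```

Note that the correction always reduces permeability, and its effect shrinks as the mean pore pressure rises, which is why slippage matters most for tight, low-pressure gas measurements.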
The purpose of this article is to reduce noise in a signal by studying wavelet transforms and showing how to apply the most effective of them for processing and analysis. We outline several transformation techniques, namely the lifting transform, the wavelet transform, and the discrete wavelet packet transform, along with the methodology for applying them to remove noise from a signal, where denoising proceeds according to the threshold value and the threshold functions. A comparison was made between them using the AMSE criterion, and the best was selected. When these techniques were applied to real data represented by prices, it became evident that the lifting transform performed best.
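The threshold-denoising step common to all the transforms above can be sketched with a single-level Haar wavelet transform and soft thresholding; a full scheme would iterate over several decomposition levels or use lifting or wavelet packets as the abstract describes, and the piecewise-constant "price" path here is synthetic:

```python
import numpy as np

def haar_soft_denoise(x):
    """Single-level Haar DWT + soft thresholding (minimal sketch of
    wavelet threshold denoising)."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)        # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)        # detail coefficients
    sigma = np.median(np.abs(d)) / 0.6745         # robust noise-level estimate
    thr = sigma * np.sqrt(2.0 * np.log(x.size))   # universal threshold
    d = np.sign(d) * np.maximum(np.abs(d) - thr, 0.0)  # soft thresholding
    y = np.empty_like(x)
    y[0::2] = (a + d) / np.sqrt(2.0)              # inverse Haar transform
    y[1::2] = (a - d) / np.sqrt(2.0)
    return y

rng = np.random.default_rng(0)
clean = np.repeat([0.0, 2.0, -1.0, 1.0], 64)      # synthetic piecewise signal
noisy = clean + rng.normal(0.0, 0.4, clean.size)
denoised = haar_soft_denoise(noisy)
```

Soft thresholding shrinks every detail coefficient toward zero by the threshold amount, which kills the (mostly noise-only) small coefficients while only slightly biasing the large ones that carry the signal's jumps.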
In this study, we compare the autoregressive approximation methods (Yule-Walker equations, least squares, forward-backward least squares, and Burg's geometric and harmonic methods) to determine the optimal approximation to time series generated from a first-order non-invertible moving-average process and from a fractionally integrated noise process, with several values of d (d = 0.15, 0.25, 0.35, 0.45) and different sample sizes (small, medium, large) for both processes. We rely on the figure-of-merit function proposed by Shibata in 1980 to determine the theoretical optimal order according to its minimum.
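The first of the compared methods, the Yule-Walker equations, can be sketched directly: estimate the sample autocovariances and solve the resulting Toeplitz linear system for the AR coefficients. The AR(2) series below is synthetic with made-up coefficients, purely to illustrate the method:

```python
import numpy as np

def yule_walker(x, p):
    """Estimate AR(p) coefficients by solving the Yule-Walker
    equations R @ phi = r built from sample autocovariances."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = x.size
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(p + 1)])
    R = np.array([[r[abs(i - j)] for j in range(p)]     # Toeplitz matrix
                  for i in range(p)])
    return np.linalg.solve(R, r[1:])

# Synthetic AR(2) series with known coefficients (illustration only)
rng = np.random.default_rng(0)
phi_true = np.array([0.5, -0.3])
x = np.zeros(5000)
for t in range(2, x.size):
    x[t] = phi_true[0] * x[t - 1] + phi_true[1] * x[t - 2] + rng.normal()
phi_hat = yule_walker(x, p=2)
```

The least-squares and Burg variants differ mainly in how they form and weight the prediction errors, but all reduce to solving for the same AR coefficients.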
A new approach for baud time (or baud rate) estimation of a random binary signal is presented. This approach utilizes the spectrum of the signal after nonlinear processing, so that the estimation error can be reduced simply by increasing the number of processed samples instead of increasing the sampling rate. The spectrum of the new signal is shown to give an accurate estimate of the baud time when there is no a priori information and no restrictive assumptions. The performance of the estimator for random binary square waves perturbed by white Gaussian noise and ISI is evaluated and compared with that of the conventional zero-crossing-detector estimator.
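One common nonlinearity that produces such a spectral line (a sketch of the general idea, not necessarily the paper's exact estimator) is rectifying the derivative of the waveform: the symbol transitions of an NRZ signal become an impulse train whose spectrum peaks at the baud rate. All signal parameters below are synthetic:

```python
import numpy as np

def estimate_baud_bin(x, smooth_len=8):
    """Find the spectral line created by symbol transitions.
    |diff(x)| of an NRZ waveform is an impulse train at symbol
    boundaries; a short boxcar smooth attenuates the higher
    harmonics so the fundamental (baud-rate) line dominates."""
    edges = np.abs(np.diff(x, prepend=x[0]))                  # transition impulses
    edges = np.convolve(edges, np.ones(smooth_len) / smooth_len, mode="same")
    spec = np.abs(np.fft.rfft(edges - edges.mean()))
    return int(np.argmax(spec[1:]) + 1)                       # skip the DC bin

# Random NRZ signal: 256 symbols, 16 samples per symbol (synthetic)
rng = np.random.default_rng(0)
sps = 16
bits = rng.integers(0, 2, 256) * 2.0 - 1.0
x = np.repeat(bits, sps)
peak = estimate_baud_bin(x)
sps_est = x.size / peak            # estimated samples per symbol
```

Because the line's location is read from an FFT over all processed samples, lengthening the record sharpens the estimate without touching the sampling rate, which is the property the abstract highlights.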
In this paper, we introduce three robust fuzzy estimators of a location parameter based on Buckley’s approach in the presence of outliers. These estimators were compared using the variance-of-fuzzy-numbers criterion, and all of them outperformed Buckley’s estimate. Among them, the fuzzy median was best for small and medium sample sizes, while the fuzzy trimmed mean was best for large sample sizes.
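The outlier resistance driving these results can be illustrated with the crisp (non-fuzzy) analogues of the compared estimators, the median and the trimmed mean, on synthetic contaminated data; the fuzzy versions apply the same idea to fuzzy numbers:

```python
import numpy as np

def trimmed_mean(x, prop=0.1):
    """Drop the lowest and highest `prop` fraction of the sorted
    sample before averaging (crisp analogue of the fuzzy trimmed mean)."""
    x = np.sort(np.asarray(x, dtype=float))
    k = int(prop * x.size)
    return x[k:x.size - k].mean()

rng = np.random.default_rng(0)
data = rng.normal(10.0, 1.0, 95)                   # bulk of the sample
data = np.concatenate([data, np.full(5, 100.0)])   # five gross outliers
est_mean = data.mean()
est_med = np.median(data)
est_trim = trimmed_mean(data)
```

The ordinary mean is dragged far from the true location by the outliers, while the median and trimmed mean stay close to it, which is exactly the robustness property the fuzzy estimators carry over.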
The aim of the research is to identify the percentage of success and failure of some compound offensive skills in junior basketball. Development occurs only through mastery of the basic single offensive skills together with the ability to perform compound skills accurately and consistently; insufficient attention to compound skills evidently weakens the athlete's level and in turn leads to mistakes in performance. Six junior games of the best four teams in Baghdad were filmed and analyzed. The analysis of the compound offensive skills showed some weakness in the athletes' use of compound offensive skills, especially receiving, dribbling, and follow-through.