Bayesian models are widely used in recent research across many scientific fields. This research presents a new Bayesian model for parameter estimation and forecasting using the Gibbs sampler algorithm. Posterior distributions are generated using the inverse gamma distribution and the multivariate normal distribution as prior distributions. The new method was used to investigate and summarize the posterior distribution in a Bayesian setting. The theory and derivation of the posterior distribution are explained in detail in this paper. The proposed approach is applied to three simulated datasets with sample sizes of 100, 300, and 500. The procedure was also extended to a real dataset, the rock intensity dataset, collected from the UCI Machine Learning Repository. The findings are discussed and summarized at the end. All calculations for this research were performed in R software (version 4.2.2). © 2024 Author(s).
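The abstract does not reproduce the paper's exact model, but the prior structure it names (multivariate normal on the coefficients, inverse gamma on the variance) is the standard conjugate setup for Bayesian linear regression. A minimal Gibbs sampler sketch under those assumed priors, with illustrative hyperparameters not taken from the paper, might look like:

```python
import numpy as np

def gibbs_linear_regression(X, y, tau2=10.0, a0=2.0, b0=1.0,
                            n_iter=2000, burn=500, seed=0):
    """Gibbs sampler for y = X @ beta + eps, eps ~ N(0, sigma2).

    Illustrative priors (not necessarily the paper's exact choices):
      beta   ~ N(0, tau2 * I)        multivariate normal
      sigma2 ~ Inverse-Gamma(a0, b0)
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    XtX, Xty = X.T @ X, X.T @ y
    beta, sigma2 = np.zeros(p), 1.0
    betas, sigmas = [], []
    for it in range(n_iter):
        # beta | sigma2, y ~ N(mu_n, Sigma_n): conjugate normal update
        Sigma_n = np.linalg.inv(XtX / sigma2 + np.eye(p) / tau2)
        mu_n = Sigma_n @ (Xty / sigma2)
        beta = rng.multivariate_normal(mu_n, Sigma_n)
        # sigma2 | beta, y ~ Inverse-Gamma(a0 + n/2, b0 + RSS/2):
        # draw a Gamma variate and invert it
        rss = np.sum((y - X @ beta) ** 2)
        sigma2 = 1.0 / rng.gamma(a0 + n / 2, 1.0 / (b0 + rss / 2))
        if it >= burn:
            betas.append(beta)
            sigmas.append(sigma2)
    return np.array(betas), np.array(sigmas)
```

With simulated data, the posterior means of the retained draws recover the generating coefficients, which is the kind of check the paper's simulation study performs at sample sizes 100, 300, and 500.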
This research aims to design a multi-objective mathematical model to assess project quality based on three criteria: time, cost, and performance. The model has been applied to one of the major projects of the Saad Public Company, enabling completion of the project on time, at an additional cost that remains within the estimated budget, and with a satisfactory level of performance matching consumer requirements. The research problem is to ensure that the project is completed with the required quality subject to constraints such as time, cost, and performance, which requires prioritizing multiple goals. The project …
Cloth simulation and animation have been topics of computer-graphics research since the mid-1980s. Enforcing incompressibility is very important in real-time simulation. Although there have been great achievements in this regard, it still suffers from unnecessary time consumption in certain steps common to real-time applications. This research develops a real-time cloth simulator for a virtual human character (VHC) with wearable clothing. It achieves success in cloth simulation on the VHC by enhancing the position-based dynamics (PBD) framework, computing a series of positional constraints that enforce constant densities. It also handles self-collision and collision wit…
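The paper's constant-density constraint is not reproduced in this abstract, but the core PBD mechanism it builds on is the projection of particle positions onto a constraint manifold. As an illustration of that mechanism only, the standard PBD distance-constraint projection (not the paper's density constraint) can be sketched as:

```python
import numpy as np

def project_distance_constraint(p1, p2, w1, w2, rest_len, stiffness=1.0):
    """One PBD projection step for the constraint C = |p1 - p2| - rest_len.

    p1, p2    : particle positions (3-vectors)
    w1, w2    : inverse masses (0 pins a particle in place)
    stiffness : fraction of the full correction applied per solver iteration
    Returns the corrected (p1, p2).
    """
    d = p1 - p2
    dist = np.linalg.norm(d)
    if dist < 1e-9 or (w1 + w2) == 0.0:
        return p1, p2                       # degenerate edge or both pinned
    n = d / dist                            # constraint gradient direction
    lam = (dist - rest_len) / (w1 + w2)     # violation scaled by total inverse mass
    p1 = p1 - stiffness * w1 * lam * n      # move each endpoint along the gradient,
    p2 = p2 + stiffness * w2 * lam * n      # weighted by its inverse mass
    return p1, p2
```

A PBD solver iterates projections like this over all constraints each frame; the paper's contribution replaces some of them with density-preserving positional constraints.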
The purpose of this research is to find an estimator of the average proportion of defectives based on attribute samples that have been curtailed, either with rejection of the lot on finding the kth defective or with acceptance on finding the kth non-defective.
The maximum likelihood estimator (MLE) is derived, as is the average sample number (ASN) for single curtailed sampling, for which a simplified formula is obtained. All required notation is explained.
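The paper's simplified ASN formula is not reproduced in this abstract. As an illustration of the sampling scheme it describes, the stopping rule and a Monte Carlo check of the ASN can be sketched as follows (the plug-in estimate d/n used here is an illustrative choice, not necessarily the paper's derived MLE form):

```python
import numpy as np

def curtailed_inspection(p, k_def, k_good, rng):
    """Inspect items one at a time; stop at the k_def-th defective (reject
    the lot) or the k_good-th non-defective (accept the lot).
    Returns (items inspected, defectives found)."""
    d = g = n = 0
    while d < k_def and g < k_good:
        n += 1
        if rng.random() < p:   # item is defective with probability p
            d += 1
        else:
            g += 1
    return n, d

def monte_carlo_asn(p, k_def, k_good, reps=20000, seed=0):
    """Monte Carlo estimate of the average sample number (ASN) and of the
    mean plug-in proportion estimate d/n across curtailed samples."""
    rng = np.random.default_rng(seed)
    ns, phats = [], []
    for _ in range(reps):
        n, d = curtailed_inspection(p, k_def, k_good, rng)
        ns.append(n)
        phats.append(d / n)
    return float(np.mean(ns)), float(np.mean(phats))
```

The simulated ASN always lies between min(k_def, k_good) and k_def + k_good - 1, the logical bounds of the curtailed plan, which gives a quick sanity check against any closed-form ASN expression.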
This research investigates the robustness of the Bayesian information criterion (BIC) for estimating the order of an autoregressive process when the model errors follow specific distributions, across different cases of the time series and various sample sizes, using simulation. The criterion was studied under ten distributions: normal, log-normal, continuous uniform, gamma, exponential, Gumbel, Cauchy, Poisson, binomial, and discrete uniform. Many conclusions and recommendations related to this subject were reached, in particular when the series residual variable follows the Poisson, binomial, exponential, or dis…
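The paper's simulation design is not detailed in the abstract, but the procedure it evaluates is standard: fit AR(p) models over a range of orders and pick the one minimizing BIC. A minimal sketch, using least-squares AR fits and the common BIC form n·log(RSS/n) + p·log(n) (an assumed variant; penalty conventions differ across texts):

```python
import numpy as np

def select_ar_order_bic(x, max_order=6):
    """Choose an AR order by BIC: for each candidate p, regress x[t] on
    its p lags by least squares and score n*log(RSS/n) + p*log(n)."""
    x = np.asarray(x, dtype=float)
    best_p, best_bic = 0, np.inf
    for p in range(1, max_order + 1):
        Y = x[p:]                                   # responses x[t], t >= p
        Z = np.column_stack(                        # lag matrix: column j is x[t-j]
            [x[p - j:len(x) - j] for j in range(1, p + 1)])
        coef, *_ = np.linalg.lstsq(Z, Y, rcond=None)
        rss = np.sum((Y - Z @ coef) ** 2)
        n = len(Y)
        bic = n * np.log(rss / n) + p * np.log(n)
        if bic < best_bic:
            best_p, best_bic = p, bic
    return best_p
```

Repeating this over many replicated series, with the innovations drawn from each of the ten distributions in turn, gives the per-distribution frequency of correct order selection that a robustness study of this kind reports.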
In this paper, some Bayes estimators of the reliability function of the Gompertz distribution have been derived based on a generalized weighted loss function. To better understand the behaviour of the Bayesian estimators, a non-informative prior as well as an informative prior represented by the exponential distribution is considered. Monte Carlo simulation has been employed to compare the performance of the different estimates of the reliability function of the Gompertz distribution based on integrated mean squared errors. It was found that Bayes estimators with exponential prior information under the generalized weighted loss function were generally better than the estimators based o…
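The abstract does not state which Gompertz parameterization the paper uses. Under one common parameterization (rate b > 0, shape c > 0), the reliability function being estimated and an inverse-CDF sampler for the Monte Carlo comparison can be sketched as:

```python
import numpy as np

def gompertz_reliability(t, b, c):
    """Reliability R(t) = exp(-(b/c) * (exp(c*t) - 1)) under one common
    Gompertz parameterization (rate b > 0, shape c > 0)."""
    return np.exp(-(b / c) * np.expm1(c * t))

def sample_gompertz(size, b, c, rng):
    """Inverse-CDF sampling: solving F(t) = u for t gives
    t = (1/c) * log(1 - (c/b) * log(1 - u))."""
    u = rng.random(size)
    return np.log1p(-(c / b) * np.log1p(-u)) / c
```

An empirical survival fraction from the sampler matches R(t), which is the kind of ground truth a Monte Carlo IMSE comparison of reliability estimators is computed against.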
Data scarcity is a major challenge when training deep learning (DL) models. DL demands large amounts of data to achieve exceptional performance; unfortunately, many applications have too little data to train DL frameworks. Manual labeling is usually needed to provide labeled data, which typically involves human annotators with extensive background knowledge. This annotation process is costly, time-consuming, and error-prone. Every DL framework is fed a significant amount of labeled data to learn representations automatically. Ultimately, more data generally yields a better DL model, though performance is also application-dependent. This issue is the main barrier for…
The question of estimation has attracted great interest in engineering, statistical applications, and various applied and human sciences; the methods it provides help to identify many random processes accurately.
In this paper, methods were used to estimate the reliability function, the hazard (risk) function, and the distribution parameters: the method of moments and the maximum likelihood method. An experimental study was conducted using simulation to compare the methods and show which of them is most competent in practical application, based on observations generated from the Rayleigh logarithmic (RL) distribution with various sample sizes…
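The RL distribution's estimating equations are not given in this abstract. As an illustration of the two estimation methods being compared, here are the moment and maximum likelihood estimators for the plain Rayleigh distribution (a simpler relative, not the paper's RL model), together with its reliability function:

```python
import numpy as np

def rayleigh_mle(x):
    """Maximum likelihood estimator of the Rayleigh scale:
    sigma_hat = sqrt(sum(x^2) / (2n))."""
    x = np.asarray(x, dtype=float)
    return np.sqrt(np.sum(x ** 2) / (2 * len(x)))

def rayleigh_moments(x):
    """Method-of-moments estimator: E[X] = sigma * sqrt(pi/2),
    so sigma_hat = mean(x) * sqrt(2/pi)."""
    return np.mean(np.asarray(x, dtype=float)) * np.sqrt(2.0 / np.pi)

def rayleigh_reliability(t, sigma):
    """Reliability (survival) function R(t) = exp(-t^2 / (2 sigma^2))."""
    return np.exp(-t ** 2 / (2 * sigma ** 2))
```

A simulation comparison like the paper's generates samples at each sample size, applies both estimators, and compares them by a criterion such as mean squared error of the estimated reliability.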
In this research we investigated the contents of the electronic cigarette (vape) and discussed the emergence of the phenomenon of electronic smoking (vaping). Although smoking is one of the oldest topics on which many articles and studies have been conducted, electronic smoking has not been studied through statistical scientific research. In this research we tried to define the concept of electronic smoking, sample the studied data, and deal with it in a scientific way. This research included a statistical analysis using factor analysis of a sample of size 70 taken randomly from some colleges in Bab Al-Muadham in Baghdad, where the KMO and Bartlett tests…
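The abstract is cut off before reporting the test results, but the two adequacy checks it names are standard preliminaries to factor analysis. A sketch of both, using the usual textbook formulas (Bartlett's chi-square approximation and the KMO ratio of squared correlations to squared correlations plus squared partial correlations):

```python
import numpy as np
from scipy import stats

def bartlett_sphericity(data):
    """Bartlett's test of sphericity: H0 is that the correlation matrix
    is the identity (variables are unrelated, factor analysis pointless).
    Returns (chi-square statistic, p-value)."""
    data = np.asarray(data, dtype=float)
    n, p = data.shape
    R = np.corrcoef(data, rowvar=False)
    chi2 = -(n - 1 - (2 * p + 5) / 6.0) * np.log(np.linalg.det(R))
    df = p * (p - 1) / 2.0
    return chi2, stats.chi2.sf(chi2, df)

def kmo(data):
    """Kaiser-Meyer-Olkin measure of sampling adequacy, computed from the
    correlations and the anti-image partial correlations."""
    R = np.corrcoef(np.asarray(data, dtype=float), rowvar=False)
    S = np.linalg.inv(R)
    d = np.sqrt(np.outer(np.diag(S), np.diag(S)))
    Q = -S / d                       # partial correlation matrix
    np.fill_diagonal(Q, 0.0)
    R0 = R - np.eye(R.shape[0])      # off-diagonal correlations only
    return np.sum(R0 ** 2) / (np.sum(R0 ** 2) + np.sum(Q ** 2))
```

By the usual rules of thumb, factor analysis proceeds when KMO exceeds roughly 0.5 and Bartlett's test rejects sphericity.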