Bayesian models are widely used in recent research across many scientific fields. This research presents a new Bayesian model for parameter estimation and forecasting based on the Gibbs sampler algorithm. Posterior distributions are generated using the inverse gamma distribution and the multivariate normal distribution as prior distributions. The new method was used to investigate and summarize the posterior distribution. The theory and derivation of the posterior distribution are explained in detail in this paper. The proposed approach is applied to three simulated datasets of sizes 100, 300, and 500, and the procedure is then extended to a real dataset, the rock intensity dataset, collected from the UCI Machine Learning Repository. The findings are discussed and summarized at the end. All calculations for this research were done using R software (version 4.2.2). © 2024 Author(s).
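To make the sampling scheme concrete, here is a minimal Python sketch of a Gibbs sampler for a conjugate Bayesian linear regression with a multivariate normal prior on the coefficients and an inverse-gamma prior on the error variance. The hyperparameters (a0, b0, tau2) and the zero-mean prior are illustrative assumptions, not the paper's exact specification (the paper's own computations are in R).

```python
# Minimal Gibbs-sampler sketch: multivariate normal prior on beta,
# inverse-gamma prior on sigma^2 (hypothetical setup, not the paper's model).
import numpy as np

def gibbs_linear_regression(X, y, n_iter=5000, a0=2.0, b0=1.0, tau2=100.0):
    n, p = X.shape
    sigma2 = 1.0                          # current draw of error variance
    XtX, Xty = X.T @ X, X.T @ y
    draws = {"beta": np.empty((n_iter, p)), "sigma2": np.empty(n_iter)}
    rng = np.random.default_rng(0)
    for t in range(n_iter):
        # beta | sigma2, y ~ Normal(m, V) under the N(0, tau2 * I) prior
        V = np.linalg.inv(XtX / sigma2 + np.eye(p) / tau2)
        m = V @ (Xty / sigma2)
        beta = rng.multivariate_normal(m, V)
        # sigma2 | beta, y ~ Inverse-Gamma(a0 + n/2, b0 + RSS/2)
        rss = np.sum((y - X @ beta) ** 2)
        sigma2 = 1.0 / rng.gamma(a0 + n / 2, 1.0 / (b0 + rss / 2))
        draws["beta"][t], draws["sigma2"][t] = beta, sigma2
    return draws
```

Each iteration alternates the two full-conditional draws; the posterior is summarized from the stored draws after discarding a burn-in portion.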
This research studies the Non-Homogeneous Poisson process, one of the most important statistical topics for scientific development because it models accidents that occur in reality as Poisson processes, the occurrence of such events being tied to time, whether time varies or remains stable. The research presents the Non-Homogeneous Poisson process and uses one model of these processes, the exponentiated-Weibull model with three parameters (α, β, σ), as an intensity function to estimate the rate of occurrence of earthquakes over time in Erbil Governorate, as the governorate is adjacent to two countries.
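As a sketch of how such a model can be used, the Python code below takes the hazard of an exponentiated-Weibull distribution with parameters (α, β, σ) as the event-rate (intensity) function of a non-homogeneous Poisson process and simulates event times by Lewis-Shedler thinning. Using the hazard as the intensity, and the user-supplied bound lam_max, are assumptions for illustration, not the paper's exact formulation.

```python
# NHPP sketch with an exponentiated-Weibull hazard as intensity (assumed form).
import numpy as np

def ew_cdf(t, alpha, beta, sigma):
    return (1.0 - np.exp(-(t / sigma) ** beta)) ** alpha

def ew_pdf(t, alpha, beta, sigma):
    g = 1.0 - np.exp(-(t / sigma) ** beta)        # inner Weibull CDF
    return (alpha * beta / sigma) * (t / sigma) ** (beta - 1) \
        * np.exp(-(t / sigma) ** beta) * g ** (alpha - 1)

def intensity(t, alpha, beta, sigma):
    return ew_pdf(t, alpha, beta, sigma) / (1.0 - ew_cdf(t, alpha, beta, sigma))

def simulate_nhpp(T, alpha, beta, sigma, lam_max, seed=0):
    """Lewis-Shedler thinning: event times on [0, T]; lam_max must bound the intensity."""
    rng = np.random.default_rng(seed)
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / lam_max)       # candidate from rate lam_max
        if t > T:
            return np.array(times)
        if rng.uniform() <= intensity(t, alpha, beta, sigma) / lam_max:
            times.append(t)                       # accept with prob λ(t)/λ_max
```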
Each phenomenon contains several variables. Studying these variables, we seek a mathematical formula for their joint distribution; the copula is a useful tool for measuring the amount of correlation. Here the survival function was used to measure the relationship between age and the level of creatinine remaining in a person's blood. The SPSS program was used to extract the influential variables from a group of variables by factor analysis, and the Clayton copula function, which builds joint bivariate distributions from the marginal distributions, was then applied: the bivariate distribution was computed, and the survival function value was calculated for a sample of size 50 drawn from Yarmouk Hospital.
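For illustration, here is a small Python sketch of a bivariate survival function built from a Clayton copula, C(u, v) = (u^(-θ) + v^(-θ) - 1)^(-1/θ) for θ > 0, applied to two marginal survival functions. The exponential marginals and the θ value below are stand-ins, not the study's fitted quantities.

```python
# Bivariate survival via a Clayton copula over marginal survival functions.
import numpy as np

def clayton_copula(u, v, theta):
    """Clayton copula C(u, v), valid for theta > 0."""
    return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

def joint_survival(x, y, surv_x, surv_y, theta):
    """Couple the marginal survivals S_X, S_Y into a joint survival S(x, y)."""
    return clayton_copula(surv_x(x), surv_y(y), theta)

# Hypothetical exponential marginals as stand-ins for the fitted ones:
S_age = lambda x: np.exp(-x / 60.0)       # age marginal (illustrative)
S_creat = lambda y: np.exp(-y / 1.2)      # creatinine marginal (illustrative)
print(joint_survival(50.0, 1.0, S_age, S_creat, theta=2.0))
```

For the Clayton family, θ relates to Kendall's tau via θ = 2τ/(1 − τ), which is one common way to calibrate the dependence strength.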
Face recognition is a crucial biometric technology used in various security and identification applications. Ensuring accuracy and reliability in facial recognition systems requires robust feature extraction and secure processing methods. This study presents an accurate facial recognition model using a feature extraction approach within a cloud environment. First, the facial images undergo preprocessing, including grayscale conversion, histogram equalization, Viola-Jones face detection, and resizing. Then, features are extracted using a hybrid approach that combines Linear Discriminant Analysis (LDA) and the Gray-Level Co-occurrence Matrix (GLCM). The extracted features are encrypted using the Data Encryption Standard (DES) for security.
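A hedged Python sketch of the GLCM half of the hybrid extractor using scikit-image follows; the distance and angle choices are illustrative, and the LDA projection and DES encryption stages are omitted.

```python
# GLCM texture features from a preprocessed grayscale face crop.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(gray_face, levels=256):
    """gray_face: 2-D uint8 array (grayscale, equalized, resized face)."""
    glcm = graycomatrix(gray_face,
                        distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=levels, symmetric=True, normed=True)
    props = ("contrast", "homogeneity", "energy", "correlation")
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

face = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # stand-in image
print(glcm_features(face).shape)   # 4 properties x 4 angles = 16 features
```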
This study estimates the accuracy of digital elevation models (DEMs) created by exploiting open-source Google Earth data, comparing them with the widely available DEM datasets Shuttle Radar Topography Mission (SRTM), version 3, and Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model (ASTER GDEM), version 2. The GPS technique is used to produce a high-accuracy digital elevation raster that serves as the reference against which the DEM datasets are compared. Baghdad University, Al Jadriya campus, is selected as the study area, and 151 reference points were created within it to evaluate the results based on the RMS error values.
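The accuracy assessment reduces to sampling each DEM at the GPS reference points and computing the RMS error, as in this Python sketch (the file and variable names are hypothetical):

```python
# RMS error of a DEM against GPS-surveyed reference elevations.
import numpy as np
import rasterio

def rmse_against_reference(dem_path, points, ref_elev):
    """points: list of (x, y) in the DEM's CRS; ref_elev: GPS elevations."""
    with rasterio.open(dem_path) as dem:
        sampled = np.array([v[0] for v in dem.sample(points)])
    return float(np.sqrt(np.mean((sampled - np.asarray(ref_elev)) ** 2)))

# Hypothetical usage over the three rasters compared in the study:
# for dem in ("srtm_v3.tif", "aster_gdem_v2.tif", "google_earth_dem.tif"):
#     print(dem, rmse_against_reference(dem, gps_xy, gps_z))
```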
This research deals with the ability of a mobile robot to navigate and discover its location in unknown environments, and then to construct maps of these environments for future use. In this work, we propose a modified Extended Kalman Filter-Simultaneous Localization and Mapping (EKF-SLAM) technique, implemented for different unknown environments containing different numbers of landmarks. The detectable landmarks play an important role in the overall navigation process and in the EKF-SLAM technique's performance. MATLAB simulation results show that the EKF-SLAM technique performs better than an odometry approach in terms of measuring the robot's position.
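For orientation, here is a compact Python sketch of the EKF predict/update cycle for a planar robot observing one range-bearing landmark; the paper's modified EKF-SLAM additionally augments the state vector with the landmark positions, which this minimal version omits.

```python
# Minimal EKF predict/update for a unicycle robot, state x = [px, py, theta].
import numpy as np

def predict(x, P, v, w, dt, Q):
    """Propagate pose and covariance through the unicycle motion model."""
    px, py, th = x
    x_new = np.array([px + v * dt * np.cos(th),
                      py + v * dt * np.sin(th),
                      th + w * dt])
    F = np.array([[1, 0, -v * dt * np.sin(th)],   # motion-model Jacobian
                  [0, 1,  v * dt * np.cos(th)],
                  [0, 0,  1]])
    return x_new, F @ P @ F.T + Q

def update(x, P, z, lm, R):
    """z = [range, bearing] to a known landmark lm = [lx, ly]."""
    dx, dy = lm[0] - x[0], lm[1] - x[1]
    q = dx ** 2 + dy ** 2
    z_hat = np.array([np.sqrt(q), np.arctan2(dy, dx) - x[2]])
    H = np.array([[-dx / np.sqrt(q), -dy / np.sqrt(q), 0],
                  [dy / q, -dx / q, -1]])          # measurement Jacobian
    y = z - z_hat
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi    # wrap bearing residual
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
    return x + K @ y, (np.eye(3) - K @ H) @ P
```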
The researcher tackles the most outstanding conditions of experimentation. The importance of the study lies in its usefulness to workers in the field of theatre in general, and to directors in particular, regarding the conditions that experimentation should satisfy. The study aims at identifying the experimental basis that the director (Sami Abdulhamid) followed in realizing this. The first inquiry tackles the concept of experimentation and the second tackles the conditions of experimentation. In the methodology of the study, the researcher analyzed the show "Othello in the Kitchen" and came up with the following: 1. the show has gone with the nature of the previous shows, experimenting with methods that were not familiar.
This study extracted the spatial and morphological properties of the studied basins using the Soil and Water Assessment Tool (SWAT) model linked to GIS to find the amount of sediment and the flow rates entering the Haditha reservoir. The aim is to determine the amount of sediment coming from the valleys and flowing into the Haditha Dam reservoir over 25 years, for the period (1985-2010), its impact on the design lifetime of the reservoir, and the best ways to reduce sediment transport. The results indicated that the total amount of sediment coming from all valleys is about 2.56 × 10⁶ tons, with a maximum annual total sediment load of about 488.22 × 10³ tons in 1988.
A new distribution, the Epsilon Skew Gamma (ESΓ) distribution, first introduced by Abdulah [1], is applied to near-Gamma data. We first restate the ESΓ distribution, its properties, and its characteristics; we then estimate its parameters using the maximum likelihood and moment estimators, and finally use these estimators to fit the data with the ESΓ distribution.
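As a wiring sketch only: maximum-likelihood fitting by numerical optimization in Python, with a placeholder log-density hook where the ESΓ pdf from Abdulah [1] would go. A plain gamma density stands in below and ignores the skewness parameter, so the output is illustrative of the procedure, not of the ESΓ fit itself.

```python
# Generic MLE by numerical optimization; swap in the ESΓ log-density from [1].
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gamma   # stand-in density, NOT the ESΓ pdf

def log_pdf(x, shape, scale, skew):
    # Placeholder hook: the plain gamma below ignores `skew` and exists
    # only so the fitting machinery can be demonstrated end to end.
    return gamma.logpdf(x, a=shape, scale=scale)

def fit_mle(data, start=(2.0, 1.0, 0.0)):
    nll = lambda p: -np.sum(log_pdf(data, *p))    # negative log-likelihood
    res = minimize(nll, x0=np.array(start), method="Nelder-Mead")
    return res.x

sample = gamma.rvs(a=2.0, scale=1.5, size=300, random_state=0)
print(fit_mle(sample))   # estimated (shape, scale, skew)
```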