Bayesian models are widely used in recent research across many scientific fields. This research presents a new Bayesian model for parameter estimation and forecasting based on the Gibbs sampler algorithm. Posterior distributions are generated using the inverse gamma distribution and the multivariate normal distribution as prior distributions. The new method is used to investigate and summarize the posterior distribution in Bayesian statistics. The theory and derivation of the posterior distribution are explained in detail in this paper. The proposed approach is applied to three simulated datasets with sample sizes of 100, 300, and 500, and the procedure is also extended to a real dataset, the rock intensity dataset, collected from the UCI Machine Learning Repository. The findings are discussed and summarized at the end. All calculations for this research were done using R software (version 4.2.2).
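The abstract does not specify the likelihood, so the sketch below assumes a standard Bayesian linear regression with the stated priors (multivariate normal on the coefficients, inverse gamma on the error variance), for which both full conditionals have closed forms and the Gibbs sampler simply alternates between them. All names and hyperparameter values are illustrative, and the sketch is in Python rather than the paper's (unpublished) R code.

```python
import numpy as np

def gibbs_linear_regression(X, y, m0, V0, a0, b0, n_iter=5000, seed=0):
    """Gibbs sampler for y = X @ beta + eps with a multivariate normal
    prior N(m0, V0) on beta and an inverse-gamma prior IG(a0, b0) on
    the error variance sigma^2."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    V0_inv = np.linalg.inv(V0)
    XtX, Xty = X.T @ X, X.T @ y
    beta, sigma2 = np.zeros(p), 1.0
    draws_beta, draws_sigma2 = [], []
    for _ in range(n_iter):
        # beta | sigma2, y  ~  N(mu_n, V_n)
        Vn = np.linalg.inv(V0_inv + XtX / sigma2)
        mu_n = Vn @ (V0_inv @ m0 + Xty / sigma2)
        beta = rng.multivariate_normal(mu_n, Vn)
        # sigma2 | beta, y  ~  IG(a0 + n/2, b0 + ||y - X beta||^2 / 2)
        resid = y - X @ beta
        sigma2 = 1.0 / rng.gamma(a0 + n / 2.0, 1.0 / (b0 + 0.5 * resid @ resid))
        draws_beta.append(beta)
        draws_sigma2.append(sigma2)
    return np.array(draws_beta), np.array(draws_sigma2)

# Example with synthetic data and weak priors (all values illustrative).
X = np.random.default_rng(1).normal(size=(100, 3))
y = X @ np.array([1.0, -0.5, 2.0]) + np.random.default_rng(2).normal(size=100)
betas, sig2s = gibbs_linear_regression(X, y, m0=np.zeros(3),
                                       V0=10 * np.eye(3), a0=2.0, b0=1.0)
```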
A Tonido cloud server provides a private cloud storage solution and synchronizes customers and employees with the required cloud services across the enterprise. Users generally access cloud services over an Internet connection, and a weak connection or heavy load can make these services difficult to reach, especially for live video-streaming applications over the cloud. In this work, flexible and inexpensive access methods for real-time applications are proposed and implemented, enabling users to access cloud services locally and regionally. Practically, to simulate our network connection, we proposed to use the Raspberry-Pi 3 m…
In this study, cloud point extraction combined with molecular spectrometry is used as an eco-friendly method for the extraction, enrichment, and determination of the bendiocarb (BC) insecticide in different complex matrices. The method involves alkaline hydrolysis of BC followed by the Emerson reaction, in which the resulting phenol reacts with 4-aminoantipyrine (4-AAP) in the presence of the alkaline oxidant potassium ferricyanide to form a red-colored product, which is then extracted into micelles of Triton X-114 as a mediated extractant at room temperature. The extracted product in the cloud-point layer is separated from the aqueous layer by centrifugation for 20 min and dissolved in a minimum amount of a 1:1 ethanol:water mixture, followed…
Realizing the full potential of wireless sensor networks (WSNs) raises many design issues, particularly trade-offs among conflicting objectives such as maximizing route overlap for efficient data aggregation and minimizing total link cost. While data aggregation routing protocols and link cost functions in WSNs have been considered comprehensively in the literature, a trade-off between the two has not yet been addressed. In this paper, a comprehensive weight for trading off these objectives is employed, the so-called weighted data aggregation routing strategy (WDARS), which aims to maximize route overlap for efficient data aggregation and minimize link cost…
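The abstract names the comprehensive weight but does not give its form; the toy sketch below shows one hypothetical way such a weight could scalarize the two objectives (normalized link cost to minimize, route overlap to maximize) into a single score. It is not the WDARS formulation.

```python
def weighted_route_score(link_cost, overlap, w=0.5):
    """Hypothetical scalarized objective: `link_cost` and `overlap` are
    assumed pre-normalized to [0, 1]; lower scores are better, and `w`
    sets the trade-off between the two goals."""
    return w * link_cost - (1.0 - w) * overlap

# Pick the candidate route with the best combined score.
routes = [{"cost": 0.3, "overlap": 0.8}, {"cost": 0.2, "overlap": 0.4}]
best = min(routes, key=lambda r: weighted_route_score(r["cost"], r["overlap"], w=0.6))
```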
This research gives an introduction to the multiple intelligences (MI) theory and its implications for the classroom. It presents a unit plan based on MI theory, followed by a report explaining the researcher's application of the plan to first-class students of the Computer Department in the College of Sciences, University of Al-Mustansiryia, and the teacher's and students' reactions to it. The research starts with a short introduction to MI theory, a theory that could help students learn better in a relaxed learning situation. It was first presented by Howard Gardner when he published his book "Frames of Mind" in 1983, in which he describes how the brain has multiple intelligences…
Blockchain technology relies on cryptographic techniques that provide various advantages, such as trustworthiness, collaboration, organization, identification, integrity, and transparency. Data analytics, meanwhile, refers to the process of applying analytic techniques to big data and understanding the relationships between data points to draw meaningful conclusions. The field of data analytics in Blockchain is relatively new, and few studies have examined the challenges involved in Blockchain data analytics. This article presents a systematic analysis of how data analytics affects Blockchain performance, with the aim of investigating the current state of Blockchain-based data analytics techniques in research fields and…
The log-logistic distribution is an important statistical distribution, as it can be applied in many fields, including biological and other experiments, and its importance comes from the need to determine the survival function of those experiments. This research compares the maximum likelihood method, the least squares method, and the weighted least squares method for estimating the parameters and survival function of the log-logistic distribution, using the comparison criteria MSE, MAPE, and IMSE, and it was applied to real data for breast cancer patients. The results showed that the maximum likelihood method is best in the case of estimating the parameters…
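As a minimal sketch of the maximum likelihood approach, the snippet below fits a two-parameter log-logistic distribution (SciPy's `fisk`) to simulated survival times standing in for the breast cancer data, then evaluates the fitted survival function S(t) = 1 / (1 + (t/scale)^shape). The true parameter values and the evaluation grid are illustrative.

```python
import numpy as np
from scipy import stats

# Simulated stand-in for the survival times (the real data are not
# reproduced in the abstract); shape and scale values are illustrative.
rng = np.random.default_rng(1)
t = stats.fisk.rvs(c=2.0, scale=10.0, size=200, random_state=rng)

# Maximum likelihood fit of the log-logistic (Fisk) distribution;
# floc=0 pins the location parameter at zero.
shape_hat, loc, scale_hat = stats.fisk.fit(t, floc=0)

# Fitted survival function S(t) = 1 / (1 + (t/scale)^shape).
grid = np.linspace(1.0, 40.0, 5)
S = stats.fisk.sf(grid, c=shape_hat, scale=scale_hat)
print(shape_hat, scale_hat, S)
```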
The partial level density (PLD) of pre-equilibrium reactions described by Ericson's formula has been studied using different formulae for the single-particle level density g. The parameter g was taken from the equidistant spacing model (ESM) and the non-equidistant spacing model (non-ESM), and other formulae for g were derived from the relation between g and the level density parameter a. The formulae used to derive g are the Roher formula, the Egidy formula, the Yukawa formula, and the Thomas-Fermi formula. The partial level density results that depend on g from the Thomas-Fermi formula show good agreement with the experimental data.
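The abstract does not reproduce Ericson's formula; for reference, its standard form for the partial (particle-hole) state density with p particles, h holes, n = p + h excitons, and a constant single-particle level density g is quoted below, together with the usual Fermi-gas relation linking g to the level density parameter a.

```latex
% Ericson's partial state density (standard form; n = p + h excitons):
\rho(p, h, E) = \frac{g\,(gE)^{\,n-1}}{p!\; h!\;(n-1)!}, \qquad n = p + h
% Usual Fermi-gas relation between g and the level density parameter a:
a = \frac{\pi^{2}}{6}\, g
```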
This research presents a study, with an application, of principal component regression, in which principal components obtained from some of the explanatory variables are used to mitigate the multicollinearity problem among those variables and to gain more stability in the estimates than is obtained from ordinary least squares. The cost paid, on the other hand, is a small loss in the predictive regression function's power to explain the essential variation. A numerical formula has been suggested and applied by the researchers as an optimal solution, and its efficiency was verified by a program written by the researchers themselves for this purpose, using several criteria: cumulative percentage of variance, coefficient of determination, variance…
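As a sketch of the principal component regression idea described here, the snippet below builds deliberately collinear predictors, projects them onto their leading principal components, and fits ordinary least squares on the component scores. The synthetic data and the choice of two components are illustrative, not the paper's.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic, deliberately collinear predictors stand in for the real data.
rng = np.random.default_rng(0)
z = rng.normal(size=(200, 2))
X = np.column_stack([z[:, 0], z[:, 0] + 0.01 * rng.normal(size=200), z[:, 1]])
y = X @ np.array([1.0, 1.0, -2.0]) + rng.normal(size=200)

# Principal component regression: standardize, project onto the leading
# k components, then fit OLS on the component scores.
pcr = make_pipeline(StandardScaler(), PCA(n_components=2), LinearRegression())
pcr.fit(X, y)
print(pcr.score(X, y))  # coefficient of determination (R^2)
```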
This research aims to study dimension reduction methods that overcome the curse of dimensionality when traditional methods fail to provide good parameter estimates, so that the problem can be dealt with directly. Two approaches were used to handle high-dimensional data: the non-classical sliced inverse regression (SIR) method, along with the proposed weighted standard SIR (WSIR) method, and principal component analysis (PCA), the standard method for dimension reduction. Both SIR and PCA are based on linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear…
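A minimal sketch of the classical SIR algorithm (Li, 1991) referenced here: standardize the predictors, slice the sorted response, average the standardized predictors within each slice, and take the leading eigenvectors of the weighted covariance of those slice means. The proposed WSIR weighting is not reproduced, since the abstract does not specify it.

```python
import numpy as np

def sliced_inverse_regression(X, y, n_slices=10, n_components=2):
    """Minimal classical SIR: estimate effective dimension-reduction
    directions from slice means of the standardized predictors.
    Assumes the covariance of X is well-conditioned."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    # Whiten X using the inverse square root of its covariance.
    evals, evecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt
    # Slice the sorted response and average Z within each slice.
    slices = np.array_split(np.argsort(y), n_slices)
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original scale.
    w, v = np.linalg.eigh(M)
    return inv_sqrt @ v[:, ::-1][:, :n_components]  # e.d.r. directions
```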