The Bayesian approach promises to combine the strengths of regression analysis and classification trees: it exploits prior information about the explanatory variables on the one hand and, through tree ensembles, updates posterior information at each node during tree construction on the other. Although Bayesian estimates are generally accurate, the logistic model remains a strong competitor for binary responses thanks to its flexibility and mathematical tractability. Three methods are therefore used to process the data: the logistic model, the classification and regression tree (CART) model, and the Bayesian regression tree model. In this research these methods are compared by forming an additive model over several nonparametric functions. The trade-off among the models is assessed by classification accuracy, measured by the misclassification error, and by estimation accuracy, measured by the root mean squared error (RMSE). The application uses data on patients with diabetes aged 15 years and below; a sample of size 200 was drawn from the Children's Hospital in Al-Eskan / Baghdad.
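The two comparison criteria named above are standard and can be sketched in a few lines of Python (the function names are illustrative, not taken from the paper):

```python
def misclassification_error(y_true, y_pred):
    """Fraction of observations whose predicted class differs from the true class."""
    assert len(y_true) == len(y_pred)
    return sum(t != p for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_hat):
    """Root mean squared error between observed and estimated values."""
    assert len(y_true) == len(y_hat)
    return (sum((t - h) ** 2 for t, h in zip(y_true, y_hat)) / len(y_true)) ** 0.5

# Binary responses scored by the two criteria
print(misclassification_error([0, 1, 1, 0], [0, 1, 0, 0]))  # 0.25
print(rmse([1.0, 2.0, 3.0], [1.0, 2.0, 5.0]))
```

Classification models are ranked by the first criterion, estimation accuracy by the second.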
This work involves the manufacture of the MAX phase materials V2AlC and Cr2AlC by powder metallurgy; MAX phases are a new class of materials characterized by regularly ordered crystal lattices. The corrosion behavior of these materials was investigated with a potentiostat to estimate corrosion resistance, and the results were compared with a highly corrosion-resistant reference material, SS 316L. The experiments were carried out in 0.01 N NaOH solution at four temperatures in the range 30–60 °C. Polarization resistance values calculated by the Stern-Geary equation indicated that the MAX phase materials are more resistant than SS 316L. Cyclic polarization tests also confirme
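The Stern-Geary relation referred to above has the standard form used in corrosion testing (the specific Tafel constants used in the paper are not given here):

```latex
R_p = \frac{B}{i_{\mathrm{corr}}}, \qquad
B = \frac{\beta_a \, \beta_c}{2.303\,(\beta_a + \beta_c)},
```

where $R_p$ is the polarization resistance, $i_{\mathrm{corr}}$ the corrosion current density, and $\beta_a$, $\beta_c$ the anodic and cathodic Tafel slopes. A larger $R_p$ at a given $B$ corresponds to a smaller corrosion current, i.e. better corrosion resistance.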
Through this research, we have tried to evaluate health programs and their effectiveness in improving the health situation through a study of the state of health institutions in Baghdad, in order to identify the main factors behind the increase in maternal mortality, using two regression models: the "Poisson Regression Model" and the "Hierarchical Poisson Regression Model". The mortality indicator was studied through a comparison between the estimation methods of the models used: the "Maximum Likelihood" method was used to estimate the "Poisson Regression Model", whereas the "Full Maximum Likelihood" method was used for the "Hierarchical Poisson Regression Model
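The (non-hierarchical) Poisson regression model referred to above has the standard log-linear form, with counts $Y_i$ (here, maternal deaths) and covariate vector $\mathbf{x}_i$:

```latex
Y_i \sim \mathrm{Poisson}(\mu_i), \qquad
\log \mu_i = \mathbf{x}_i^{\top}\boldsymbol{\beta},
```

and maximum likelihood estimates $\boldsymbol{\beta}$ by maximizing
$\ell(\boldsymbol{\beta}) = \sum_i \left( y_i\,\mathbf{x}_i^{\top}\boldsymbol{\beta} - e^{\mathbf{x}_i^{\top}\boldsymbol{\beta}} - \log y_i! \right)$.
The hierarchical variant adds a second level of random effects on top of this; its exact specification in the paper is not reproduced here.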
Financial markets are a sector whose data are in almost continuous motion and constantly changing, so their trends are difficult to predict. This creates a need for methods, tools, and techniques to support decision making, and it pushes investors and analysts in the financial markets to use a variety of methods to predict the direction of market movement. To support decision making for different investments, the support vector machine algorithm and the CART regression tree algorithm are used to classify the stock data in order to determine
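CART grows its tree by choosing, at each node, the binary split with the lowest weighted Gini impurity. That split criterion can be sketched in pure Python (an illustrative sketch, not the paper's implementation):

```python
from collections import Counter

def gini(labels):
    """Gini impurity of a set of class labels: 1 - sum_k p_k^2."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def split_impurity(left, right):
    """Weighted Gini impurity of a candidate binary split, as used by CART."""
    n = len(left) + len(right)
    return len(left) / n * gini(left) + len(right) / n * gini(right)

# A perfectly mixed two-class node has impurity 0.5; a pure split has 0.0
print(gini(["up", "up", "down", "down"]))      # 0.5
print(split_impurity(["up", "up"], ["down"]))  # 0.0
```

At each node, CART evaluates `split_impurity` for every candidate threshold and keeps the minimum.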
Gender classification is a critical task in computer vision, with substantial importance in various domains, including surveillance, marketing, and human-computer interaction. The proposed face gender classification model consists of three main phases. The first phase applies the Viola-Jones algorithm to detect facial images, which includes four steps: 1) Haar-like features, 2) the integral image, 3) AdaBoost learning, and 4) a cascade classifier. The second phase employs four pre-processing operations, namely cropping, resizing, converting the image from the RGB color space to the LAB color space, and enhancing the images using HE and CLAHE. The final phase involves utilizing Transfer lea
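Of the four Viola-Jones steps, the integral image (step 2) is easy to sketch: each entry holds the sum of all pixels above and to its left, so any rectangular Haar-like feature can then be evaluated with at most four lookups. A minimal pure-Python version (illustrative only, not the paper's implementation):

```python
def integral_image(img):
    """ii[y][x] = sum of img[0..y][0..x]; lets any rectangle sum be read in O(1)."""
    h, w = len(img), len(img[0])
    ii = [[0] * w for _ in range(h)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y][x] = row_sum + (ii[y - 1][x] if y > 0 else 0)
    return ii

def rect_sum(ii, top, left, bottom, right):
    """Sum of img over rows top..bottom and cols left..right via four lookups."""
    total = ii[bottom][right]
    if top > 0:
        total -= ii[top - 1][right]
    if left > 0:
        total -= ii[bottom][left - 1]
    if top > 0 and left > 0:
        total += ii[top - 1][left - 1]
    return total

img = [[1, 2, 3],
       [4, 5, 6],
       [7, 8, 9]]
ii = integral_image(img)
print(rect_sum(ii, 1, 1, 2, 2))  # 5 + 6 + 8 + 9 = 28
```

Haar-like features are differences of such rectangle sums, which is why this table makes their evaluation constant-time regardless of window size.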
This article estimates the partially linear model using two methods, the wavelet smoother and the kernel smoother. Simulation experiments are used to study small-sample behavior for different functions, sample sizes, and variances. The results show that the wavelet smoother is best according to the mean average squared error criterion in all cases considered.
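A kernel smoother of the kind compared here can be sketched as a Nadaraya-Watson estimator with a Gaussian kernel (the bandwidth `h` below is arbitrary; the paper's exact settings are not reproduced):

```python
import math

def kernel_smooth(x, y, x0, h):
    """Nadaraya-Watson estimate at x0: a kernel-weighted average of the y values."""
    weights = [math.exp(-((xi - x0) / h) ** 2 / 2) for xi in x]
    return sum(w * yi for w, yi in zip(weights, y)) / sum(weights)

# Noiseless linear data: by symmetry the estimate at 2.0 is exactly 2.0
x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [0.0, 1.0, 2.0, 3.0, 4.0]
print(kernel_smooth(x, y, 2.0, 0.5))
```

The wavelet smoother replaces the kernel weights with a thresholded wavelet expansion of the nonparametric component, which is what the simulations favor.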
Gray-Scale Image Brightness/Contrast Enhancement with the Multi-Model Histogram Linear Contrast Stretching (MMHLCS) Method
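The linear contrast stretching at the core of the method named in this title maps the observed gray-level range onto the full display range. A minimal single-model sketch (the multi-model histogram part of MMHLCS is not reproduced here):

```python
def linear_contrast_stretch(pixels, out_min=0, out_max=255):
    """Linearly remap the observed gray-level range onto [out_min, out_max]."""
    lo, hi = min(pixels), max(pixels)
    if lo == hi:                      # flat image: nothing to stretch
        return list(pixels)
    scale = (out_max - out_min) / (hi - lo)
    return [round(out_min + (p - lo) * scale) for p in pixels]

# A low-contrast band [100, 130] is stretched to cover [0, 255]
print(linear_contrast_stretch([100, 110, 120, 130]))  # [0, 85, 170, 255]
```

The multi-model variant presumably applies such a mapping per histogram region rather than globally.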
Unlike traditional means of communication, the internet has the flexibility to stimulate the user and allow him to develop it. Perhaps the reason for the internet's superiority over traditional media is its capacity for change and for moving from one stage to another in a short period. The internet can move from use to the development of use, and then to the development of means and innovation, since innovation on the internet is a logical product of the user's interaction with the network. The internet takes up all proposals and ideas and ignores none, however simple. This is represented in social networking sites, which in fact reflect personal emotio
Great scientific progress has led to widespread information accumulating in large databases, so it is important to review and organize this vast amount of data in order to extract the hidden information, or to classify the data according to their relations with each other, so that they can be exploited for technical purposes.
Working with data mining (DM) is appropriate in this area because of the importance of studying the K-Means algorithm for clustering data; in the applied part, its effect on the variables can be observed by changing the sample size (n) and the number of clusters (K).
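A minimal one-dimensional K-Means sketch (standard Lloyd's algorithm, not the paper's implementation) makes the roles of n and K concrete:

```python
def kmeans_1d(data, k, iters=20):
    """Lloyd's algorithm on 1-D data: assign each point to its nearest center,
    then move each center to the mean of its cluster."""
    centers = list(data[:k])          # simple deterministic initialization
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x in data:
            j = min(range(k), key=lambda j: abs(x - centers[j]))
            clusters[j].append(x)
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return sorted(centers)

# Two well-separated groups (n = 8, K = 2): centers converge to the group means
data = [1.0, 2.0, 2.0, 3.0, 9.0, 10.0, 10.0, 11.0]
print(kmeans_1d(data, 2))  # [2.0, 10.0]
```

Changing n alters how stable the estimated centers are, while changing K changes how finely the same data are partitioned.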
In recent years, Bitcoin has become the most widely used blockchain platform in business and finance. The goal of this work is to find a viable prediction model that combines, and perhaps improves on, the available models. The techniques utilized in this paper include exponential smoothing, ARIMA, artificial neural network (ANN) models, and forecast combination models. The study's clearest finding is that artificial intelligence models improve the results of compound prediction models. The second key finding is that a strong combination forecasting model should be used, one that responds to the multiple fluctuations of the Bitcoin time series and improves the error. Based on the results, the prediction a
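Two of the ingredients named above are simple enough to sketch directly: simple exponential smoothing and a weighted forecast combination (the smoothing constant and prices below are illustrative, not the paper's fitted values):

```python
def exp_smooth(series, alpha):
    """Simple exponential smoothing: s_t = alpha*y_t + (1-alpha)*s_{t-1}.
    Returns the one-step-ahead forecast after the last observation."""
    s = series[0]
    for y in series[1:]:
        s = alpha * y + (1 - alpha) * s
    return s

def combine(forecasts, weights=None):
    """Weighted combination of individual model forecasts (equal weights by default)."""
    if weights is None:
        weights = [1 / len(forecasts)] * len(forecasts)
    return sum(w * f for w, f in zip(weights, forecasts))

prices = [100.0, 102.0, 101.0, 105.0]
f1 = exp_smooth(prices, 0.5)          # 103.0
print(combine([f1, 101.0]))           # average of two model forecasts
```

A combination model of this kind simply replaces the equal weights with weights tuned to each component's past error.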
In this article, we developed a new loss function as a modification of the linear exponential (LINEX) loss function, obtained by weighting the LINEX function. We derive estimators of the scale parameter, the reliability function, and the hazard function based on upper record values of the Lomax distribution (LD). To study the small-sample performance of the proposed loss function, a Monte Carlo simulation compares the maximum likelihood estimator, the Bayesian estimator under the LINEX loss function, and the Bayesian estimator under the squared error (SE) loss function. The results show that the modified method is best for estimating the scale parameter, reliability, and hazard functions.
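The standard LINEX loss that the article modifies has the well-known asymmetric form, for estimation error $\Delta = \hat\theta - \theta$ and shape parameter $a \neq 0$:

```latex
L(\Delta) = e^{a\Delta} - a\Delta - 1,
```

under which the Bayes estimator is $\hat\theta = -\tfrac{1}{a}\log E\!\left[e^{-a\theta} \mid \text{data}\right]$. The sign of $a$ controls whether over- or under-estimation is penalized more heavily; the weighted variant introduced by the article is its contribution and is not reproduced here.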