Recurrent strokes can be devastating, often resulting in severe disability or death. However, nearly 90% of the causes of recurrent stroke are modifiable, meaning recurrent strokes can be averted by controlling risk factors, which are mainly behavioral and metabolic in nature. Previous work therefore suggests that a recurrent stroke prediction model could help minimize the likelihood of a recurrent stroke. Earlier studies have shown promising results in predicting first-time strokes with machine learning approaches, but work on recurrent stroke prediction using machine learning remains limited. Hence, this work performs an empirical analysis of machine learning algorithms for recurrent stroke prediction models. The research aims to investigate and compare the performance of machine learning algorithms on public clinical datasets of recurrent stroke. In this study, an Artificial Neural Network (ANN), a Support Vector Machine (SVM), and a Bayesian Rule List (BRL) are applied and their performance compared on the recurrent stroke prediction task. The empirical experiments show that the ANN scores the highest accuracy at 80.00%, followed by the BRL at 75.91% and the SVM at 60.45%.
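As a hedged illustration of the kind of comparison described above (not the authors' code), the sketch below trains an MLP-based ANN and an SVM on a synthetic placeholder dataset and reports test accuracy; the data, features, and hyperparameters are all assumptions, and a Bayesian Rule List is not part of scikit-learn, so it is only noted in a comment.

```python
# Hedged sketch of an ANN-vs-SVM comparison for a binary clinical
# outcome; the synthetic data stand in for the (unavailable) recurrent
# stroke dataset, and all hyperparameters are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=1000, n_features=12, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "ANN": make_pipeline(StandardScaler(),
                         MLPClassifier(hidden_layer_sizes=(32,),
                                       max_iter=1000, random_state=0)),
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    # A Bayesian Rule List (BRL) is not in scikit-learn; packages such
    # as `imodels` provide one and could be dropped in here.
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: test accuracy = {model.score(X_te, y_te):.2%}")
```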
In the literature, several correlations have been proposed for bubble size prediction in bubble columns. However, these correlations fail to predict bubble diameter over a wide range of conditions. Based on a data bank of around 230 measurements collected from the open literature, a correlation for bubble sizes in the homogeneous region of bubble columns was derived using Artificial Neural Network (ANN) modeling. The bubble diameter was found to be a function of six parameters: gas velocity, column diameter, orifice diameter, liquid density, liquid viscosity, and liquid surface tension. Statistical analysis showed that the proposed correlation has an Average Absolute Relative Error (AARE) of 7.3% and a correlation coefficient of 92.2%.
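A hedged sketch of how such an ANN correlation can be fitted and scored follows; the random data below only stand in for the 230-point literature data bank, and the network size is an assumption.

```python
# Hedged sketch: fit an ANN mapping the six stated inputs to bubble
# diameter and score it with the Average Absolute Relative Error (AARE)
# and the correlation coefficient, as reported in the abstract.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Columns: gas velocity, column diameter, orifice diameter,
# liquid density, liquid viscosity, liquid surface tension
X = rng.uniform(size=(230, 6))
d_b = 0.5 + X @ rng.uniform(0.1, 1.0, size=6)    # placeholder target

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,),
                                   max_iter=5000, random_state=0))
model.fit(X, d_b)
pred = model.predict(X)

aare = np.mean(np.abs(pred - d_b) / d_b)   # AARE as defined in the text
r = np.corrcoef(pred, d_b)[0, 1]           # correlation coefficient
print(f"AARE = {aare:.1%}, R = {r:.1%}")
```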
Background. Dental implantation has become a standard procedure with high success rates, relying on achieving osseointegration between the implant surface and the surrounding bone tissue. Polyether ether ketone (PEEK) is a promising alternative to traditional dental implant materials such as titanium, but its osseointegration capability is limited due to its hydrophobic nature and reduced surface roughness. Objective. The aim of the study is to increase the surface roughness and hydrophilicity of PEEK by treating the surface with piranha solution and then coating it with epigallocatechin-3-gallate (EGCG) by the electrospraying technique. Materials and Methods. The study includes four groups intended to investigate the effect of piranha …
This study examines the performance of a two-stage parallel-flow indirect evaporative cooling system enhanced with aspen pads in the challenging climate of Baghdad. The objective was to cool supply air to dry-bulb temperatures below the ambient wet-bulb temperature (24.95 °C), starting from an average ambient dry-bulb temperature of 43 °C at an average relative humidity of 23%, aiming for unparalleled cooling efficiency. The experiment was carried out in the urban environment of Baghdad, characterized by high-temperature conditions. The investigation focused on the potential of the two-stage parallel-flow setup, combined with the cooling capability of the aspen pads, to surpass the limitations …
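A standard way to quantify this sub-wet-bulb performance (a textbook definition, not a formula quoted from the excerpt above, where the outlet temperature is truncated away) is the wet-bulb effectiveness of the stage:

```latex
% Wet-bulb effectiveness of an (indirect) evaporative cooling stage;
% the outlet dry-bulb temperature is not reported in the excerpt,
% so no number is plugged in.
\varepsilon_{wb} = \frac{T_{db,\mathrm{in}} - T_{db,\mathrm{out}}}
                        {T_{db,\mathrm{in}} - T_{wb,\mathrm{in}}}
```

With the reported ambient conditions (dry bulb 43 °C, wet bulb 24.95 °C), any outlet dry-bulb temperature below 24.95 °C corresponds to ε_wb > 1, the signature of the sub-wet-bulb cooling that indirect stages make possible.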
Accelerating developments in technology, rapid changes in the environment, the growing number of industrialized countries, and the differing desires and requirements of customers have made production in large quantities infeasible, given the changes listed above as well as the need for product variety and the shifts in consumers' tastes and desires. All of the above has left companies unable to sell their products under mass production and created the need to devise new methods that fit the current situation. From an accounting standpoint, traditional accounting systems are no longer able to meet the requirements companies need to make decisions and to identify where waste and loss of resources occur, resulting in the invention of a new approach away from …
Variable selection is an essential and necessary task in the field of statistical modeling. Several studies have tried to develop and standardize the process of variable selection, but it is difficult to do so. The first question researchers need to ask themselves is which variables are most significant for describing a given dataset's response. In this paper, a new method for variable selection using Gibbs sampler techniques has been developed. First, the model is defined and the posterior distributions for all the parameters are derived. The new variable selection method is tested using four simulated datasets, and the new approach is compared with some existing techniques: Ordinary Least Squares (OLS), Least Absolute Shrinkage and Selection Operator (LASSO) …
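As a hedged illustration of Gibbs-sampler-based variable selection (one common conjugate formulation under Zellner's g-prior, not necessarily the paper's exact model), the sketch below samples binary inclusion indicators and reports posterior inclusion probabilities:

```python
# Hedged sketch of Gibbs-sampler variable selection under Zellner's
# g-prior (a common conjugate formulation; the paper's exact model may
# differ). Indicators gamma_j mark which columns of X enter the model.
import numpy as np

def log_marginal(y, X, gamma, g):
    """Log marginal likelihood of centred y for the submodel gamma."""
    n = len(y)
    yty = y @ y
    k = int(gamma.sum())
    if k == 0:
        return -0.5 * n * np.log(yty)
    Xg = X[:, gamma]
    beta_hat, *_ = np.linalg.lstsq(Xg, y, rcond=None)
    fit = y @ (Xg @ beta_hat)            # y' P_X y, the explained part
    return (-0.5 * k * np.log(1 + g)
            - 0.5 * n * np.log(yty - g / (1 + g) * fit))

def gibbs_select(y, X, n_iter=1000, burn=200, seed=0):
    """Sample indicators; return posterior inclusion probabilities."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    g = float(n)                         # unit-information prior scale
    gamma = np.zeros(p, dtype=bool)
    counts = np.zeros(p)
    for it in range(n_iter):
        for j in range(p):               # one full Gibbs sweep
            gamma[j] = True
            l1 = log_marginal(y, X, gamma, g)
            gamma[j] = False
            l0 = log_marginal(y, X, gamma, g)
            # P(gamma_j = 1 | gamma_-j, y) under a uniform model prior
            gamma[j] = rng.random() < 1.0 / (1.0 + np.exp(l0 - l1))
        if it >= burn:
            counts += gamma
    return counts / (n_iter - burn)

# Hypothetical test: 8 candidate predictors, 3 truly active (0, 3, 5)
rng = np.random.default_rng(1)
X = rng.normal(size=(120, 8))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + X[:, 5] + rng.normal(size=120)
print(np.round(gibbs_select(y - y.mean(), X), 2))
```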
The aim of this study is to investigate the role of prodigiosin on P. aeruginosa biofilm genes involved in the pathogenicity and persistence of the bacterium. Materials and methods: Gram-negative bacterial isolates were taken from burn and wound specimens obtained from some Baghdad hospitals. Forty-six isolates were identified as Pseudomonas aeruginosa and four isolates as Serratia marcescens using biochemical tests and the VITEK 2 compact system. Susceptibility testing was performed for all P. aeruginosa isolates; the results showed that 100% were resistant to amikacin and 98% were sensitive to meropenem. Resistant isolates were tested for biofilm formation; the strong and moderate biofilm-forming isolates (17) were tested by PCR for the algD gene …
In this work, a novel technique for obtaining accurate solutions to nonlinear problems, a multi-step combination with the Laplace-variational approach (MSLVIM), is introduced. Compared with the traditional variational approach, it overcomes the usual difficulties and provides more accurate solutions with an extended convergence region, covering larger intervals and yielding a continuous representation of the approximate analytic solution, which gives better information about the solution over the whole time interval. The technique also makes it easier to obtain the general Lagrange multiplier, reducing time and calculations. It converges rapidly to the exact solution with simply computable terms …
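As a generic illustration of the underlying scheme (not the paper's exact derivation), for a first-order nonlinear equation u′(t) + N[u](t) = g(t) the variational iteration method builds successive approximations through a correction functional whose Lagrange multiplier can be identified via the Laplace transform of the linear part:

```latex
% Correction functional of the variational iteration method for
% u'(t) + N[u](t) = g(t); the Laplace transform of the linear part
% identifies the general Lagrange multiplier as lambda(s) = -1 here.
u_{n+1}(t) = u_n(t) + \int_{0}^{t} \lambda(s)\,
    \bigl( u_n'(s) + N[\tilde{u}_n](s) - g(s) \bigr)\, ds
```

In the multi-step variant, the interval [0, T] is partitioned into subintervals [t_i, t_{i+1}] and the iteration is restarted on each one with the endpoint value of the previous subinterval as the new initial condition, which is what extends the convergence region.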
The study aimed to identify the effect of using electronic concept maps in learning some floor exercise skills in artistic gymnastics for third-year students, as well as to identify the better of the two research groups (experimental and control) in learning and retaining some floor exercise skills in artistic gymnastics, the subject of the research. The experimental method was used, and the research sample included third-year students of the College of Physical Education and Sports Sciences / University of Baghdad; (10) students were selected for each of the experimental and control groups randomly by lottery, and after the completion of the implementation period of the experiment …
The current research aims to determine the role of administrative leadership in the implementation of public policies, its effectiveness, and its capacity to do so, as well as to analyze and test the influence and correlations between the research variables. The researcher adopted the descriptive analytical approach and used several tools to gather information, consisting of personal interviews and field visits, along with a questionnaire of (35) items distributed to a sample of (147) individuals from the staff of the Baghdad Provincial Council.
The research was based on a set of assumptions …
The expanding use of multi-processor supercomputers has had a significant impact on the speed and size of many problems. The adoption of the standard Message Passing Interface (MPI) protocol has enabled programmers to write portable and efficient codes across a wide variety of parallel architectures. Sorting is one of the most common operations performed by a computer. Because sorted data are easier to manipulate than randomly ordered data, many algorithms require sorted data. Sorting is of additional importance to parallel computing because of its close relation to the task of routing data among processes, which is an essential part of many parallel algorithms. In this paper, sequential sorting algorithms and the parallel implementation of many …
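As a minimal, hedged sketch of one common MPI sorting pattern (local sort per process followed by a merge of the sorted runs, using the `mpi4py` binding; the data and block sizes are placeholders, not the paper's benchmark):

```python
# Minimal parallel sort sketch with mpi4py (assumed installed): each
# rank sorts a local block, then the root merges the sorted blocks.
# Run with e.g.: mpiexec -n 4 python parallel_sort.py
import heapq
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# Placeholder workload: every rank draws its own random block
local = np.random.default_rng(rank).integers(0, 10_000, size=250)
local.sort()                          # local phase: sequential sort per process

chunks = comm.gather(local, root=0)   # communication phase: collect sorted runs
if rank == 0:
    merged = list(heapq.merge(*chunks))   # k-way merge of the sorted runs
    assert all(merged[i] <= merged[i + 1] for i in range(len(merged) - 1))
    print(f"sorted {len(merged)} elements across {comm.Get_size()} ranks")
```

The serial merge at the root is the simplest correct choice for a sketch; scalable variants (sample sort, odd-even transposition) distribute that final step as well.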