The purpose of this paper is to model and forecast the volatility of white oil over the period 2012-2019 using GARCH-class models. After showing that the squared returns of white oil exhibit significant long memory in volatility, fractionally integrated GARCH (FIGARCH) models are estimated for the return series and used to forecast the mean and the volatility by quasi-maximum likelihood (QML) as the traditional method, while the competing approach is a machine learning one based on Support Vector Regression (SVR). The most appropriate model among the candidates was selected on the basis of the lowest Akaike and Schwarz information criteria, significant parameters, residuals free of serial correlation and ARCH effects, and the highest log-likelihood. The SVR-FIGARCH models outperformed the FIGARCH models with normal and Student's t distributions; the SVR-FIGARCH model was statistically significant and achieved improved accuracy through the SVM technique. Finally, we evaluate the forecasting performance of the various volatility models and choose the best-fitting model to forecast the volatility of each series, based on three forecasting accuracy measures: RMSE, MAE, and MAPE.
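As a rough sketch of the workflow described in this abstract, the example below fits FIGARCH(1, d, 1) models with normal and Student's t innovations by (quasi-)maximum likelihood, keeps the fit with the lowest AIC, scores rolling one-step-ahead variance forecasts with RMSE, MAE, and MAPE, and adds a very simplified SVR competitor. The `arch` and `scikit-learn` packages, the simulated return series, the squared-return volatility proxy, and the lag choice are illustrative assumptions, not details taken from the paper, and the full SVR-FIGARCH hybrid is not reproduced here.

```python
# Minimal sketch (not the paper's exact procedure): FIGARCH estimation, AIC-based
# model selection, and forecast evaluation by RMSE / MAE / MAPE.
import numpy as np
from arch import arch_model

# Hypothetical data: a simulated GARCH(1,1) series stands in for the white-oil
# returns (2012-2019) used in the paper.
sim = arch_model(None, mean="Constant", vol="GARCH", p=1, q=1).simulate(
    [0.0, 0.1, 0.08, 0.88], nobs=1000)
returns = sim["data"].to_numpy()
split = 940                                        # last 60 observations held out

fits = {}
for dist in ("normal", "t"):                       # normal vs. Student's t innovations
    am = arch_model(returns[:split], mean="Constant", vol="FIGARCH",
                    p=1, q=1, dist=dist)
    fits[dist] = am.fit(disp="off")
    print(f"{dist:7s} AIC={fits[dist].aic:9.2f}  BIC={fits[dist].bic:9.2f}")
best_dist = min(fits, key=lambda d: fits[d].aic)   # lowest AIC wins

# Rolling one-step-ahead variance forecasts, re-estimating on each expanding
# window (slow but unambiguous; a fixed-parameter scheme would be faster).
fcast = []
for t in range(split, len(returns)):
    res = arch_model(returns[:t], mean="Constant", vol="FIGARCH",
                     p=1, q=1, dist=best_dist).fit(disp="off")
    fcast.append(res.forecast(horizon=1).variance.values[-1, 0])
fcast = np.asarray(fcast)

proxy = returns[split:] ** 2                       # noisy but standard volatility proxy
err = fcast - proxy
print(f"{best_dist}-FIGARCH: RMSE={np.sqrt(np.mean(err**2)):.4f}  "
      f"MAE={np.mean(np.abs(err)):.4f}  MAPE={np.mean(np.abs(err / proxy)):.4f}")

# A very simplified stand-in for the SVR competitor: support vector regression on
# lagged squared returns as a direct one-step volatility forecaster.
from sklearn.svm import SVR

lags = 5
sq = returns ** 2
X = np.column_stack([sq[i: len(sq) - lags + i] for i in range(lags)])
y = sq[lags:]
svr = SVR(kernel="rbf", C=1.0, epsilon=0.01).fit(X[: split - lags], y[: split - lags])
svr_err = svr.predict(X[split - lags:]) - proxy
print(f"SVR: RMSE={np.sqrt(np.mean(svr_err**2)):.4f}  MAE={np.mean(np.abs(svr_err)):.4f}")
```

The same three accuracy measures can then be compared across the candidate models to pick the best forecaster, mirroring the selection procedure the abstract describes.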
Background: Parkinson’s disease (PD) is currently the fastest-growing neurological disorder in the world. Patients with PD face numerous challenges in managing their chronic condition, particularly in countries with scarce healthcare infrastructure. Objective: This qualitative study aimed to explore neurologists’ perspectives on challenges and gaps in the Iraqi healthcare system that influence the management of PD, as well as strategies to mitigate these obstacles. Method: Semi-structured interviews were conducted with neurologists from five Iraqi provinces, working in both hospitals and private neurology clinics, between November 2024 and January 2025. A thematic analysis approach was employed to identify the main challenges
Oily wastewater is one of the most challenging streams to treat, especially when the oil exists in emulsified form. In this study, the electrospinning method was used to prepare nanofibrous polyvinylidene fluoride (PVDF) membranes and to study their performance in oil removal. Graphene particles were embedded in the electrospun PVDF membrane to enhance its efficiency. The prepared membranes were characterized using scanning electron microscopy (SEM) to verify that the graphene was stabilized homogeneously on the membrane surface, while FTIR was used to detect the functional groups on the membrane surface. Membrane wettability was assessed by measuring the contact angle. The PVDF and PVDF/graphene membranes' efficiency
Logistic regression is considered one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X); a second consideration is the homogeneity of the variance. Here the dependent variable is a binary response that takes two values (one when a specific event occurs and zero when it does not), such as injured/uninjured or married/unmarried. A large number of explanatory variables leads to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and the ridge regression method were used to estimate the binary-response logistic regression model by adopting the Jackknife
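As a rough sketch of the estimation methods named above, the example below fits a binary-response logistic model by maximum likelihood, fits a ridge-penalized (L2) logistic model as the ridge alternative, and uses a leave-one-out jackknife to gauge the variability of the ridge coefficients. The synthetic data, penalty strength, and library choices (statsmodels, scikit-learn) are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch only: maximum-likelihood vs. ridge-penalized logistic
# regression for a binary response, with jackknife (leave-one-out) estimates
# of the ridge coefficients' variability. Data and settings are hypothetical.
import numpy as np
import statsmodels.api as sm
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n, p = 200, 6
X = rng.normal(size=(n, p))
X[:, 3] = X[:, 2] + 0.3 * rng.normal(size=n)       # induce multicollinearity
beta = np.array([1.0, -1.0, 0.5, 0.5, 0.0, 0.0])
y = (rng.random(n) < 1 / (1 + np.exp(-(X @ beta)))).astype(int)   # binary response

# Maximum likelihood estimate (can be unstable under multicollinearity).
mle = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
print("MLE coefficients (incl. intercept):", np.round(mle.params, 3))

# Ridge-penalized ("ridge regression") logistic estimate.
ridge = LogisticRegression(penalty="l2", C=1.0, max_iter=1000).fit(X, y)
print("Ridge coefficients:", np.round(ridge.coef_.ravel(), 3))

# Jackknife: refit leaving out one observation at a time, then summarize.
jack = np.empty((n, p))
for i in range(n):
    keep = np.delete(np.arange(n), i)
    jack[i] = LogisticRegression(penalty="l2", C=1.0, max_iter=1000).fit(
        X[keep], y[keep]).coef_.ravel()
jack_mean = jack.mean(axis=0)
jack_se = np.sqrt((n - 1) / n * ((jack - jack_mean) ** 2).sum(axis=0))
print("Jackknife mean:", np.round(jack_mean, 3))
print("Jackknife SE:  ", np.round(jack_se, 3))
```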
This study was carried out at the animal field of the College of Agricultural Engineering Sciences, University of Baghdad, from 15 December 2021 to 26 January 2022 (42 days), to investigate the effect of adding different levels of ellagic acid to the diet of broilers on some physiological characteristics and oxidation indicators in meat, compared with vitamin C. A total of 225 Ross 308 chicks were used, divided randomly into five treatments as follows: T1, a control group with no dietary additives; the other treatments T2, T3, and T4 received ellagic acid (
Cloud-based Electronic Health Records (EHRs) have seen a substantial increase in usage in recent years, especially for remote patient monitoring. Researchers are interested in investigating the use of Healthcare 4.0 in smart cities, which involves using Internet of Things (IoT) devices and cloud computing to access medical processes remotely. Healthcare 4.0 focuses on the systematic gathering, merging, transmission, sharing, and retention of medical information at regular intervals. Protecting the confidential and private information of patients presents several challenges in terms of thwarting illegal intrusion by hackers. Therefore, it is essential to prioritize the protection of patient medical data that is stored, accessed, and shared in the cloud
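The excerpt above stops before the paper's own protection scheme is described. Purely as an illustration of one common building block for protecting records that are stored and shared in the cloud, the sketch below encrypts a patient record client-side with a symmetric key before upload; the `cryptography` package, the Fernet primitive, and the sample record are assumptions, not the paper's method.

```python
# Illustrative only: symmetric encryption of a patient record before cloud upload.
# The Fernet primitive is used as a stand-in; key management, access control, and
# the paper's actual scheme are outside the scope of this sketch.
import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice, handled by a key-management service
cipher = Fernet(key)

record = {"patient_id": "P-001", "heart_rate": 72, "timestamp": "2024-05-01T10:00:00Z"}
token = cipher.encrypt(json.dumps(record).encode("utf-8"))   # ciphertext sent to the cloud

# An authorized clinician holding the key can recover the record on access.
restored = json.loads(cipher.decrypt(token).decode("utf-8"))
assert restored == record
```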
Purpose – Cloud computing (CC) and its services have enabled the information centers of organizations to adapt their information and technological infrastructure, making it more suitable for developing flexible information systems that respond to the informational and knowledge needs of their users. In this context, cloud data governance has become more complex and dynamic, requiring an in-depth understanding of the data management strategy at these centers in terms of organizational structure and regulations, people, technology, processes, and roles and responsibilities. Therefore, our paper discusses these dimensions as challenges facing information centers with respect to their data governance and the impact