The city of Karbala is one of the most important holy places for visitors and pilgrims of the Islamic faith, especially during the Arba'in visit, when crowds of millions gather to commemorate the martyrdom of Imam Hussein. Offering services and medical treatment during this time is very important, especially as the crowds head toward their destination, the holy shrine of Imam Hussein (a.s). In recent years, the Arba'in visit has witnessed an obvious growth in the number of participants. The biggest challenges are the health risks and the preventive measures required of both organizers and visitors. Researchers have identified various challenges and factors in facilitating the Arba'in visit. The purpose of this research is to address the religious and cultural events that occur during the Arba'in visit in Iraq by providing optimal and alternative routes, and strategic resting points along the way from all cities to Karbala. The research relies on data analysis and artificial intelligence methods to determine the best routes and the locations of the rest points accurately and effectively. These aims are accomplished by analysing population distribution and potential paths. To provide the best rest points on the proposed roads and to reduce crowding within these stations, the rest stations are divided into two categories: main stations and sub-stations. The main stations contain services such as rest places, accommodation, health and awareness services, and food and drink, whereas the sub-stations comprise only rest places, sleeping areas, and food and drink. The research suggests that the main stations
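As an illustration of the kind of route and station analysis described above, the sketch below picks a walking route on a small weighted road graph with Dijkstra's shortest path and spaces main and sub-stations along it at fixed intervals. The place names, distances, and spacing intervals are hypothetical and are not taken from the study.

```python
# Illustrative sketch (not the study's implementation): choose a walking route
# to Karbala on a weighted road graph and space rest stations along it.
import networkx as nx

roads = nx.Graph()
roads.add_weighted_edges_from([
    ("Najaf", "Haidariya", 25), ("Haidariya", "Khan_al_Rub", 30),
    ("Khan_al_Rub", "Karbala", 25), ("Najaf", "Kifl", 35),
    ("Kifl", "Karbala", 55),
])  # edge weights = road distance in km (hypothetical values)

# Best (shortest) route from the origin city to Karbala.
route = nx.shortest_path(roads, "Najaf", "Karbala", weight="weight")
length = nx.shortest_path_length(roads, "Najaf", "Karbala", weight="weight")

# Place a sub-station every 15 km and promote every third one to a main station
# (the intervals are assumptions for illustration only).
SUB_EVERY, MAIN_EVERY = 15, 45
stations = [("main" if km % MAIN_EVERY == 0 else "sub", km)
            for km in range(SUB_EVERY, length, SUB_EVERY)]
print(route, length, stations)
```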
Three types of commercial medical creams, Silvazine, Cinolon Tar, and Hydroquinon Domina, were included in this study. The medical creams were taken directly and spread uniformly on a glass slide. Each pharmaceutical was weighed at 1 mg and dispersed over an area of 1x1 cm; this procedure ensures the same thickness for all samples. The creams were analyzed using a double-beam UV/visible spectrophotometer (Metertech SP8001). The absorption spectrum of each sample was measured over the wavelength range 300–700 nm.
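As a minimal post-processing sketch of such measurements, the snippet below loads a recorded absorbance spectrum for each cream and reports the wavelength of maximum absorbance. The file names and two-column layout are assumptions for illustration, not the study's actual data files.

```python
import numpy as np

# Locate the wavelength of maximum absorbance for each cream spectrum.
for cream in ("silvazine", "cinolon_tar", "hydroquinon_domina"):
    # assumed file layout: two comma-separated columns, wavelength (nm) and absorbance
    wavelength, absorbance = np.loadtxt(f"{cream}_spectrum.csv", delimiter=",", unpack=True)
    peak_nm = wavelength[np.argmax(absorbance)]
    print(f"{cream}: peak absorbance {absorbance.max():.3f} at {peak_nm:.0f} nm")
```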
In recent years there has been a decline in the level of female players in this event compared to developments worldwide, as performance in this activity depends on the physical capabilities and physiological indicators of the player, which are reflected in the achievement; this results in a loss of time for Iraqi female runners compared with world champions in the (100) meter hurdles competition, which in turn affects the level of achievement. The two researchers used the experimental approach to address the research problem, with an experimental design of two equal groups and pre- and post-tests: female runners in the first experimental group and a second experimental group (4)
Permeability data are of major importance in all reservoir simulation studies. Their importance increases in mature oil and gas fields because of their sensitivity to the requirements of specific improved-recovery methods. However, the industry holds a huge volume of air permeability measurements against only a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation to convert air permeability data, which are conventionally measured during laboratory core analysis, into liquid permeability. This correlation provides a feasible estimate in cases of loose and poorly consolidated formations, or in cas
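A common way to build such a correlation, sketched below under the assumption of a simple power-law relationship, is a least-squares fit in log-log space between the routinely measured air permeabilities and the few available liquid permeability points. The sample values are invented for illustration and do not come from the study.

```python
import numpy as np

k_air = np.array([5.2, 12.0, 48.0, 110.0, 350.0])   # mD, routine core analysis (made-up values)
k_liq = np.array([3.1, 8.4, 38.0, 92.0, 310.0])     # mD, special core analysis (made-up values)

# Fit log10(k_liq) = a + b * log10(k_air) by least squares.
b, a = np.polyfit(np.log10(k_air), np.log10(k_liq), 1)

def liquid_perm(k_air_md):
    """Estimate liquid permeability (mD) from air permeability (mD)."""
    return 10 ** (a + b * np.log10(k_air_md))

print(liquid_perm(75.0))
```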
Business organizations have faced many challenges in recent times, the most important of which is information technology, because it is widely spread and easy to use. Its use has led to an increase in the amount of data that business organizations deal with to an unprecedented degree. The amount of data available through the internet, and the random manner in which it accumulates, is a problem that many parties seek to solve. Forecasts indicated that by 2017 the number of devices connected to the internet would be roughly three times the population of the Earth, and in 2015 more than one and a half billion gigabytes of data were transferred every minute globally. Thus, the so-called data mining emerged as a
A non-stationary series is always a problem for statistical analysis: as some theoretical work has explained, the properties of regression analysis are lost when non-stationary series are used, and the estimated slope describes a spurious relationship between the variables under consideration. A non-stationary series can be made stationary by adding a time variable to the multivariate analysis to remove the general trend, as well as seasonal dummy variables to remove the seasonal effect, by converting the data to exponential or logarithmic form, or by taking differences repeatedly d times, in which case the series is said to be integrated of order d. On the theoretical side, the first part of the research presents the research methodology
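A small illustration of the transformations just listed, on synthetic data: a logarithmic transform, a time variable for the general trend, seasonal dummy variables, and first differencing (d = 1). This is a generic sketch, not the paper's dataset or estimation procedure.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
y = pd.Series(np.cumsum(rng.normal(0.5, 1.0, 120)))   # synthetic trending (non-stationary) series

log_y = np.log(y - y.min() + 1)                        # logarithmic transform
t = np.arange(len(y))                                  # time variable capturing the general trend
seasons = pd.get_dummies(t % 12, prefix="m")           # seasonal dummy variables
diff_1 = y.diff().dropna()                             # first difference: integrated of order d = 1

print(y.var(), diff_1.var())                           # variance drops sharply after differencing
```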
In data mining, classification is a form of data analysis that can be used to extract models describing important data classes. Two well-known algorithms used in data mining classification are the Backpropagation Neural Network (BNN) and Naïve Bayes (NB). This paper investigates the performance of these two classification methods using the Car Evaluation dataset. Two models were built, one for each algorithm, and the results were compared. Our experimental results indicate that the BNN classifier yields higher accuracy than the NB classifier but is less efficient, because it is time-consuming and difficult to analyze due to its black-box nature.
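The comparison can be reproduced in spirit with scikit-learn stand-ins: a multilayer perceptron trained by backpropagation versus a Naïve Bayes classifier on the one-hot encoded Car Evaluation attributes. The file path and hyperparameters below are assumptions, not those used in the paper.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import OneHotEncoder
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import BernoulliNB
from sklearn.metrics import accuracy_score

cols = ["buying", "maint", "doors", "persons", "lug_boot", "safety", "class"]
df = pd.read_csv("car.data", names=cols)                # UCI Car Evaluation dataset (path assumed)

X = OneHotEncoder().fit_transform(df[cols[:-1]])        # one-hot encode the categorical attributes
X_tr, X_te, y_tr, y_te = train_test_split(X, df["class"], test_size=0.3, random_state=1)

bnn = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=1).fit(X_tr, y_tr)
nb = BernoulliNB().fit(X_tr, y_tr)

print("BNN accuracy:", accuracy_score(y_te, bnn.predict(X_te)))
print("NB  accuracy:", accuracy_score(y_te, nb.predict(X_te)))
```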
Reliable data transfer and energy efficiency are essential considerations for network performance in resource-constrained underwater environments. One of the efficient approaches to data routing in underwater wireless sensor networks (UWSNs) is clustering, in which data packets are transferred from sensor nodes to a cluster head (CH). Data packets are then forwarded to a sink node in a single-hop or multi-hop manner, which can increase the energy depletion of the CH compared to other nodes. While several mechanisms have been proposed for cluster formation and CH selection to ensure efficient delivery of data packets, less attention has been given to massive data co
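For illustration only, the sketch below shows one generic way a cluster head might be chosen in such a network: scoring each node by residual energy discounted by its distance to the surface sink and electing the highest-scoring node. The topology, energy values, and weighting are assumptions, not the mechanism proposed in the paper.

```python
import math
import random

random.seed(1)
# Synthetic underwater deployment: (x, y, depth) positions and residual energy per node.
nodes = [{"id": i,
          "pos": (random.uniform(0, 100), random.uniform(0, 100), random.uniform(0, 500)),
          "energy": random.uniform(0.5, 1.0)} for i in range(20)]
sink = (50.0, 50.0, 0.0)  # sink at the surface

def score(node):
    d = math.dist(node["pos"], sink)
    return node["energy"] / (1.0 + d / 100.0)   # prefer high residual energy and short haul to sink

cluster_head = max(nodes, key=score)            # elect the highest-scoring node as CH this round
members = [n["id"] for n in nodes if n is not cluster_head]
print("CH:", cluster_head["id"], "members:", members)
```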
In this study, we compare the LASSO and SCAD methods, two penalization methods for dealing with partial quantile regression models. The Nadaraya-Watson kernel was used to estimate the non-parametric part; in addition, the rule-of-thumb method was used to estimate the smoothing bandwidth (h). The penalty methods proved efficient in estimating the regression coefficients, with the SCAD method performing best according to the mean squared error (MSE) criterion after the missing data were estimated using the mean imputation method.
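Two of the ingredients mentioned above, mean imputation of missing values and a Nadaraya-Watson kernel estimate with a rule-of-thumb (Silverman-type) bandwidth, are sketched below on synthetic data; the penalized LASSO/SCAD fit of the parametric part is omitted.

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 10, 100))
y = np.sin(x) + rng.normal(0, 0.2, 100)
y[rng.choice(100, 10, replace=False)] = np.nan       # some responses are missing

y = np.where(np.isnan(y), np.nanmean(y), y)          # mean imputation of the missing data

h = 1.06 * x.std(ddof=1) * len(x) ** (-1 / 5)        # rule-of-thumb smoothing bandwidth

def nw_estimate(x0):
    """Nadaraya-Watson estimate of E[y | x = x0] with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

print(nw_estimate(5.0))
```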
The research aims to identify the possibility of applying environmental fines to commercial shops and restaurants to reduce the environmental pollution represented by the waste they generate. The research sample was divided into two groups: the first comprised (20) commercial shops (meat and slaughter shops, fruits and vegetables, legumes, and accessories) and the second (30) restaurants in the city of Baghdad on both the Karkh and Rusafa sides. The waste was classified into cardboard, plastic, aluminum, glass, paper, cork, and food waste. The study revealed the possibility of applying environmental fines to restaurants and shops to reduce the waste generated from them throughout the year and to apply continuous monitoring