OpenStreetMap (OSM) is the most prominent example of an online volunteered mapping platform. Such platforms provide open-source spatial data collected by non-expert volunteers using a variety of data collection methods. The OSM project aims to provide a free digital map of the whole world. The heterogeneity of data collection methods makes the accuracy of OSM databases unreliable, so they must be treated with caution in any engineering application. This study assesses the horizontal positional accuracy of three spatial data sources: the OSM road network database, a high-resolution Satellite Image (SI), and a high-resolution Aerial Photo (AP) of Baghdad city, with respect to an analogue formal road network dataset obtained from the Mayoralty of Baghdad (MB). The methodology of the U.S. National Standard for Spatial Data Accuracy (NSSDA) was applied to measure the degree of agreement between each data source and the formal MB dataset in terms of horizontal positional accuracy by computing RMSE and NSSDA values. The study concluded that none of the three data sources agrees with the MB dataset in terms of positional accuracy at either study site, AL-Aadhamiyah or AL-Kadhumiyah.
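The RMSE and NSSDA computation described above follows the FGDC standard (FGDC-STD-007.3-1998), in which the horizontal accuracy at the 95% confidence level is 1.7308 times the radial RMSE. A minimal sketch, with an illustrative function name and checkpoint format chosen here for clarity:

```python
import math

def nssda_horizontal_accuracy(checkpoints):
    """Compute radial RMSE and NSSDA horizontal accuracy (95% confidence).

    checkpoints: list of ((x_test, y_test), (x_ref, y_ref)) pairs, where the
    reference coordinates come from the higher-accuracy (formal) dataset.
    Per FGDC-STD-007.3-1998, Accuracy_r = 1.7308 * RMSE_r under the usual
    assumption that RMSE_x and RMSE_y are approximately equal.
    """
    n = len(checkpoints)
    mse_x = sum((xt - xr) ** 2 for (xt, _), (xr, _) in checkpoints) / n
    mse_y = sum((yt - yr) ** 2 for (_, yt), (_, yr) in checkpoints) / n
    rmse_r = math.sqrt(mse_x + mse_y)
    return rmse_r, 1.7308 * rmse_r
```

In practice each checkpoint pairs a well-defined point digitized from the tested source (OSM, SI, or AP) with the same point in the formal MB dataset.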
Contour extraction from two-dimensional echocardiographic images has long been a challenge in digital image processing. This is essentially due to heavy noise, the poor quality of these images, and artifacts such as papillary muscles, intra-cavity structures such as chordae, and valves, all of which can interfere with endocardial border tracking. In this paper, we present a technique to extract the contours of heart boundaries from a sequence of echocardiographic images, starting with pre-processing to reduce noise and improve image quality. By pre-processing the images, unclear edges are avoided, and accurate detection of both the heart boundary and the movement of the heart valves can be obtained.
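A common pre-processing step for the speckle-heavy noise typical of echocardiographic images is median filtering, which suppresses impulse noise while preserving edges better than mean filtering. A minimal pure-Python sketch (the abstract does not name the specific filter used; this is one plausible choice):

```python
def median_filter(image, k=3):
    """Apply a k x k median filter to a 2-D grayscale image (list of lists).

    Border pixels are left unchanged for simplicity; real pipelines usually
    pad the image instead.
    """
    h, w = len(image), len(image[0])
    r = k // 2
    out = [row[:] for row in image]
    for y in range(r, h - r):
        for x in range(r, w - r):
            window = sorted(
                image[yy][xx]
                for yy in range(y - r, y + r + 1)
                for xx in range(x - r, x + r + 1)
            )
            out[y][x] = window[len(window) // 2]
    return out
```

After such smoothing, an edge detector applied to the denoised frame produces far fewer spurious contour fragments around the endocardial border.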
The study aims to demonstrate the scientific benefit that modern electronic programs offer to various scientific research methods, while identifying the positive role these programs play in modernizing the methodologies and logic of scientific thinking, especially given the rapid development of the sciences and their curricula.
These programs link accurately with scientific results. The importance of the study lies in providing practical mechanisms that highlight the scientific contribution of electronic programs at the various steps of scientific research.
A case study was conducted on Tropes version 8.4, which analyzes written, audio, and visual semantic texts and presents a set of statistical results that facilitate the difficult
This study was conducted in the Tissue Culture Laboratory of the Horticulture Department of the Faculty of Agriculture at Karbala University to investigate the effects of light source (fluorescent, LED) and adenine sulfate (Ads) at 0, 40, 80, and 120 mg L-1 on the multiplication and rooting of
Wireless sensor applications are subject to severe energy constraints, and most of the energy is consumed in communication between wireless nodes. Clustering and data aggregation are the two most widely used strategies for reducing energy usage and extending the lifetime of wireless sensor networks. In target tracking applications, a large amount of redundant data is produced regularly; hence, deploying effective data aggregation schemes is vital to eliminate data redundancy. This work conducts a comparative study of research approaches that employ clustering techniques for efficiently aggregating data in target tracking applications, since the choice of clustering algorithm directly affects the quality of data aggregation.
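The basic energy-saving idea behind cluster-based aggregation is that each cluster head combines its members' readings locally, so only one packet per cluster reaches the sink instead of one per node. A minimal sketch, with hypothetical data structures (the surveyed schemes differ in how clusters form and which aggregate is used):

```python
def aggregate_at_cluster_heads(clusters):
    """Average each cluster's member readings at its cluster head.

    clusters: dict mapping a cluster-head id to the list of sensor readings
    reported by its member nodes. Returns one aggregated (mean) reading per
    cluster head, i.e. one transmission to the sink per cluster.
    """
    return {
        head: sum(readings) / len(readings)
        for head, readings in clusters.items()
        if readings  # skip empty clusters rather than divide by zero
    }
```

With n nodes in k clusters, the sink receives k packets per round instead of n, which is where the energy saving in target tracking scenarios comes from.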
Among metaheuristic algorithms, population-based algorithms are explorative search algorithms, superior to local search algorithms in exploring the search space to find globally optimal solutions. However, the primary downside of such algorithms is their low exploitative capability, which prevents searching the neighborhood of a solution for more optimal solutions. The firefly algorithm (FA) is a population-based algorithm that has been widely used in clustering problems. However, FA is limited by premature convergence when no neighborhood search strategies are employed to improve the quality of clustering solutions in the neighborhood region and to explore the global regions of the search space. On the
In light of developments in computer science and modern technologies, the rate of impersonation crime has increased. Consequently, face recognition technology and biometric systems have been employed for security purposes in a variety of applications, including human-computer interaction, surveillance systems, etc. Building an advanced, sophisticated model to tackle impersonation-related crimes is essential. This study proposes classification Machine Learning (ML) and Deep Learning (DL) models utilizing Viola-Jones, Linear Discriminant Analysis (LDA), Mutual Information (MI), and Analysis of Variance (ANOVA) techniques. The two proposed facial classification systems are J48 with the LDA feature extraction method as input, and a one-dimensional
In data mining, classification is a form of data analysis used to extract models describing important data classes. Two well-known algorithms used in data mining classification are the Backpropagation Neural Network (BNN) and Naïve Bayes (NB). This paper investigates the performance of these two classification methods on the Car Evaluation dataset. A model was built for each algorithm and the results were compared. Our experimental results indicated that the BNN classifier yields higher accuracy than the NB classifier, but it is less efficient because it is time-consuming and difficult to analyze due to its black-box nature.
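The NB side of the comparison is simple enough to sketch directly: for categorical features like those in the Car Evaluation dataset, Naïve Bayes just counts value frequencies per class and multiplies the resulting conditional probabilities. A minimal sketch with toy data standing in for the real dataset (no Laplace smoothing, for brevity):

```python
from collections import Counter, defaultdict

def train_nb(rows, labels):
    """Estimate class priors and per-feature conditional value counts."""
    priors = Counter(labels)
    cond = defaultdict(Counter)  # (feature_index, class) -> value counts
    for row, y in zip(rows, labels):
        for i, v in enumerate(row):
            cond[(i, y)][v] += 1
    return priors, cond

def predict_nb(row, priors, cond):
    """Pick the class maximizing P(class) * prod_i P(value_i | class)."""
    n = sum(priors.values())
    best, best_p = None, -1.0
    for y, count in priors.items():
        p = count / n
        for i, v in enumerate(row):
            p *= cond[(i, y)][v] / count
        if p > best_p:
            best, best_p = y, p
    return best
```

The BNN side, by contrast, requires iterative gradient-based training, which is where the time cost and black-box behavior noted above come from.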
The impact of mental training overlap on the development of some closed and open skills in five-a-side football for middle school students, Ayad Ali Hussein, Haidar Abedalameer Habe
In this study, field data were collected from 64 biofilm reactors to analyze the removal of organic matter and nutrients from wastewater through a laboratory-scale nutrient removal process, a moving-bed biofilm process using anaerobic and aerobic units. The moving-bed biofilm reactors operated continuously in a Turbo 4BIO system for BOD and COD removal along with nitrogen and phosphorus. The Barakia plant, designed to serve 200,000 residents, performs biological treatment by merging two processes (the activated sludge process and the Moving Bed Biofilm Reactor, MBBR) with an average wastewater flow of 50,000 m3/day; the data were collected annually from 2017 to 2020. The water samples were analyzed in the central laboratory