Recently, new concepts such as free data and Volunteered Geographic Information (VGI) have emerged from Web 2.0 technologies. OpenStreetMap (OSM) is one of the most representative projects of this trend. Geospatial data from different sources often have variable accuracy levels due to different data collection methods; therefore, the main concern with OSM is its unknown quality. This study aims to develop a specific tool, written in the Matlab programming language, that analyzes and assesses the matching of OSM road features against a reference dataset. The tool was applied to two study areas in Iraq (Baghdad and Karbala) in order to verify whether the OSM data has the same quality in both areas. The program consists of three parts for assessing OSM data accuracy: input data, measurement and analysis, and output of results. The output of the Matlab program is represented as graphs. These graphs show the number of roads falling within successive intervals, such as every half meter or one meter for length and every half degree for direction. For both case studies, the comparison placed the largest number of roads in the first interval, indicating that the differences between the compared datasets were small. The results showed that the Baghdad case study was more accurate than that of holy Karbala.
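The interval counts behind such graphs can be sketched as follows. This is a minimal illustration with hypothetical length differences, not the paper's data or code (the original tool was written in Matlab; Python is used here only for brevity):

```python
# Illustrative sketch (hypothetical data): binning the length differences
# between matched OSM and reference roads into half-metre intervals,
# as in the graphs described above.
def bin_differences(diffs, width=0.5):
    counts = {}
    for d in diffs:
        k = int(abs(d) // width)      # interval index: 0 -> [0, 0.5), 1 -> [0.5, 1.0), ...
        counts[k] = counts.get(k, 0) + 1
    return counts

diffs = [0.1, 0.3, 0.4, 0.7, 1.2, 0.2]    # metres, hypothetical
print(bin_differences(diffs))              # most roads fall in the first interval
```

A large count in interval 0 relative to later intervals is exactly the pattern the study reports for well-matching datasets.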
In the literature, several correlations have been proposed for bubble size prediction in bubble columns. However, these correlations fail to predict bubble diameter over a wide range of conditions. Based on a data bank of around 230 measurements collected from the open literature, a correlation for bubble sizes in the homogeneous region of bubble columns was derived using Artificial Neural Network (ANN) modeling. The bubble diameter was found to be a function of six parameters: gas velocity, column diameter, orifice diameter, liquid density, liquid viscosity, and liquid surface tension. Statistical analysis showed that the proposed correlation has an Average Absolute Relative Error (AARE) of 7.3% and a correlation coefficient of 92.2%.
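As a rough illustration of the error metric quoted above, the following sketch computes AARE in its commonly used form (mean of |predicted − measured| / measured); the data values are hypothetical, not from the paper's data bank:

```python
# Illustrative sketch: Average Absolute Relative Error (AARE) as it is
# typically defined, applied to hypothetical bubble-diameter data (metres).
def aare(measured, predicted):
    """Average Absolute Relative Error, returned as a fraction."""
    return sum(abs(p - m) / m for m, p in zip(measured, predicted)) / len(measured)

measured = [0.004, 0.005, 0.006]      # hypothetical measured diameters
predicted = [0.0042, 0.0049, 0.0063]  # hypothetical correlation output
print(round(aare(measured, predicted) * 100, 1))  # AARE in percent
```

An AARE of 7.3% as reported would mean the correlation's predictions deviate from the measurements by about 7.3% on average.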
Among the many problems that reduce the performance of a network, especially a Wide Area Network, congestion is one of the most important. It occurs when traffic demand reaches or exceeds the available capacity of a route, resulting in blocking and lower throughput per unit time. Congestion management techniques attempt to handle such cases. The work presented in this paper deals with an important issue, namely Quality of Service (QoS) techniques. QoS is the combined effect on service level that determines the user's degree of satisfaction with the service. In this paper, the packet schedulers FIFO, WFQ, CQ, and PQ were implemented and evaluated under different applications with different priorities. The results show that the WFQ scheduler gives acceptable results.
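The core idea of the WFQ scheduler compared above can be sketched briefly. In this illustration (flow names, packet sizes, and weights are hypothetical, and this is not the paper's implementation), each packet is stamped with a virtual finish time of size/weight after its flow's previous packet, and packets are served in finish-time order:

```python
import heapq

# Minimal Weighted Fair Queuing sketch: packets from a higher-weight flow
# accumulate virtual finish time more slowly, so they are served sooner.
def wfq_order(packets, weights):
    finish = {}   # last virtual finish time per flow
    heap = []
    for seq, (flow, size) in enumerate(packets):
        start = finish.get(flow, 0.0)
        finish[flow] = start + size / weights[flow]
        heapq.heappush(heap, (finish[flow], seq, flow, size))
    return [(flow, size) for _, _, flow, size in sorted(heap)]

# Hypothetical traffic: small "voice" packets (weight 3) vs bulk "data" (weight 1).
packets = [("data", 1500), ("voice", 200), ("data", 1500), ("voice", 200)]
order = wfq_order(packets, {"voice": 3.0, "data": 1.0})
print(order)  # -> [('voice', 200), ('voice', 200), ('data', 1500), ('data', 1500)]
```

Under FIFO the two large data packets would be sent first; WFQ's weighting lets the latency-sensitive flow jump ahead, which is consistent with WFQ performing acceptably under mixed-priority applications.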
This study uses an Artificial Neural Network (ANN) to examine the constitutive relationships of the residual tensile strength of Glass Fiber Reinforced Polymer (GFRP) at elevated temperatures. The objective is to develop an effective model and establish fire performance criteria for concrete structures in fire scenarios. Multilayer networks that employ reactive error distribution approaches can determine the residual tensile strength of GFRP from six input parameters, in contrast to previous mathematical models that used only one or two inputs and disregarded the others. Multilayered networks employing reactive error distribution technology assign a weight to each variable influencing the residual tensile strength of GFRP.
A novel artificial neural network (ANN) model was constructed to calibrate a multivariate model for the simultaneous quantitative analysis of a quaternary mixture composed of carbamazepine, carvedilol, diazepam, and furosemide. Eighty-four mixing formulas were prepared and analyzed spectrophotometrically. Each analyte was formulated in six samples at different concentrations; thus twenty-four samples for the four analytes were tested. A neural network with 10 hidden neurons was able to fit the data with 100% accuracy. The suggested model can be applied to the quantitative chemical analysis of the proposed quaternary mixture.
In this research, the Artificial Neural Network (ANN) technique was applied to study the filtration process in water treatment. Eight models were developed and tested using data from a pilot filtration plant operated under different process design criteria: influent turbidity, bed depth, grain size, filtration rate, and running time (length of the filtration run), with effluent turbidity and head losses recorded. The ANN models were constructed to predict different performance criteria of the filtration process: effluent turbidity, head losses, and running time. The results indicate that it is quite possible to use artificial neural networks to predict effluent turbidity, head losses, and running time in the filtration process.
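The structure of such a model can be sketched as follows. This is only a structural illustration of a one-hidden-layer feed-forward network of the kind described; the weights here are random, not the trained model from the study, and the input values are hypothetical:

```python
import math
import random

# Structural sketch: five process inputs (influent turbidity, bed depth,
# grain size, filtration rate, running time) -> one predicted output
# such as effluent turbidity. Weights are random placeholders.
random.seed(0)

N_IN, N_HIDDEN = 5, 8
w1 = [[random.gauss(0, 1) for _ in range(N_IN)] for _ in range(N_HIDDEN)]
b1 = [0.0] * N_HIDDEN
w2 = [random.gauss(0, 1) for _ in range(N_HIDDEN)]
b2 = 0.0

def forward(x):
    # hidden layer with tanh activation, then a linear output unit
    hidden = [math.tanh(sum(xi * wi for xi, wi in zip(x, row)) + b)
              for row, b in zip(w1, b1)]
    return sum(h * w for h, w in zip(hidden, w2)) + b2

x = [5.0, 1.2, 0.6, 6.0, 10.0]   # hypothetical (already scaled) inputs
y = forward(x)
print(isinstance(y, float))       # a single predicted value
```

In practice the weights would be fitted to the pilot-plant data, and separate networks (or output units) would be trained for each of the three performance criteria.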
This paper explores the main VANET topics: architecture, characteristics, security, routing protocols, applications, simulators, and 5G integration. We update, edit, and summarize some of the published data as we analyze each notion. For ease of comprehension and clarity, we present part of the data as tables and figures. This survey also raises issues for potential future research, such as how to integrate VANET with a 5G cellular network and how to use trust mechanisms to enhance security, scalability, effectiveness, and other VANET features and services. In short, this review may aid academics and developers in choosing the key VANET characteristics for their objectives in a single document.
The traditional centralized network management approach presents severe efficiency and scalability limitations in large-scale networks. The process of data collection and analysis typically involves huge transfers of management data to the manager, which consumes considerable network throughput and creates bottlenecks at the manager side. These problems are addressed using agent technology as a solution that distributes the management functionality over the network elements. The proposed system consists of a server agent working together with client agents to monitor the logging on and off of the client computers and which user is working on each of them. A file-system watcher mechanism is used to detect any change in files.
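A file-change detector of the kind the client agents rely on can be sketched as follows. The abstract does not specify how the watcher is implemented; this modification-time polling approach is only one simple way to illustrate the idea, and the file names used are hypothetical:

```python
import os
import tempfile

# Minimal file-system watcher sketch: compare directory snapshots of
# modification times to report created, deleted, and modified files.
def snapshot(path):
    return {name: os.stat(os.path.join(path, name)).st_mtime
            for name in os.listdir(path)}

def diff(before, after):
    created = [n for n in after if n not in before]
    deleted = [n for n in before if n not in after]
    modified = [n for n in after if n in before and after[n] != before[n]]
    return created, deleted, modified

# Demo on a temporary directory with a hypothetical log file.
root = tempfile.mkdtemp()
path = os.path.join(root, "log.txt")
with open(path, "w") as f:
    f.write("user logged on")
before = snapshot(root)
os.utime(path, (0, 0))            # simulate a later modification
after = snapshot(root)
print(diff(before, after))        # -> ([], [], ['log.txt'])
```

A real agent would take such snapshots periodically (or subscribe to OS change notifications) and forward only the diffs to the server agent, which is what keeps management traffic low compared with the centralized approach.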
The investigation of signature validation is crucial to the field of personal authentication. Biometrics-based systems have been developed to support information security features. A person's signature, an essential biometric trait of a human being, can be used to verify their identity. In this study, a mechanism for automatically verifying signatures is suggested. The study highlights the offline properties of handwritten signatures and aims to verify whether handwritten signatures are genuine or forged using computer-based machine learning techniques. The main goal of developing such systems is to verify people through the validity of their signatures.
Image Fusion Using A Convolutional Neural Network