OpenStreetMap (OSM), valued for its current and readily accessible spatial database, frequently serves regions lacking precise data at the necessary granularity. Global collaboration among OSM contributors poses challenges to data quality and uniformity, exacerbated by the sheer volume of input and indistinct data annotation protocols. This study presents a methodological improvement in the spatial accuracy of OSM datasets centred on Baghdad, Iraq, using data derived from OSM services and satellite imagery. The analysis focuses on two geometric correction methods: a two-dimensional polynomial affine transformation, which involves twelve adjustment coefficients, and a two-dimensional polynomial conformal transformation, which involves six. Analysis within the selected region exposed variances in positional accuracy, with distinctions evident between Easting (E) and Northing (N) coordinates. Empirical results indicated a Root Mean Square Error (RMSE) of 4.434 meters for the OSM data corrected with the conformal transformation, while the affine transformation reduced the total RMSE further, to 4.053 meters. The deployment of these techniques substantiates a marked enhancement in the geometric fidelity of OSM data. The refined datasets have significant applications, extending to the representation of roadmaps, the analysis of traffic flow, and the facilitation of urban planning initiatives.
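As a rough illustration of the correction step, the sketch below fits a twelve-coefficient second-order polynomial model to control points with NumPy least squares and reports the RMSE in Easting and Northing before and after the fit. The control-point values and the distortion applied to them are assumptions for demonstration, not the Baghdad data used in the study.

```python
import numpy as np

# Synthetic control points: (x, y) are OSM coordinates, (E, N) are the
# reference coordinates derived from satellite imagery (illustrative only).
rng = np.random.default_rng(42)
x = rng.uniform(0, 5000, 25)
y = rng.uniform(0, 5000, 25)
E = x + 4.5 + 1e-4 * x - 2e-8 * x * y + rng.normal(0, 0.5, 25)
N = y - 3.0 - 5e-5 * y + 1e-8 * x**2 + rng.normal(0, 0.5, 25)

# Second-order polynomial model: six coefficients per axis, twelve in total,
# estimated by least squares against the control points.
A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
coef_E, *_ = np.linalg.lstsq(A, E, rcond=None)
coef_N, *_ = np.linalg.lstsq(A, N, rcond=None)

def rmse(pred, obs):
    return float(np.sqrt(np.mean((pred - obs) ** 2)))

print("RMSE before correction: E =", round(rmse(x, E), 3), "N =", round(rmse(y, N), 3))
print("RMSE after correction:  E =", round(rmse(A @ coef_E, E), 3), "N =", round(rmse(A @ coef_N, N), 3))
```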
The role and procedures of accountability in spending units have a significant impact on improving budget outcomes. By studying and comparing the budget implementation results of the research sample over two consecutive years, it was found that the same deviations and violations of laws, regulations, and instructions recurred in the implementation results, such as exceeding appropriations, low implementation rates, and the failure to disburse financial allocations despite their presence in the budget. Although such remarks are consistently noted in oversight reports, the same observations keep reappearing in the budget implementation results, which clearly reflects the absence of the role of accountability.
The research aims to introduce international valuation standards and to identify the relationship between international valuation standards and international accounting and financial reporting standards in enhancing the quality of financial reporting (appropriate accounting information). Statistical models were used to measure the appropriateness (relevance) of accounting information and to test the hypothesis put forward by the research; accordingly, the Francis and Kothari models were applied to measure the appropriateness of accounting information (i.e., its quality). The conclusions reached by the two researchers indicate that the sett
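For readers unfamiliar with this kind of test, the sketch below runs a generic value-relevance regression of stock returns on scaled earnings, where a higher R² is read as more relevant (higher-quality) accounting information. It is only an illustration with synthetic figures and a generic specification, not the exact Francis or Kothari model used in the research.

```python
import numpy as np

# Synthetic firm-year observations, for illustration only.
rng = np.random.default_rng(1)
earnings_per_price = rng.normal(0.08, 0.05, 100)            # E_it / P_{it-1}
returns = 0.02 + 1.5 * earnings_per_price + rng.normal(0, 0.1, 100)

# OLS regression: returns on earnings scaled by beginning-of-period price.
X = np.column_stack([np.ones_like(earnings_per_price), earnings_per_price])
beta, *_ = np.linalg.lstsq(X, returns, rcond=None)
fitted = X @ beta
ss_res = np.sum((returns - fitted) ** 2)
ss_tot = np.sum((returns - returns.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot   # higher R^2 -> more value-relevant earnings

print(f"earnings response coefficient = {beta[1]:.3f}, R^2 = {r_squared:.3f}")
```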
The research involved a rapid, automated, and highly accurate CFIA/MZ technique developed for the estimation of phenylephrine hydrochloride (PHE) in pure form, dosage forms, and biological samples. The method is based on the oxidative coupling reaction of 2,4-dinitrophenylhydrazine (DNPH) with PHE in the presence of sodium periodate as an oxidizing agent in an alkaline medium to form a red-coloured product at λmax (520 nm). At a flow rate of 4.3 mL·min⁻¹ using distilled water as a carrier, the FIA method proved to be a sensitive and economical analytical tool for the estimation of PHE.
Within the concentration range of 5-300 μg·mL⁻¹, the calibration curve was rectilinear, with a detection limit of 3.252 μg·mL⁻¹.
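As an illustration of how such a calibration is typically evaluated, the sketch below fits a straight line to hypothetical absorbance readings at 520 nm and estimates a detection limit with the common 3.3·σ/slope convention. The data points are assumed values, not the reported measurements.

```python
import numpy as np

# Hypothetical calibration standards (ug/mL) and absorbance at 520 nm.
conc = np.array([5, 25, 50, 100, 150, 200, 250, 300], dtype=float)
absorbance = np.array([0.021, 0.105, 0.212, 0.418, 0.631, 0.842, 1.050, 1.263])

# Linear least-squares fit: A = slope * C + intercept
slope, intercept = np.polyfit(conc, absorbance, 1)
residuals = absorbance - (slope * conc + intercept)
sigma = residuals.std(ddof=2)          # standard error of the fit

# Detection limit from the 3.3 * sigma / slope convention
lod = 3.3 * sigma / slope
print(f"slope = {slope:.5f} AU per ug/mL, LOD ~ {lod:.2f} ug/mL")
```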
The research aimed to measure the alignment of Big Data with the dimensions of organizational ambidexterity at the Asiacell mobile telecommunications company in Iraq, in order to determine the possibility of adopting the Big Data triad as an approach to achieving organizational ambidexterity.
The study adopted the descriptive analytical approach to collect and analyse data gathered with a questionnaire developed on a Likert scale after a comprehensive review of the literature related to the two basic study dimensions. The data were then subjected to several statistical treatments.
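One common statistical treatment for Likert-scale questionnaire data is an internal-consistency check; the sketch below computes Cronbach's alpha on synthetic responses, purely as an illustration of this kind of treatment (the study does not specify which tests it applied).

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents x items matrix of Likert scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Synthetic 5-point Likert responses: 120 respondents x 8 items, illustrative only.
rng = np.random.default_rng(7)
latent = rng.normal(0, 1, (120, 1))                      # shared attitude factor
scores = np.clip(np.round(3 + latent + rng.normal(0, 0.8, (120, 8))), 1, 5)
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```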
The world is witnessing huge developments that give organizations and administrative units the opportunity to use information and communication technology and to adopt it in administrative work, owing to its importance for accomplishing work with greater efficiency and speed and for easing communication with individuals and companies through various Internet-based means of communication. The research therefore studied electronic systems designed and adopted for building a database for archiving data, which is the main method used by organizations and administrative units in developed countries, where such a system works to convert documents and manual processes.
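As a minimal sketch of the archiving idea, assuming a simple metadata-plus-checksum design (the table and column names are illustrative, not those of the system studied):

```python
import hashlib
import sqlite3
from datetime import datetime, timezone

# In-memory database standing in for the archive store.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE archive (
        doc_id      INTEGER PRIMARY KEY AUTOINCREMENT,
        title       TEXT NOT NULL,
        department  TEXT NOT NULL,
        archived_at TEXT NOT NULL,
        sha256      TEXT NOT NULL,
        content     BLOB NOT NULL
    )
""")

def archive_document(title: str, department: str, content: bytes) -> int:
    """Store a scanned or electronic document with its metadata and checksum."""
    cur = conn.execute(
        "INSERT INTO archive (title, department, archived_at, sha256, content) "
        "VALUES (?, ?, ?, ?, ?)",
        (title, department, datetime.now(timezone.utc).isoformat(),
         hashlib.sha256(content).hexdigest(), content),
    )
    return cur.lastrowid

doc_id = archive_document("Incoming memo 2024/17", "Finance", b"%PDF-1.7 ...")
row = conn.execute("SELECT title, department FROM archive WHERE doc_id = ?",
                   (doc_id,)).fetchone()
print(doc_id, row)
```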
A skip list is essentially a probabilistic simulation of a binary search tree. Skip list algorithms are simpler and faster and use less space. Conceptually, the data structure uses parallel sorted linked lists. The search procedure is more involved than that of a regular sorted linked list: because a skip list is a two-dimensional data structure, it is implemented as a two-dimensional network of nodes, each with four pointers. The search, insert, and delete operations each take O(log n) expected time. The skip list can be modified to support the order-statistic operations RANK and SEARCH BY RANK while maintaining the same expected time. Keywords: skip list, parallel linked list, randomized algorithm, rank.
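A minimal sketch of a skip list with expected O(log n) search and insert is shown below. It uses the conventional forward-pointer-array formulation rather than the four-pointer grid described in the abstract, and the level cap and promotion probability are arbitrary illustrative choices.

```python
import random

class _Node:
    """A skip-list node holding a key and one forward pointer per level."""
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)

class SkipList:
    """Minimal skip list: search and insert in expected O(log n) time."""
    MAX_LEVEL = 16
    P = 0.5  # probability of promoting a node to the next level

    def __init__(self):
        self.level = 0
        self.head = _Node(None, self.MAX_LEVEL)

    def _random_level(self):
        lvl = 0
        while random.random() < self.P and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def search(self, key):
        node = self.head
        # Walk right while the next key is smaller, then drop down a level.
        for i in range(self.level, -1, -1):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node is not None and node.key == key

    def insert(self, key):
        update = [self.head] * (self.MAX_LEVEL + 1)
        node = self.head
        for i in range(self.level, -1, -1):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = _Node(key, lvl)
        for i in range(lvl + 1):
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new

if __name__ == "__main__":
    sl = SkipList()
    for k in [30, 10, 50, 20, 40]:
        sl.insert(k)
    print(sl.search(20), sl.search(25))  # True False
```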
Cloud computing provides a huge amount of storage space, but as the number of users and the size of their data grow, the cloud storage environment faces serious problems such as saving storage space, managing this large volume of data, and ensuring data security and privacy. One of the important methods for saving space in cloud storage is data deduplication, a compression technique that keeps only one copy of the data and eliminates the extra copies. To offer security and privacy for sensitive data while still supporting deduplication, this work identifies attacks that exploit hybrid cloud deduplication, in which an attacker can gain access to other users' files based only on very small hash signatures of those files.
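To make the deduplication mechanism and the weakness concrete, here is a minimal sketch of hash-indexed, single-copy storage in which a bare content hash is accepted as proof of possession. The class and method names are hypothetical, and the example is not the scheme analysed in the work.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Content fingerprint used as the deduplication key."""
    return hashlib.sha256(data).hexdigest()

class DedupStore:
    """Keeps a single copy of each unique blob, indexed by its hash."""
    def __init__(self):
        self._blobs = {}    # fingerprint -> blob
        self._owners = {}   # fingerprint -> set of user ids

    def upload(self, user: str, data: bytes) -> str:
        fp = fingerprint(data)
        if fp not in self._blobs:
            self._blobs[fp] = data        # only the first copy is stored
        # later uploads of identical content just add an owner reference
        self._owners.setdefault(fp, set()).add(user)
        return fp

    def claim_by_hash(self, user: str, fp: str) -> bool:
        """Naive client-side dedup: a bare hash is accepted as proof of
        possession -- exactly the kind of weakness such attacks exploit."""
        if fp in self._blobs:
            self._owners[fp].add(user)
            return True
        return False

store = DedupStore()
fp = store.upload("alice", b"confidential report")
# An attacker who learns only the short hash can claim the whole file:
print(store.claim_by_hash("mallory", fp))  # True
```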
Solar photovoltaic (PV) systems have emerged as one of the most promising technologies for generating clean energy. In this work, the performance of a monocrystalline silicon photovoltaic module is studied by observing the effect of the key parameters: solar irradiation and ambient temperature. The single-diode model with series resistance is selected to characterize the current-voltage (I-V) and power-voltage (P-V) curves by determining the values of its five parameters (photocurrent, diode saturation current, ideality factor, series resistance, and shunt resistance). This model shows high accuracy in modeling the solar PV module under various weather conditions. The modeling is simulated using MATLAB/Simulink software. The performance of the selected solar PV module is also tested experimentally under different weather conditions.
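A minimal numerical sketch of that characterization is given below: it solves the implicit single-diode equation I = I_ph - I_0(exp((V + I·R_s)/V_t) - 1) - (V + I·R_s)/R_sh for each voltage with SciPy's brentq root finder and locates the maximum power point. The module parameters are assumed round-number values for illustration, not those extracted in the study (which used MATLAB/Simulink).

```python
import numpy as np
from scipy.optimize import brentq

# Illustrative single-diode parameters (assumed values, not from the paper)
I_PH = 8.2      # photocurrent [A]
I_0  = 1e-9     # diode saturation current [A]
N    = 1.3      # ideality factor
R_S  = 0.35     # series resistance [ohm]
R_SH = 300.0    # shunt resistance [ohm]
N_S  = 60       # cells in series
K, Q, T = 1.380649e-23, 1.602176634e-19, 298.15
VT = N_S * N * K * T / Q   # modified thermal voltage for the whole module

def current(v):
    """Solve I = Iph - I0*(exp((V + I*Rs)/Vt) - 1) - (V + I*Rs)/Rsh for I."""
    f = lambda i: I_PH - I_0 * (np.exp((v + i * R_S) / VT) - 1.0) \
                  - (v + i * R_S) / R_SH - i
    return brentq(f, -1.0, I_PH + 1.0)

voltages = np.linspace(0.0, 45.0, 200)
currents = np.array([max(current(v), 0.0) for v in voltages])  # I-V curve
powers = voltages * currents                                   # P-V curve
print(f"Estimated maximum power point: {powers.max():.1f} W")
```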