Development of Spatial Data Infrastructure based on Free Data Integration

In recent years, the development of Spatial Data Infrastructures for governments and companies has gained considerable attention. Different categories of geospatial data, such as digital maps, coordinates, web maps, and aerial and satellite images, are required to realize the geospatial data components of Spatial Data Infrastructures. In general, two distinct types of geospatial data sources exist over the Internet: formal and informal. Despite the growth of informal geospatial data sources, the integration of different free sources has not been achieved effectively; addressing this gap is the main contribution of this research. This article addresses the research question of how the integration of free geospatial data can benefit domains such as Spatial Data Infrastructures. This was carried out by proposing a common methodology that uses road network information, such as lengths, centroids, start and end points, numbers of nodes, and directions, to integrate free and open-source geospatial datasets. The methodology was applied to a particular case study: the use of geospatial data from OpenStreetMap and Google Earth as examples of free data sources. The results revealed possible matches between the roads of the OpenStreetMap and Google Earth datasets that can serve the development of Spatial Data Infrastructures.
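The matching features named in the abstract (lengths, centroids, start and end points, node counts, directions) can be sketched in code; the tolerance thresholds and the all-features-must-agree rule below are illustrative assumptions, not the authors' published method.

```python
import math

def road_features(coords):
    """Derive the matching features named in the abstract from a road
    polyline given as a list of (x, y) vertices."""
    length = sum(math.dist(a, b) for a, b in zip(coords, coords[1:]))
    cx = sum(p[0] for p in coords) / len(coords)
    cy = sum(p[1] for p in coords) / len(coords)
    start, end = coords[0], coords[-1]
    direction = math.atan2(end[1] - start[1], end[0] - start[0])
    return {"length": length, "centroid": (cx, cy),
            "start": start, "end": end,
            "nodes": len(coords), "direction": direction}

def roads_match(a, b, dist_tol=15.0, len_tol=0.1, ang_tol=0.2):
    """Illustrative matching rule (thresholds are assumptions): centroids
    close, lengths within 10%, and headings within ~11 degrees."""
    fa, fb = road_features(a), road_features(b)
    if math.dist(fa["centroid"], fb["centroid"]) > dist_tol:
        return False
    if abs(fa["length"] - fb["length"]) > len_tol * max(fa["length"], fb["length"]):
        return False
    return abs(fa["direction"] - fb["direction"]) <= ang_tol

# The same road digitized slightly differently in two datasets:
osm_road = [(0, 0), (50, 2), (100, 5)]
ge_road  = [(1, 0), (49, 3), (99, 6)]
print(roads_match(osm_road, ge_road))
```

Matching every candidate pair this way is quadratic in the number of roads; a real pipeline would first bucket roads by a spatial index so only nearby candidates are compared.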

 

Publication Date
Sun Mar 01 2009
Journal Name
Baghdad Science Journal
ON NAIVE TAYLOR MODEL INTEGRATION METHOD

Interval methods for verified integration of initial value problems (IVPs) for ODEs have been used for more than 40 years. For many classes of IVPs, these methods have the ability to compute guaranteed error bounds for the flow of an ODE, where traditional methods provide only approximations to a solution. Overestimation, however, is a potential drawback of verified methods. For some problems, the computed error bounds become overly pessimistic, or integration even breaks down. The dependency problem and the wrapping effect are particular sources of overestimations in interval computations. Berz (see [1]) and his co-workers have developed Taylor model methods, which extend interval arithmetic with symbolic computations. The latter is an ef…
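The overestimation described here can be made concrete with a naive interval sketch; the test problem x' = -x and the step size are assumptions for illustration, and the remainder bounding that makes such methods fully rigorous is omitted.

```python
# Naive interval Euler integration: the enclosure is guaranteed to contain
# the true flow of the initial set, but it widens at every step because x
# appears twice in x + h*(-x) -- the dependency problem the abstract names.
import math

def interval_mul(a, b):
    """Product of intervals a = [a_lo, a_hi] and b = [b_lo, b_hi]."""
    products = [a[0]*b[0], a[0]*b[1], a[1]*b[0], a[1]*b[1]]
    return (min(products), max(products))

def interval_add(a, b):
    return (a[0] + b[0], a[1] + b[1])

def euler_enclosure(x0, h, steps):
    """Interval Euler step for x' = -x: x_{k+1} = x_k + h * (-x_k)."""
    x = x0
    for _ in range(steps):
        x = interval_add(x, interval_mul((h, h), (-x[1], -x[0])))
    return x

lo, hi = euler_enclosure((0.9, 1.1), 0.01, 100)
# The exact flow maps [0.9, 1.1] to [0.9, 1.1] * e^-1, about [0.331, 0.405];
# the computed enclosure contains it but is several times wider.
print(lo, hi)
```

Taylor model methods carry a symbolic polynomial part alongside a small interval remainder precisely to avoid this kind of blow-up.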
Publication Date
Fri Aug 05 2016
Journal Name
Wireless Communications And Mobile Computing
A comparison study on node clustering techniques used in target tracking WSNs for efficient data aggregation

Wireless sensor applications are subject to energy constraints, and most of the energy is consumed in communication between wireless nodes. Clustering and data aggregation are the two most widely used strategies for reducing energy usage and increasing the lifetime of wireless sensor networks. In target tracking applications, a large amount of redundant data is produced regularly; hence, the deployment of effective data aggregation schemes is vital to eliminate this redundancy. This work conducts a comparative study of research approaches that employ clustering techniques for efficiently aggregating data in target tracking applications, as the selection of an appropriate clustering algorithm may reflect positive results in the data aggregati…
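The clustering-plus-aggregation strategy can be sketched as follows; nearest-head assignment and simple averaging are illustrative assumptions standing in for the surveyed algorithms, which choose heads and aggregation functions far more carefully.

```python
# Sketch of cluster-based aggregation in a target-tracking WSN: sensors
# report to their nearest cluster head, and each head forwards a single
# aggregated value instead of every raw reading.
import math

def assign_clusters(nodes, heads):
    """Assign each (position, reading) sensor node to its nearest head."""
    clusters = {h: [] for h in heads}
    for pos, reading in nodes:
        nearest = min(heads, key=lambda h: math.dist(pos, h))
        clusters[nearest].append(reading)
    return clusters

def aggregate(clusters):
    """Average per cluster: this is how aggregation removes the redundant
    readings that nearby sensors produce about the same target."""
    return {h: sum(r) / len(r) for h, r in clusters.items() if r}

# (position, target-distance reading) pairs; values are illustrative.
nodes = [((0, 1), 10.2), ((1, 0), 10.0), ((9, 9), 3.1), ((10, 8), 2.9)]
heads = [(0, 0), (10, 10)]
print(aggregate(assign_clusters(nodes, heads)))
# Two messages reach the sink instead of four.
```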
Publication Date
Mon Aug 01 2022
Journal Name
Baghdad Science Journal
New and Existing Approaches Reviewing of Big Data Analysis with Hadoop Tools

Almost everybody is connected to social media (Facebook, Twitter, LinkedIn, Instagram, etc.), which generate a large quantity of data that traditional applications are inadequate to process. Social media are regarded as an important platform for sharing the information, opinions, and knowledge of many subscribers. These media also raise many Big Data issues, such as data collection, storage, transfer, updating, reviewing, posting, scanning, visualization, and data protection. To deal with all these problems, an adequate system is needed that not only prepares the data but also provides meaningful analysis to take advantage of difficult situations relevant to business, proper decision making, health, social media, sc…
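As a concrete illustration of the Hadoop-style analysis the review concerns, here is the MapReduce programming model in miniature; the hashtag-counting task and the pure-Python, single-process execution are assumptions for illustration, since a real Hadoop job distributes the map and reduce phases across a cluster.

```python
# MapReduce in miniature: mappers emit (key, 1) pairs, the framework
# shuffles them by key, and reducers sum the counts per key.
from collections import defaultdict

def map_phase(post):
    """Emit (hashtag, 1) pairs from one post, like a Hadoop mapper."""
    return [(w.lower(), 1) for w in post.split() if w.startswith("#")]

def reduce_phase(pairs):
    """Sum counts per key, like a Hadoop reducer after the shuffle."""
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

posts = ["Big data on #hadoop", "#Hadoop and #spark", "more #spark"]
shuffled = [pair for post in posts for pair in map_phase(post)]
print(reduce_phase(shuffled))   # {'#hadoop': 2, '#spark': 2}
```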
Publication Date
Fri Jul 01 2016
Journal Name
Journal Of Engineering
Data Base for Dynamic Soil Properties of Seismic Active Zones in Iraq

Iraq is located near the northern tip of the Arabian plate, which is advancing northwards relative to the Eurasian plate, and is, predictably, a tectonically active country. Seismic activity in Iraq increased significantly during the last decade, so structural and geotechnical engineers have been giving increasing attention to the design of buildings for earthquake resistance. Dynamic soil properties play a vital role in the design of structures subjected to seismic loads. The main objective of this study is to prepare a database of the dynamic properties of different soils in seismically active zones in Iraq using the results of cross-hole and down-hole tests. From the collected database it has been observed that the average ve…
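The dynamic properties obtained from cross-hole tests follow standard soil-dynamics relations, sketched below; the borehole spacing, travel time, and density values are illustrative, not entries from the paper's database.

```python
# From cross-hole measurements to dynamic soil properties: the S-wave
# travel time between boreholes gives the shear wave velocity Vs, and the
# small-strain shear modulus follows as G_max = rho * Vs^2.
def shear_wave_velocity(spacing_m, travel_time_s):
    """Cross-hole test: Vs = borehole spacing / S-wave travel time."""
    return spacing_m / travel_time_s

def shear_modulus(density_kg_m3, vs_m_s):
    """Small-strain shear modulus G_max = rho * Vs^2, in Pa."""
    return density_kg_m3 * vs_m_s ** 2

vs = shear_wave_velocity(3.0, 0.015)   # 3 m borehole spacing, 15 ms travel time
g_max = shear_modulus(1800.0, vs)      # assumed soil density of 1800 kg/m3
print(vs, g_max / 1e6)                 # 200.0 m/s, 72.0 MPa
```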
Publication Date
Wed Jul 06 2022
Journal Name
Journal Of Asian Multicultural Research For Social Sciences Study
Remote Data Auditing in a Cloud Computing Environment

In current information technology paradigms, cloud computing is the most essential kind of computing service. It satisfies the needs of high-volume customers, provides flexible computing capabilities for a range of applications such as database archiving and business analytics, and supplies the extra computing resources that provide financial value for cloud providers. The purpose of this investigation is to assess the viability of auditing data remotely within a cloud computing environment. The theory behind cloud computing and distributed storage systems is discussed, as well as the method of remote data auditing. This research considers how to safeguard data that is outsourced and stored in cloud serv…
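The remote auditing idea can be sketched with the simplest keyed-digest challenge-response; real auditing protocols (e.g. provable data possession schemes) sample random file blocks so the verifier need not read the whole file, which this minimal example omits.

```python
# Core of remote data auditing: the owner keeps a keyed tag computed before
# outsourcing; a later audit asks whether the server's copy still matches.
import hashlib, hmac, os

def make_tag(key, data):
    """Owner computes a keyed digest before outsourcing the data."""
    return hmac.new(key, data, hashlib.sha256).hexdigest()

def audit(key, stored_data, expected_tag):
    """Verifier recomputes the tag over what the server actually stores."""
    return hmac.compare_digest(make_tag(key, stored_data), expected_tag)

key = os.urandom(32)
original = b"outsourced ledger records"
tag = make_tag(key, original)                # retained by the data owner

print(audit(key, original, tag))             # True: cloud copy intact
print(audit(key, b"tampered records", tag))  # False: modification detected
```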
Publication Date
Sat Sep 08 2018
Journal Name
Proceedings Of The 2018 International Conference On Computing And Big Data
3D Parallel Coordinates for Multidimensional Data Cube Exploration

Visual analytics has become an important approach for discovering patterns in big data. As visualization struggles with the high dimensionality of data, issues such as a concept hierarchy on each dimension add more difficulty and make visualization a prohibitive task. A data cube offers multi-perspective aggregated views of large data sets and has important applications in business and many other areas. It has high dimensionality, concept hierarchies, a vast number of cells, and comes with special exploration operations such as roll-up, drill-down, slicing, and dicing. All these issues make data cubes very difficult to explore visually. Most existing approaches visualize a data cube in 2D space and require preprocessing steps. In this paper, we propose a visu…
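The cube operations named above (roll-up, slicing) can be sketched on a toy fact table; the dimensions and figures below are invented for illustration and are unrelated to the paper's proposal.

```python
# A tiny data cube over (year, city, product) -> sales. Roll-up aggregates
# away dimensions; slicing fixes one dimension's value before aggregating.
from collections import defaultdict

facts = [
    (2017, "Baghdad", "A", 10), (2017, "Baghdad", "B", 5),
    (2017, "Basra",   "A", 7),  (2018, "Baghdad", "A", 12),
]

def roll_up(facts, keep):
    """Sum sales over the dimensions not listed in `keep`
    (dimension order: year=0, city=1, product=2)."""
    cube = defaultdict(int)
    for *dims, sales in facts:
        cube[tuple(dims[i] for i in keep)] += sales
    return dict(cube)

def slice_cube(facts, dim, value):
    """Keep only the facts where dimension `dim` equals `value`."""
    return [f for f in facts if f[dim] == value]

print(roll_up(facts, keep=[0]))                            # totals per year
print(roll_up(slice_cube(facts, 1, "Baghdad"), keep=[2]))  # Baghdad, by product
```

Drill-down is the inverse of roll-up: re-aggregating with more dimensions in `keep` restores the finer view.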
Publication Date
Sat Dec 01 2018
Journal Name
Journal Of Hydrology
Complementary data-intelligence model for river flow simulation

Publication Date
Thu Aug 01 2019
Journal Name
Journal Of Economics And Administrative Sciences
Some NONPARAMETRIC ESTIMATORS FOR RIGHT CENSORED SURVIVAL DATA

The use of parametric models and their estimation methods requires many primary conditions to be met for those models to represent the population under study adequately. This has prompted researchers to search for more flexible models, namely nonparametric ones, and many researchers are interested in studying the survival (permanence) function and its estimation by such nonparametric methods.

For the purpose of statistical inference about the parameters of the statistical distribution of lifetimes with censored data, the experimental section of this thesis compares nonparametric estimators of the survival function, the existence…
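The abstract does not name the estimators compared; as a concrete example of a nonparametric estimator of the survival function under right censoring, here is the standard Kaplan-Meier product-limit estimator, with made-up data.

```python
# Kaplan-Meier: at each observed death time t, multiply the running survival
# estimate by (1 - deaths(t) / at_risk(t)). Censored subjects leave the risk
# set without contributing a death, which is how censoring is handled.
def kaplan_meier(times, events):
    """times: observed times; events: 1 = death observed, 0 = right-censored.
    Returns [(t, S(t))] at each observed death time."""
    s, curve = 1.0, []
    death_times = sorted(set(t for t, e in zip(times, events) if e == 1))
    for t in death_times:
        at_risk = sum(1 for ti in times if ti >= t)
        deaths = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        s *= 1 - deaths / at_risk
        curve.append((t, s))
    return curve

# 6 subjects; event = 0 marks a right-censored observation.
times  = [2, 3, 3, 5, 7, 8]
events = [1, 1, 0, 1, 0, 1]
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))
```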
Publication Date
Sat Dec 01 2012
Journal Name
Journal Of Economics And Administrative Sciences
Using panel data in structural equations with application

A non-stationary series is a persistent problem in statistical analysis: as some theoretical work has explained, the statistical properties of regression analysis are lost when non-stationary series are used, and the slope of a spurious relationship appears between the variables under consideration. A non-stationary series can be made stationary by adding a time variable to the multivariate analysis to remove the general trend, together with seasonal dummy variables to remove seasonal effects; by converting the data to exponential or logarithmic form; or by applying the difference operator repeatedly, d times, in which case the series is said to be integrated of order d. The theoretical side of the research is contained in parts; in the first part, the research methodology ha…
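The differencing idea (integration of order d) can be illustrated directly; the trended series below is synthetic, chosen only to show that one difference removes a linear trend.

```python
# A series with a linear trend is non-stationary, but its first difference
# is constant, so the series is integrated of order 1 (d = 1).
def difference(series, d=1):
    """Apply the difference operator d times: y_t -> y_t - y_{t-1}."""
    for _ in range(d):
        series = [b - a for a, b in zip(series, series[1:])]
    return series

trended = [3 + 2 * t for t in range(8)]   # deterministic upward trend
print(difference(trended, d=1))           # [2, 2, 2, 2, 2, 2, 2]: trend removed
```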
Publication Date
Fri Apr 12 2019
Journal Name
Journal Of Economics And Administrative Sciences
Accounting Mining Data Using Neural Networks (Case study)

Business organizations have faced many challenges in recent times, the most important of which is information technology, because it is widespread and easy to use. Its use has led to an unprecedented increase in the amount of data that business organizations deal with. The amount of data available through the internet is a problem for which many parties seek solutions: why is it available there in such a huge, random quantity? Many forecasts estimated that by 2017 the number of devices connected to the internet would be three times the population of the Earth, and that in 2015 more than one and a half billion gigabytes of data were transferred every minute globally. Thus, the so-called field of data mining emerged as a…