State-of-the-Art in Data Integrity and Privacy-Preserving in Cloud Computing

Cloud computing (CC) is a fast-growing technology that offers computing, networking, and storage services that can be accessed and used over the internet. Cloud services save users money because they are pay-per-use, and they save time because they are on-demand and elastic, a distinctive aspect of cloud computing. However, several security issues must be addressed before users store data in the cloud. Because users have no direct control over data outsourced to the cloud, particularly personal and sensitive data (health, finance, military, etc.), and do not know where the data is stored, they must be able to verify that the cloud stores and maintains the outsourced data appropriately. The study's primary goals are to make cloud and data security challenges more understandable, to briefly explain the techniques used to achieve privacy and data integrity, to compare various recent studies in both the pre-quantum and post-quantum settings, and to highlight current gaps in solving privacy and data integrity issues.
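One building block behind many of the integrity techniques surveyed in such work is a cryptographic digest that the data owner keeps locally and later checks against the outsourced copy. A minimal sketch in Python using SHA-256; real remote-integrity schemes (e.g., PDP/PoR) are far more elaborate, and the function names and example record below are illustrative only:

```python
import hashlib

def integrity_tag(data: bytes) -> str:
    """Digest the owner stores locally before outsourcing the data."""
    return hashlib.sha256(data).hexdigest()

def verify(retrieved: bytes, stored_tag: str) -> bool:
    """Re-digest the retrieved copy and compare with the local tag."""
    return hashlib.sha256(retrieved).hexdigest() == stored_tag

original = b"patient-record: blood type O+, allergy: penicillin"
tag = integrity_tag(original)                 # kept by the data owner
tampered = original.replace(b"O+", b"AB")     # simulated cloud-side change
print(verify(original, tag))    # True: outsourced copy intact
print(verify(tampered, tag))    # False: modification detected
```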

Publication Date
Thu Oct 01 2015
Journal Name
Journal Of Engineering
Development of Spatial Data Infrastructure based on Free Data Integration

In recent years, the performance of Spatial Data Infrastructures for governments and companies is a task that has gained ample attention. Different categories of geospatial data, such as digital maps, coordinates, web maps, and aerial and satellite images, are required to realize the geospatial data components of Spatial Data Infrastructures. In general, two distinct types of geospatial data sources exist over the Internet: formal and informal data sources. Despite the growth of informal geospatial data sources, the integration of different free sources has not been achieved effectively. Addressing this task can be considered the main contribution of this research. This article addresses the research question of ho…

Publication Date
Tue Dec 01 2015
Journal Name
Journal Of Engineering
Data Aggregation in Wireless Sensor Networks Using Modified Voronoi Fuzzy Clustering Algorithm

Data-centric techniques, such as data aggregation via a modified fuzzy clustering algorithm combined with a Voronoi diagram, called the modified Voronoi Fuzzy Clustering Algorithm (VFCA), are presented in this paper. In the modified algorithm, the sensed area is divided into a number of Voronoi cells by applying a Voronoi diagram; these cells are then clustered by the fuzzy C-means method (FCM) to reduce the transmission distance. An appropriate cluster head (CH) is then elected for each cluster. Three parameters are used in this election process: the energy, the distance between the CH and its neighboring sensors, and the packet loss values. Furthermore, data aggregation is employed in each CH to reduce the amount of data transmission, which le…
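The cluster-head election described in the abstract can be sketched as a weighted score over the three parameters. A toy illustration in Python; the weights, the `Node` fields, and the sample values are assumptions for illustration, not figures from the paper:

```python
from dataclasses import dataclass

@dataclass
class Node:
    node_id: int
    energy: float             # residual energy (J)
    avg_neighbor_dist: float  # mean distance to cluster members (m)
    packet_loss: float        # observed loss rate in [0, 1]

def elect_cluster_head(nodes, w_e=0.5, w_d=0.3, w_l=0.2):
    """Favor high residual energy, short distances, and low packet loss."""
    def score(n):
        return w_e * n.energy - w_d * n.avg_neighbor_dist - w_l * n.packet_loss * 100
    return max(nodes, key=score)

cluster = [
    Node(1, energy=4.0, avg_neighbor_dist=12.0, packet_loss=0.05),
    Node(2, energy=4.8, avg_neighbor_dist=10.0, packet_loss=0.02),
    Node(3, energy=2.1, avg_neighbor_dist=8.0,  packet_loss=0.01),
]
ch = elect_cluster_head(cluster)
print(ch.node_id)  # node 2: the best energy/distance/loss trade-off
```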

Publication Date
Sun Aug 30 2020
Journal Name
Journal Of Economics And Administrative Sciences
Financial Innovation as an Entrance to Sustainable Financing: A Case Study of Islamic Banks in the State of Qatar (2014-2018)

While the impact of the Fourth Industrial Revolution on the economy keeps accelerating, signs of the Fifth Industrial Revolution, whose key is innovation and creativity, have started to emerge. However, the challenge of achieving sustainable development and its goals still faces global organizations. In this situation, Islamic banks are exposed to many challenges, among which is keeping abreast of the latest developments in modern technology, which in turn is a tool for continuity and competition, while avoiding the negative impact that these changes can have, such as a widening gap between financial innovations and the requirements of sustainable development. Islamic banks in the…

Publication Date
Thu Nov 30 2023
Journal Name
Iraqi Geological Journal
Inverting Gravity Data to Density and Velocity Models for Selected Area in Southwestern Iraq

The gravity method measures relatively small but noticeable variations in the Earth's gravitational field caused by lateral variations in rock density. In the current research, a new technique is applied to the previous Bouguer map of the gravity surveys (conducted 1940-1950) of the last century, by selecting certain areas in the southwestern desert of Iraqi territory within the administrative boundaries of the Najaf and Anbar provinces. The approach depends on the theory of gravity inversion, in which gravity values can be related to density-contrast variations with depth; thus, gravity data inversion can be utilized to calculate density and velocity models from four selected depth slices: 9.63 km, 1.1 km, 0.682 km, and 0.407 km.
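As a rough illustration of the physics behind such an inversion, a Bouguer-slab approximation relates a gravity anomaly to a density contrast, and an empirical relation such as Gardner's rule can then map density to seismic velocity. The anomaly, slab thickness, and rock density below are hypothetical, not values from the study:

```python
import math

G = 6.674e-11   # gravitational constant (m^3 kg^-1 s^-2)
MGAL = 1e-5     # 1 mGal expressed in m/s^2

def slab_density_contrast(anomaly_mgal, thickness_m):
    """Infinite Bouguer slab: dg = 2*pi*G*rho*t  ->  rho in kg/m^3."""
    return anomaly_mgal * MGAL / (2 * math.pi * G * thickness_m)

def gardner_velocity(density_g_cc):
    """Gardner's empirical rule rho = 0.31 * V**0.25 (rho in g/cc, V in m/s)."""
    return (density_g_cc / 0.31) ** 4

rho = slab_density_contrast(10.0, 1000.0)  # 10 mGal anomaly over a 1 km slab
v = gardner_velocity(2.4)                  # velocity for a 2.4 g/cc rock
print(round(rho, 1), round(v))             # ~239 kg/m^3, ~3.6 km/s
```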

Publication Date
Tue Feb 28 2023
Journal Name
Iraqi Journal Of Science
Properties of Ground-State of 17,19,20,24,26F using the Wave Functions of Harmonic-Oscillator and Spherical Hankel Functions

The nuclear size radii, density distributions, and elastic electron scattering charge form factors for the fluorine isotopes 17,19,20,24,26F were studied using the radial wave functions (WF) of the harmonic-oscillator (HO) potential and a free mean field described by spherical Hankel functions (SHF) for the core and valence parts, respectively, for all the aforementioned isotopes. The parameters of the HO potential (the size parameter) and the SHF were chosen to reproduce the available experimental size radii. It was found that using spherical Hankel functions improved the calculated quantities in comparison with empirical data.
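For the harmonic-oscillator part, the mean-square radius of a single orbital has the closed form &lt;r^2&gt;_nl = b^2 (2n + l + 3/2), where b is the size parameter and n = 0, 1, ... is the radial quantum number. A small sketch; the b value used here is illustrative, not the one fitted in the paper:

```python
import math

def ho_rms_radius(b_fm, n, l):
    """RMS radius (fm) of a 3D harmonic-oscillator orbital: b*sqrt(2n+l+3/2)."""
    return b_fm * math.sqrt(2 * n + l + 1.5)

b = 1.7  # illustrative size parameter in fm
for n, l, name in [(0, 0, "1s"), (0, 1, "1p"), (1, 0, "2s")]:
    print(f"{name}: {ho_rms_radius(b, n, l):.3f} fm")
```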

Publication Date
Mon Aug 01 2016
Journal Name
Journal Of Economics And Administrative Sciences
Use of (K-Means) for clustering in Data Mining with application

Great scientific progress has led to widespread information, as data accumulate in large databases. It is therefore important to revise and compile this vast amount of data, with the aim of extracting hidden information or classifying data according to their relations with each other in order to take advantage of them for technical purposes.

Data mining (DM) is appropriate in this area, and this research applies the (K-Means) algorithm for clustering data, where the effect on the results can be observed by changing the sample size (n) and the number of clusters (K).
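A minimal K-Means in pure Python illustrates the procedure: assign each point to its nearest centroid, recompute the centroids, and repeat. The two synthetic 2-D blobs below are illustrative only:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Lloyd's K-Means on 2-D points; returns the final centroids."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x, y in points:
            # assign each point to the nearest centroid (squared distance)
            i = min(range(k),
                    key=lambda c: (x - centroids[c][0]) ** 2 +
                                  (y - centroids[c][1]) ** 2)
            clusters[i].append((x, y))
        # recompute each centroid as the mean of its cluster
        centroids = [
            (sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
            if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids

rng = random.Random(1)
blob_a = [(rng.gauss(0, 0.5), rng.gauss(0, 0.5)) for _ in range(50)]
blob_b = [(rng.gauss(10, 0.5), rng.gauss(10, 0.5)) for _ in range(50)]
cents = sorted(kmeans(blob_a + blob_b, k=2))
print(cents)  # one centroid near (0, 0), the other near (10, 10)
```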

Publication Date
Fri Apr 01 2016
Journal Name
Gis Research Uk 24th Annual Conference
Comparing Open Source Map Data in Areas Lacking Authoritative Mapping

One wide-ranging category of open source data is that provided by geospatial information web sites. Despite the advantages of such open source data, including ease of access and cost-free availability, its quality is a potential issue. This article tests the horizontal positional accuracy and possible integration of four web-derived geospatial datasets: OpenStreetMap (OSM), Google Maps, Google Earth, and Wikimapia. The evaluation was achieved by comparing the tested information with reference field survey data for fifty road intersections in Baghdad, Iraq. The results indicate that the free geospatial data can be used to enhance authoritative maps, especially small-scale maps.

Publication Date
Wed Oct 17 2018
Journal Name
Journal Of Economics And Administrative Sciences
New Robust Estimation in Compound Exponential Weibull-Poisson Distribution for both contaminated and non-contaminated Data

The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (non-contaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
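The Downhill Simplex (Nelder-Mead) algorithm compared in the study is derivative-free, which makes it convenient for likelihood surfaces that are awkward to differentiate. A compact sketch fitting a plain two-parameter Weibull by maximum likelihood; the compound exponential Weibull-Poisson density itself is not reproduced here, so the Weibull stands in purely for illustration:

```python
import math, random

def weibull_nll(params, data):
    """Negative log-likelihood of Weibull(shape k, scale lam)."""
    k, lam = params
    if k <= 0 or lam <= 0:
        return float("inf")   # keep the simplex inside the valid region
    return -sum(math.log(k / lam) + (k - 1) * math.log(x / lam)
                - (x / lam) ** k for x in data)

def nelder_mead(f, x0, step=0.5, iters=300):
    """Minimal Downhill Simplex: reflect, expand, contract, shrink."""
    n = len(x0)
    simplex = [list(x0)] + [
        [x0[j] + (step if j == i else 0.0) for j in range(n)] for i in range(n)
    ]
    for _ in range(iters):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        cen = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        refl = [2 * cen[i] - worst[i] for i in range(n)]
        if f(refl) < f(best):                       # try expanding further
            exp = [3 * cen[i] - 2 * worst[i] for i in range(n)]
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):              # accept the reflection
            simplex[-1] = refl
        else:                                       # contract toward centroid
            con = [(cen[i] + worst[i]) / 2 for i in range(n)]
            if f(con) < f(worst):
                simplex[-1] = con
            else:                                   # shrink toward best point
                simplex = [best] + [[(p[i] + best[i]) / 2 for i in range(n)]
                                    for p in simplex[1:]]
    return min(simplex, key=f)

random.seed(42)
sample = [random.weibullvariate(1.5, 2.0) for _ in range(500)]  # scale, shape
k_hat, lam_hat = nelder_mead(lambda p: weibull_nll(p, sample), [1.0, 1.0])
print(round(k_hat, 2), round(lam_hat, 2))  # close to shape 2.0, scale 1.5
```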

Publication Date
Wed Jul 05 2017
Journal Name
Neural Computing And Applications
Hybrid soft computing approach for determining water quality indicator: Euphrates River
Publication Date
Thu Oct 01 2020
Journal Name
Ieee Transactions On Very Large Scale Integration (vlsi) Systems
Low-Power, Highly Reliable Dynamic Thermal Management by Exploiting Approximate Computing

With the continuous downscaling of semiconductor processes, the growing power density and thermal issues in multicore processors become more and more challenging; thus, reliable dynamic thermal management (DTM) is required to prevent severe degradation of system performance. The accuracy of the thermal profile delivered to the DTM manager plays a critical role in the efficiency and reliability of DTM: different sources of noise and variation in deep submicron (DSM) technologies severely affect the thermal data, which can lead to significant degradation of DTM performance. In this article, we propose a novel fault-tolerance scheme that exploits approximate computing to mitigate the DSM effects on DTM efficiency. Approximate computing in hardw…
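A common fault-tolerance primitive in this setting is to combine redundant (possibly approximate) sensor readings with a median, which masks a single corrupted sample where a plain mean would not. A toy sketch, not the paper's scheme; the readings are invented:

```python
import statistics

def robust_temperature(readings):
    """Median of redundant thermal-sensor samples masks outlier faults."""
    return statistics.median(readings)

samples = [71.0, 70.5, 150.0, 70.8, 71.2]   # one sample hit by a DSM fault
print(robust_temperature(samples))           # 71.0: fault is masked
print(round(statistics.mean(samples), 1))    # 86.7: mean is skewed by fault
```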
