Cloud computing provides vast storage space for data, but as the number of users and the size of their data grow, cloud storage environments face serious problems such as saving storage space, managing this large volume of data, and ensuring data security and privacy. One important method for saving space in cloud storage is data deduplication, a compression technique that keeps only one copy of the data and eliminates redundant copies. To offer security and privacy for sensitive data while supporting deduplication, this work identifies attacks that exploit hybrid-cloud deduplication, allowing an attacker to gain access to other users' files based on very small hash signatures of those files. More specifically, an attacker who knows the hash signature of a file can convince the storage service that he or she owns that file, so the server lets the attacker download the entire file. To overcome such attacks, the hash signature is encrypted with the user's password. As a proof of concept, a prototype of the proposed authorized deduplication scheme was implemented and test-bed experiments were conducted using the prototype. Performance measurements indicate that the proposed deduplication system incurs minimal upload and bandwidth overhead compared to native deduplication.
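The abstract does not give the exact construction, so the following is a minimal sketch, assuming a PBKDF2 password-derived key and an HMAC over the file's hash signature; the function names and parameters are illustrative, not the paper's scheme.

```python
# Minimal sketch (assumed construction, not the paper's exact scheme): bind the
# file's hash signature to the user's password so that knowing the bare hash is
# no longer enough to claim ownership of the file.
import hashlib
import hmac
import os

def ownership_tag(password: str, signature: bytes, salt: bytes) -> bytes:
    """HMAC the hash signature with a key derived from the user's password."""
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.new(key, signature, hashlib.sha256).digest()

# The client stores (salt, tag) with the upload; a later download request must
# present a matching tag, so the bare hash signature alone does not suffice.
salt = os.urandom(16)
signature = hashlib.sha256(b"example file contents").digest()   # the file's hash signature
tag = ownership_tag("user-password", signature, salt)
assert hmac.compare_digest(tag, ownership_tag("user-password", signature, salt))
```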
This paper discusses estimating the two scale parameters of the Exponential-Rayleigh distribution for singly Type-I censored data, one of the most important forms of right-censored data, using the maximum likelihood estimation method (MLEM), one of the most popular and widely used classical methods, based on an iterative procedure such as Newton-Raphson to find estimated values for these two scale parameters. Real COVID-19 data were taken from the Iraqi Ministry of Health and Environment, AL-Karkh General Hospital. The study covered the interval 4/5/2020 until 31/8/2020, equivalent to 120 days, during which the number of patients who entered the hospital under study gave a sample size of (n=785). The number o…
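As a rough sketch of the estimation machinery only: the Exponential-Rayleigh density is not reproduced here, so a plain Rayleigh density stands in, and a single scale parameter is estimated (the paper estimates two jointly). The Type-I censored log-likelihood and the Newton-Raphson update are the parts the sketch is meant to illustrate.

```python
# Hedged sketch of Type-I (right-)censored maximum likelihood via Newton-Raphson.
import numpy as np

def log_pdf(t, s):      # stand-in Rayleigh density; substitute the Exponential-Rayleigh pdf here
    return np.log(t) - 2.0 * np.log(s) - t**2 / (2.0 * s**2)

def log_sf(t, s):       # stand-in survival function (probability of exceeding t)
    return -t**2 / (2.0 * s**2)

def censored_loglik(s, times, censor_time):
    observed = times[times <= censor_time]
    n_censored = np.sum(times > censor_time)
    return log_pdf(observed, s).sum() + n_censored * log_sf(censor_time, s)

def newton_raphson(times, censor_time, s0=1.0, steps=50, h=1e-4):
    s = s0
    f = lambda v: censored_loglik(v, times, censor_time)
    for _ in range(steps):
        grad = (f(s + h) - f(s - h)) / (2.0 * h)            # numerical score
        hess = (f(s + h) - 2.0 * f(s) + f(s - h)) / h**2    # numerical Hessian
        s = s - grad / hess                                 # Newton-Raphson update
    return s

rng = np.random.default_rng(0)
lifetimes = rng.rayleigh(scale=2.0, size=785)       # sample size matching the abstract
print(newton_raphson(lifetimes, censor_time=4.0))   # recovers a scale near 2
```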
Accurate predictive tools for VLE calculation are always needed. A new method is introduced for VLE calculation that is very simple to apply, with very good results compared with previously used methods. It does not require any physical property; each binary system needs only two constants. This method can be applied to calculate VLE data for any binary system of any polarity or from any group family, but the binary system should not form an azeotrope. The new method is extended in application to cover a range of temperatures; this extension requires nothing beyond applying the proposed form with the system's two constants. The method, with its development, is applied to 56 binary mixtures with 1120 equili…
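The paper's new method is not specified in this abstract, so the sketch below uses the classical two-constant Margules activity-coefficient model with modified Raoult's law purely to illustrate what "two constants per binary system" looks like in a VLE calculation; the constants and saturation pressures are hypothetical.

```python
# Illustration only (standard Margules model, not the paper's new method).
import math

def margules_gammas(x1, A12, A21):
    """Activity coefficients of a binary mixture from its two Margules constants."""
    x2 = 1.0 - x1
    g1 = math.exp(x2**2 * (A12 + 2.0 * (A21 - A12) * x1))
    g2 = math.exp(x1**2 * (A21 + 2.0 * (A12 - A21) * x2))
    return g1, g2

def bubble_pressure(x1, A12, A21, p1_sat, p2_sat):
    """Modified Raoult's law: total pressure and vapour-phase composition y1."""
    g1, g2 = margules_gammas(x1, A12, A21)
    p = x1 * g1 * p1_sat + (1.0 - x1) * g2 * p2_sat
    y1 = x1 * g1 * p1_sat / p
    return p, y1

# Hypothetical constants and saturation pressures (kPa) for demonstration.
print(bubble_pressure(x1=0.4, A12=0.6, A21=0.9, p1_sat=100.0, p2_sat=60.0))
```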
This study investigates the impact of spatial-resolution enhancement on supervised classification accuracy using Landsat 9 satellite imagery, achieved through pan-sharpening techniques leveraging Sentinel-2 data. Various methods were employed to synthesize a panchromatic (PAN) band from Sentinel-2 data, including dimension-reduction algorithms and weighted averages based on correlation coefficients and standard deviation. Three pan-sharpening algorithms (Gram-Schmidt, Principal Components Analysis, Nearest Neighbour Diffusion) were employed, and their efficacy was assessed using seven fidelity criteria. Classification tasks were performed using Support Vector Machine and Maximum Likelihood algorithms. Results reveal that specifi…
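A simplified sketch of the weighted-average idea for PAN synthesis: each band is weighted by its correlation with a crude broadband reference. The weighting rule and band set are assumptions for illustration, not the study's exact procedure.

```python
# Assumed, simplified PAN synthesis: weighted average of co-registered bands.
import numpy as np

def synthesize_pan(bands):
    """bands: array of shape (n_bands, rows, cols) of co-registered reflectances."""
    flat = bands.reshape(bands.shape[0], -1)
    reference = flat.mean(axis=0)                      # crude broadband reference
    corr = np.array([np.corrcoef(b, reference)[0, 1] for b in flat])
    weights = corr / corr.sum()                        # normalise weights to sum to 1
    return np.tensordot(weights, bands, axes=1)        # weighted-average PAN band

# Example with random arrays standing in for Sentinel-2 blue/green/red/NIR tiles.
rng = np.random.default_rng(1)
fake_bands = rng.random((4, 256, 256))
print(synthesize_pan(fake_bands).shape)
```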
Spatial data observed on a group of areal units are common in scientific applications. The usual hierarchical approach for modeling this kind of dataset is to introduce a spatial random effect with an autoregressive prior. However, the usual Markov chain Monte Carlo scheme for this hierarchical framework requires the spatial effects to be sampled from their full conditional posteriors one by one, resulting in poor mixing. More importantly, it makes the model computationally inefficient for datasets with a large number of units. In this article, we propose a Bayesian approach that uses the spectral structure of the adjacency matrix to construct a low-rank expansion for modeling spatial dependence. We propose a pair of computationally efficient estimati…
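A minimal sketch of the low-rank idea only: the leading eigenvectors of the adjacency matrix serve as spatial basis functions, so the areal random effect is represented by q coefficients rather than one effect per unit. The article's estimation procedures are not reproduced here.

```python
# Hedged sketch: spectral (eigenvector) basis of the adjacency for a low-rank spatial effect.
import numpy as np

def low_rank_basis(adjacency, q):
    """Return the q eigenvectors of a symmetric adjacency matrix with largest eigenvalues."""
    eigvals, eigvecs = np.linalg.eigh(adjacency)
    order = np.argsort(eigvals)[::-1]
    return eigvecs[:, order[:q]]

# Toy example: a ring of 10 areal units, rank-3 spatial expansion.
n, q = 10, 3
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
Phi = low_rank_basis(A, q)
delta = np.random.default_rng(2).normal(size=q)     # low-dimensional coefficients
spatial_effect = Phi @ delta                        # induced effect on all n units
print(spatial_effect.round(3))
```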
Technology is in continuous and rapid development, which is reflected in all parts of our lives and has entered both scientific and practical fields. Marketing is one of them: the customer's way of choosing and demanding products now differs from the traditional way. Some buying processes are now electronic; therefore, the current research identifies the digital channels that have entered the world of marketing and influenced the activities and types that fall under this name, and how they affect positioning strategy, that is, how to fix the product or brand in the mind of the customer, with the dimensions (brand identity, brand personality, brand communication, brand awareness, brand image). The researcher took t…
The study seeks to apply a data-mining technique, logistic regression, to inherent risk, using financial-ratio and technical-analysis methods and then applying financial fraud indicators. Major scandals have exposed companies, and the failure of the audit process has shocked the community and affected the integrity of the auditor; the reason is financial fraud practiced by companies that goes undiscovered by the auditor. This fraud involves an intentional act aimed at achieving personal gain and harming the interests of others, and it is carried out by management or staff. We can say that all frauds are carried out through the presence of motives and factors that help th…
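A small sketch, not the study's model: a logistic regression mapping a few financial ratios to a fraud-indicator label. The feature names are hypothetical and the data are synthetic, purely to show the fitting and scoring steps.

```python
# Illustrative logistic-regression fraud flag on hypothetical financial ratios.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 200
# Hypothetical ratios: debt-to-equity, receivables growth, gross-margin change.
X = rng.normal(size=(n, 3))
# Synthetic labels for illustration; real labels would come from audit findings.
y = (X @ np.array([1.2, 0.8, -0.5]) + rng.normal(scale=0.5, size=n) > 0).astype(int)

model = LogisticRegression().fit(X, y)
print("coefficients:", model.coef_.round(2))
print("fraud probability of a new firm:",
      model.predict_proba([[0.5, 1.0, -0.2]])[0, 1].round(3))
```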
In this study, Flexible Image Transport System (FITS) data, i.e., X-ray images of objects, were analyzed. Energy is collected from the body by several sensors; each sensor receives energy within a specific range, and once energy has been collected from all sensors, the image is formed, carrying information about that body. The images can be transferred and stored easily. The images were analyzed using the DS9 program to obtain a spectrum for each object, i.e., the energy corresponding to the photons collected per second. This study analyzed images of two types of objects (globular and open clusters). The results showed that the five open star clusters contain roughly t…
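For readers working with the same kind of data programmatically rather than through DS9, a minimal sketch follows; it assumes the astropy package and a local FITS file (hypothetical filename), and it only histograms pixel values as a crude count distribution, not DS9's spectral extraction.

```python
# Hedged sketch: open an X-ray FITS image and histogram its pixel values.
import numpy as np
from astropy.io import fits

def counts_histogram(path, bins=50):
    with fits.open(path) as hdul:
        image = np.asarray(hdul[0].data, dtype=float)   # primary HDU image array
    image = image[np.isfinite(image)]                   # drop blank/NaN pixels
    counts, edges = np.histogram(image, bins=bins)
    return counts, edges

# Usage (hypothetical filename):
# counts, edges = counts_histogram("cluster_observation.fits")
```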