Cloud computing provides vast storage space for data, but as the number of users and the size of their data grow, cloud storage environments face serious problems: saving storage space, managing this large volume of data, and preserving the security and privacy of the data. One of the most important methods for saving space in cloud storage is data deduplication, a compression technique that keeps only one copy of each piece of data and eliminates redundant copies. To offer security and privacy for sensitive data while still supporting deduplication, this work identifies attacks that exploit hybrid-cloud deduplication, allowing an attacker to gain access to the files of other users using only very small hash signatures of those files. More specifically, an attacker who knows the hash signature of a file can convince the storage service that he or she owns that file, so the server lets the attacker download the entire file. To defeat such attacks, the hash signature is encrypted with the user's password. As a proof of concept, a prototype of the proposed authorized deduplication scheme was implemented and test-bed experiments were conducted with it. Performance measurements indicate that the proposed deduplication system incurs minimal upload and bandwidth overhead compared with native deduplication.
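A minimal sketch of the countermeasure described above, under stated assumptions: the abstract does not name the primitives, so PBKDF2 and an HMAC-SHA-256 keyed tag are assumed here (a keyed MAC stands in for the "encrypted hash signature"; the paper's exact construction is not specified). The point is that the file's hash signature is bound to a password-derived key, so knowing the bare hash alone no longer proves ownership.

```python
import hashlib
import hmac

def ownership_tag(password: bytes, salt: bytes, file_bytes: bytes) -> bytes:
    """Bind the file's hash signature to the user's password.

    A plain SHA-256 digest would let anyone who learns the digest claim
    ownership of the file; keying the digest with a password-derived key
    means the server's check fails for an attacker who knows only the hash.
    """
    # Derive a per-user key from the password (parameters are illustrative).
    key = hashlib.pbkdf2_hmac("sha256", password, salt, iterations=200_000)
    # Hash the file content, then authenticate the digest with the key.
    digest = hashlib.sha256(file_bytes).digest()
    return hmac.new(key, digest, hashlib.sha256).digest()

# The server stores the tag at upload time and re-checks it on any later
# "I already own this file" claim (values below are illustrative).
tag = ownership_tag(b"user-password", b"per-user-salt", b"file contents")
claim = ownership_tag(b"user-password", b"per-user-salt", b"file contents")
assert hmac.compare_digest(tag, claim)
```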
This research studies fuzzy sets, one of the most modern concepts applied in various practical and theoretical areas and in many fields of life. It addresses the fuzzy random variable, whose values are not real numbers but fuzzy numbers, because it expresses vague or uncertain phenomena whose measurements are not exact. Fuzzy data are presented for a two-way test and for the analysis-of-variance method for fuzzy random variables; this method depends on a number of assumptions, which is a problem that prevents its use when those assumptions are not satisfied.
The great scientific progress has led to a widespread growth of information. As information accumulates in large databases, it becomes important to revise and organize this vast amount of data in order to extract hidden information, or to classify the data according to their relations with each other, so that it can be exploited for technical purposes. Data mining (DM) is well suited to this area, hence the importance of studying the K-Means algorithm for clustering data in an applied setting, where the effect on the variables can be observed by changing the sample size (n) and the number of clusters (K).
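As a concrete illustration of the experiment described above, the sketch below runs plain K-Means over a synthetic sample while varying n and K; the data and the parameter grid are invented for illustration and are not taken from the study.

```python
import numpy as np

def kmeans(points: np.ndarray, k: int, iters: int = 100, seed: int = 0):
    """Plain K-Means: alternate assignment and centroid-update steps."""
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute each centroid as the mean of its assigned points.
        new = np.array([points[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

# Vary the sample size n and the number of clusters K, as in the study design.
for n in (100, 500):
    for k in (2, 3, 5):
        sample = np.random.default_rng(1).normal(size=(n, 2))
        labels, cents = kmeans(sample, k)
        print(n, k, cents.round(2))
```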
Several stress-strain models have been used to predict the distinctive strength properties of steel fiber reinforced concrete. However, insufficient research has been done on the influence of hybrid fiber combinations (comprising two or more distinct fibers) on the characteristics of concrete. For this reason, the researchers conducted an experimental program to determine the stress-strain relationship of 30 concrete samples reinforced with two distinct fibers (a hybrid of polyvinyl alcohol and steel fibers), with compressive strengths ranging from 40 to 120 MPa. A total of 80% of the experimental results were used to develop a new empirical stress-strain model, which was accomplished through the application of the parti
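To make the model-calibration step concrete, here is a hedged sketch of fitting an empirical stress-strain curve on an 80%/20% split of measurements. The abstract's actual optimization method is truncated above and is not reproduced; ordinary least-squares fitting via curve_fit is used here instead, and the Popovics-type curve form, the synthetic data, and all parameter values are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def popovics(eps, f_c, eps_c, n):
    """Popovics-type stress-strain curve (an assumed empirical form)."""
    x = eps / eps_c
    return f_c * x * n / (n - 1 + x ** n)

# Synthetic stand-in for measured strain/stress pairs (illustrative only).
rng = np.random.default_rng(0)
eps = np.linspace(0.0005, 0.006, 50)
stress = popovics(eps, 80.0, 0.0025, 2.5) + rng.normal(0, 1.0, eps.size)

# 80% of the data calibrates the model; the remaining 20% checks it.
idx = rng.permutation(eps.size)
train, test = idx[:40], idx[40:]
params, _ = curve_fit(popovics, eps[train], stress[train], p0=(70.0, 0.002, 2.0))
rmse = np.sqrt(np.mean((popovics(eps[test], *params) - stress[test]) ** 2))
print(params, rmse)
```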
In this research, the mechanism of crack propagation in epoxy/chopped carbon fiber composites has been investigated. Carbon fibers (5%, 10%, 15%, and 20% by weight) were used to reinforce the epoxy resin. A bending test was carried out to evaluate the flexural strength in order to explain the mechanism of crack propagation. It was found that the flexural strength increases with increasing weight percentage of carbon fibers. At low stresses, the cracks start at the lower surface of the specimen; increasing the stresses accelerates the cracks until fracture occurs. The path of the cracks changes according to the distribution of the carbon fibers.
Deconstructionism opened the door wide to multiple readings and restored to the reader the authority he lost under modernism, making him more able to decipher the plastic discourse by reconstructing it according to what he wants, or to what the plastic discourse offers him of possibilities beyond consumerism; the author has thus been cancelled. The problem of the current research is limited to the following question: does deconstructionism in postmodern arts have a role in teaching artistic taste to the learner? The aim of the current research is to reveal the mechanisms of deconstruction in postmodern arts and their role in teaching artistic taste to the learner. As for the theoretical framework, the first section focu
With the increasing integration of computers and smartphones into our daily lives, and the numerous benefits this offers over traditional paper-based methods of conducting affairs, it has become necessary to bring one of the most essential institutions into this integration, namely colleges. The traditional approach to conducting affairs in colleges is mostly paper-based, which increases time and workload and is relatively decentralized. This project provides educational and management services for the university environment, targeting the staff, the student body, and the lecturers, on two of the most used platforms: smartphones and reliable web applications by clo
In this work, a study and calculation of the normal approach between two bodies, a sphere and a rough flat surface, was conducted with the aid of an image-processing technique. Four kinds of metals with different work-hardening indices were used as surface specimens, and by capturing images at a resolution of 0.006565 mm/pixel a good estimate of the normal approach was obtained. The compression tests were done in the strength-of-materials laboratory of the Mechanical Engineering Department, and a Monsanto tensometer was used to conduct the indentation tests. A light-section measuring microscope (BK 70x50) was used to calculate the surface texture parameters of the profile, such as the standard deviation of asperity peak heights and the centre line average.
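As a hedged illustration of the measurement pipeline, the sketch below converts pixel displacements to millimetres using the stated 0.006565 mm/pixel resolution and computes the two texture parameters named above from a sampled profile; the profile data and the function names are invented for illustration.

```python
import numpy as np

MM_PER_PIXEL = 0.006565  # image resolution quoted in the abstract

def normal_approach_mm(ref_px: float, loaded_px: float) -> float:
    """Normal approach: displacement between the reference and loaded
    images, converted from pixels to millimetres."""
    return abs(loaded_px - ref_px) * MM_PER_PIXEL

def texture_parameters(profile_mm: np.ndarray):
    """Standard deviation of asperity peak heights and centre line
    average (CLA) of a measured surface profile."""
    mean_line = profile_mm.mean()
    # Asperity peaks: samples higher than both of their neighbours.
    interior = profile_mm[1:-1]
    peaks = interior[(interior > profile_mm[:-2]) & (interior > profile_mm[2:])]
    sigma_peaks = peaks.std(ddof=1)
    cla = np.abs(profile_mm - mean_line).mean()
    return sigma_peaks, cla

# Illustrative profile heights (mm), standing in for microscope readings.
rng = np.random.default_rng(2)
profile = 0.01 * np.sin(np.linspace(0, 20, 200)) + rng.normal(0, 0.002, 200)
print(normal_approach_mm(412.0, 397.5), texture_parameters(profile))
```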
Abstract
For sparse system identification, recently suggested algorithms are the l0-norm Least Mean Square (l0-LMS), Zero-Attracting LMS (ZA-LMS), Reweighted Zero-Attracting LMS (RZA-LMS), and p-norm LMS (p-LMS) algorithms, which modify the cost function of the conventional LMS algorithm by adding a constraint that enforces sparsity of the coefficients. Accordingly, the proposed algorithms are named p-ZA-LMS,
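To make the sparsity constraint concrete, here is a minimal sketch of the standard ZA-LMS coefficient update, which adds a zero-attracting term rho*sign(w) to the conventional LMS step; the step size, attractor strength, filter length, and test system below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def za_lms(x, d, taps=16, mu=0.01, rho=5e-4):
    """Zero-Attracting LMS: conventional LMS plus a sign(w) shrinkage
    term that pulls small coefficients toward zero (sparsity constraint)."""
    w = np.zeros(taps)
    for n in range(taps - 1, len(x)):
        u = x[n - taps + 1:n + 1][::-1]     # most recent input samples
        e = d[n] - w @ u                    # estimation error
        w += mu * e * u - rho * np.sign(w)  # LMS step + zero attractor
    return w

# Identify a sparse 16-tap system from noisy observations (illustrative).
rng = np.random.default_rng(0)
h = np.zeros(16)
h[[2, 9]] = [1.0, -0.5]                     # sparse true system
x = rng.normal(size=5000)
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.normal(size=len(x))
print(np.round(za_lms(x, d), 2))            # should recover h closely
```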