Recently, a new secure steganography algorithm has been proposed: the Block Permutation Image Steganography (BPIS) algorithm. The algorithm consists of five main steps: convert the secret message to a binary sequence, divide the binary sequence into blocks, permute each block using a key-based randomly generated permutation, concatenate the permuted blocks to form a permuted binary sequence, and then use a plane-based Least-Significant-Bit (LSB) approach to embed the permuted binary sequence into a BMP image file. The algorithm's performance was given a preliminary evaluation by estimating the PSNR (Peak Signal-to-Noise Ratio) of the stego image over a limited number of experiments that involved hiding text files of various sizes in BMP images. This paper presents a deeper performance evaluation; in particular, it evaluates the effects of permutation length and occupation ratio on stego image quality and steganography processing time. Furthermore, it evaluates the algorithm's performance when concealing different types of secret media, such as MS Office file formats, image files, PDF files, executable files, and compressed files.
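The five steps above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the key-to-seed mapping, the fixed block size, and the single-channel LSB embedding are all simplifying assumptions, and a real BPIS implementation would operate on the colour planes of a BMP file.

```python
import random

def permute_bits(bits, block_size, key):
    """Permute each block of bits with a key-seeded random permutation."""
    rng = random.Random(key)  # assumption: key is used directly as PRNG seed
    out = []
    for i in range(0, len(bits), block_size):
        block = bits[i:i + block_size]
        perm = list(range(len(block)))
        rng.shuffle(perm)
        out.extend(block[j] for j in perm)  # position d receives block[perm[d]]
    return out

def unpermute_bits(bits, block_size, key):
    """Invert permute_bits by regenerating the same permutations."""
    rng = random.Random(key)
    out = []
    for i in range(0, len(bits), block_size):
        block = bits[i:i + block_size]
        perm = list(range(len(block)))
        rng.shuffle(perm)
        inv = [0] * len(block)
        for dst, src in enumerate(perm):
            inv[src] = block[dst]
        out.extend(inv)
    return out

def embed_lsb(pixels, bits):
    """Write each bit into the least significant bit of one pixel value."""
    stego = list(pixels)
    for i, b in enumerate(bits):
        stego[i] = (stego[i] & ~1) | b
    return stego

def extract_lsb(pixels, n):
    """Read back the first n least significant bits."""
    return [p & 1 for p in pixels[:n]]
```

A round trip (permute, embed, extract, unpermute) with the same key recovers the original bit sequence; without the key, the extracted bits remain block-permuted.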
The fields of the image and its kinetic signs constitute a semantic presence for sign-based communication and a widening of the dialectical bond between signifiers and their signifieds, a bond the directorial vision employs to produce concealed significations whose transitional essence travels through ideas as the givens of the performance. Visual encoding seeks to project a duality of meaning within the multiple fields of the theatrical performance; to understand the meaning that emerges from these visual encodings, the need arose to study how these encodings are formed and how …
Digital image manipulation has become increasingly prevalent due to the widespread availability of sophisticated image editing tools. In copy-move forgery, a portion of an image is copied and pasted into another area of the same image. The proposed methodology begins with extracting the image's Local Binary Pattern (LBP) features. Two main statistical functions, Standard Deviation (STD) and Angular Second Moment (ASM), are computed for each LBP feature, capturing additional statistical information about the local textures. Next, a multi-level LBP feature selection is applied to select the most relevant features. This process involves performing the LBP computation at multiple scales or levels, capturing textures at different scales …
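As a concrete illustration of these two steps, the sketch below computes a basic 3×3 LBP map and then the STD and ASM statistics over it. The clockwise neighbour ordering and the global (rather than per-feature) statistics are simplifying assumptions, not the paper's exact pipeline.

```python
def lbp_image(img):
    """3x3 LBP: set a bit for each of the 8 neighbours >= the centre pixel."""
    h, w = len(img), len(img[0])
    # neighbour offsets, clockwise from the top-left
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    out = []
    for y in range(1, h - 1):
        row = []
        for x in range(1, w - 1):
            c = img[y][x]
            code = 0
            for bit, (dy, dx) in enumerate(offs):
                if img[y + dy][x + dx] >= c:
                    code |= 1 << bit
            row.append(code)
        out.append(row)
    return out

def std_asm(codes):
    """Standard deviation and angular second moment of an LBP code map."""
    flat = [v for row in codes for v in row]
    n = len(flat)
    mean = sum(flat) / n
    std = (sum((v - mean) ** 2 for v in flat) / n) ** 0.5
    # ASM (energy): sum of squared normalized histogram bins
    hist = {}
    for v in flat:
        hist[v] = hist.get(v, 0) + 1
    asm = sum((c / n) ** 2 for c in hist.values())
    return std, asm
```

On a perfectly uniform patch every code is 255, so STD is 0 and ASM is 1.0; textured regions push STD up and ASM down, which is what makes the pair useful as a compact texture descriptor.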
With the increasing rate of unauthorized access and attacks, the security of confidential data is of utmost importance. Cryptography only encrypts the data; because the communication takes place in the presence of third parties, the encrypted text can still be attacked, decrypted, or destroyed. Steganography, on the other hand, hides the confidential data in a cover source so that the very existence of the data is concealed, which does not arouse suspicion about the communication taking place between the two parties. This paper presents a method for transferring secret data embedded in a master file (cover image) to obtain a new image (stego image) that is practically indistinguishable from the original image, so that no one other than the intended us…
Haplotype association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, instead of individual variants, with the disease. Such an analysis typically involves a list of unphased multiple-locus genotypes with potentially sparse frequencies in cases and controls. It starts with inferring haplotypes from genotypes, followed by haplotype co-classification and marginal screening for disease-associated haplotypes. Unfortunately, phasing uncertainty may have a strong effect on the haplotype co-classification and therefore on the accuracy of predicting risk haplotypes. Here, to address the issue, we propose an alternative approach: in Stage 1, we select potential risk genotypes inste…
In this research, an analysis of the behaviour of the standard Hueckel edge detection algorithm is presented, using three-dimensional representations of the edge goodness criterion after applying the algorithm to a real high-texture satellite image; the edge goodness criterion is analysed statistically. The experiments showed that the execution time of the Hueckel algorithm grows exponentially with the disk radius used. The restrictions Hueckel stated in his papers are adopted in this research. A discussion of the resulting edge shape and malformation is presented, since this is the first practical study applying the Hueckel edge detection algorithm to a real high-texture image containing ramp edges (a satellite image).
The objective of the research is to identify the level of supervisory performance of the educational supervisor from the point of view of headmasters at secondary schools. The problem was the need to evaluate this performance. A sample of (97) school headmasters, representing (38%) of the total population, was chosen to collect the needed data. The researcher designed a questionnaire consisting of (43) items across five areas. The results showed a good level of performance among supervisors; there were no significant differences attributable to the certificate variable, while there were significant differences by gender in favour of males. The research concluded with a number of recommendations and suggestions.
There are many techniques that can be used to estimate spray quality traits such as spray coverage, droplet density, droplet count, and droplet diameter. One of the most common is to use water-sensitive papers (WSP) as spray collectors under field conditions and analyze them using software. However, some droplets may merge after they deposit on the WSP, and this can affect the accuracy of the results. In this research, an image processing technique was used to better estimate the spray traits and to overcome the problem of droplet merger. The droplets were classified as non-merged and merged based on their roundness; the merged droplets were then separated based on the average non-m…
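The roundness-based classification can be sketched with the standard circularity measure 4πA/P², which equals 1.0 for a perfect circle and falls toward 0 for irregular (merged) blobs. The threshold value and the area-ratio split of merged blobs below are illustrative assumptions, not the paper's calibrated parameters.

```python
import math

def roundness(area, perimeter):
    """Circularity: 4*pi*A / P**2, equal to 1.0 for a perfect circle."""
    return 4 * math.pi * area / (perimeter ** 2)

def classify_droplets(droplets, threshold=0.8):
    """Split (area, perimeter) blobs into single and merged by roundness.

    threshold=0.8 is a hypothetical cut-off for illustration.
    """
    single, merged = [], []
    for a, p in droplets:
        (single if roundness(a, p) >= threshold else merged).append((a, p))
    return single, merged

def split_merged(merged, avg_single_area):
    """Estimate how many droplets each merged blob contains, from the
    average area of the non-merged droplets."""
    return [max(1, round(a / avg_single_area)) for a, _ in merged]
```

A circular blob of radius 10 (area ≈ 314.2, perimeter ≈ 62.8) scores exactly 1.0 and is kept as a single droplet, while an elongated blob of area 100 and perimeter 60 scores about 0.35 and is sent to the merged class for separation.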
Cloud storage provides scalable and low-cost resources featuring economies of scale based on a cross-user architecture. As the amount of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. The most important cloud service is data storage. To protect the data owner's privacy, data are stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication, which is crucial for data storage. Traditional deduplication schemes cannot work on encrypted data, and existing solutions for encrypted data deduplication suffer from security weaknesses. This paper proposes a combined compressive sensing and video deduplication approach to maximize …
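The abstract does not detail its deduplication scheme. A common baseline for the core problem it names, deduplicating data that is stored encrypted, is convergent encryption, where the key is derived from the content itself so that identical plaintexts always produce identical ciphertexts. The sketch below illustrates only that baseline idea; the XOR keystream is a toy stand-in for a real cipher such as AES.

```python
import hashlib

def convergent_key(data: bytes) -> bytes:
    # key derived from the content: identical data -> identical key
    return hashlib.sha256(data).digest()

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    # toy keystream cipher for illustration only (also decrypts)
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def store(store_map, data: bytes):
    """Store encrypted data; return (dedup tag, was_duplicate)."""
    ct = xor_encrypt(data, convergent_key(data))
    tag = hashlib.sha256(ct).hexdigest()  # dedup tag over the ciphertext
    duplicate = tag in store_map
    store_map[tag] = ct
    return tag, duplicate
```

Because the key depends only on the content, two users uploading the same file produce the same ciphertext and the server keeps a single copy, which is exactly why traditional per-user random keys defeat deduplication.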