Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively, so it makes sense to pursue research on algorithms that exploit the network most efficiently. It is also important to consider security, since the transmitted data is vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same set of data, performing the two operations simultaneously. This is achieved by embedding encryption into the compression algorithm, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. In the secure compression module, the given text is first preprocessed and transformed into an intermediate form that can be compressed with better efficiency and security. This addresses a drawback of common encryption methods, which generally manipulate an entire data set and tend to make the transfer of information more costly in terms of time and sometimes bandwidth.
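The compress-then-encrypt ordering described above can be illustrated with a minimal sketch. This is not the paper's module: it uses `zlib` for the compression stage and a hash-counter XOR keystream as a stand-in cipher, and the function names (`compress_then_encrypt`, `decrypt_then_decompress`, `keystream`) are hypothetical. Note that compressing first is essential, since ciphertext has near-maximal entropy and would not compress.

```python
import hashlib
import os
import zlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream by hashing key + nonce + counter.
    Illustrative only; a real system would use a vetted cipher."""
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def compress_then_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Compress first (plaintext is redundant), then encrypt the compressed stream."""
    nonce = os.urandom(16)
    compressed = zlib.compress(plaintext, level=9)
    cipher = bytes(a ^ b for a, b in zip(compressed, keystream(key, nonce, len(compressed))))
    return nonce + cipher

def decrypt_then_decompress(blob: bytes, key: bytes) -> bytes:
    """Invert the pipeline: strip the nonce, undo the XOR, then decompress."""
    nonce, cipher = blob[:16], blob[16:]
    compressed = bytes(a ^ b for a, b in zip(cipher, keystream(key, nonce, len(cipher))))
    return zlib.decompress(compressed)
```

For redundant input the transmitted blob is both smaller than the plaintext and unreadable without the key, which is the cost saving the paragraph above argues for.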
A technique for noise removal in video is proposed based on the slantlet transform. The proposed algorithm reduces computational time by reducing the total number of frames: the video is divided into sub-films, master frames are found, and the slantlet transform, an orthogonal discrete wavelet transform with two zero moments and improved time localization, is applied. A thresholding technique is applied to the detail coefficients of the slantlet transform, and the denoised frame is repeated to retain the original frame sequence. The proposed method was implemented in MATLAB R2010a on video contaminated by white Gaussian noise. The experimental results show that the proposed method provides better subjective and objective
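The transform-then-threshold step can be sketched in a few lines. As an assumption, a single-level Haar wavelet stands in for the slantlet transform (which is not available in common libraries), and soft thresholding is applied to the detail coefficients only, exactly as the paragraph describes; the function name `haar_denoise` is hypothetical.

```python
import numpy as np

def haar_denoise(signal: np.ndarray, threshold: float) -> np.ndarray:
    """Single-level Haar wavelet soft-threshold denoising of a 1-D signal
    (a simplified stand-in for the slantlet transform in the paper)."""
    a = (signal[0::2] + signal[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (signal[0::2] - signal[1::2]) / np.sqrt(2)   # detail coefficients
    d = np.sign(d) * np.maximum(np.abs(d) - threshold, 0.0)  # soft thresholding
    out = np.empty_like(signal, dtype=float)         # inverse Haar transform
    out[0::2] = (a + d) / np.sqrt(2)
    out[1::2] = (a - d) / np.sqrt(2)
    return out
```

On a frame row contaminated with white Gaussian noise, shrinking the detail coefficients suppresses noise energy while the approximation coefficients preserve the underlying content.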
In this paper, the all-possible-regressions procedure as well as the stepwise regression procedure were applied to select the best regression equation explaining the effect of human capital, represented by different levels of human cadres, on the productivity of the processing industries sector in Iraq, using time-series data covering a 21-year period. The statistical program SPSS was used to perform the required calculations.
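The all-possible-regressions procedure mentioned above can be sketched directly: fit ordinary least squares on every subset of predictors and rank the models by adjusted R². This is an illustration, not the paper's SPSS computation, and the function name `best_subset` and criterion choice are assumptions.

```python
import itertools
import numpy as np

def best_subset(X: np.ndarray, y: np.ndarray, names):
    """All-possible-regressions: fit OLS on every predictor subset and
    return the (adjusted R^2, predictor names) of the best model."""
    n = len(y)
    ss_tot = np.sum((y - y.mean()) ** 2)
    best = (-np.inf, ())
    for k in range(1, X.shape[1] + 1):
        for subset in itertools.combinations(range(X.shape[1]), k):
            A = np.column_stack([np.ones(n), X[:, subset]])   # intercept + chosen columns
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            resid = y - A @ beta
            r2 = 1.0 - np.sum(resid ** 2) / ss_tot
            adj = 1.0 - (1.0 - r2) * (n - 1) / (n - k - 1)     # penalize model size
            if adj > best[0]:
                best = (adj, tuple(names[i] for i in subset))
    return best
```

With p predictors this fits 2^p - 1 models, which is feasible for the small candidate sets typical of such studies; stepwise regression trades this exhaustiveness for speed.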
Efforts to design and develop lightweight cryptography (LWC) started a decade ago. Many scholarly studies in the literature report enhancements of conventional cryptographic algorithms and the development of new ones. This significant body of work has given rise to many review studies on LWC in IoT. Given the vast number of such reviews, it is not known what they cover or how extensive they are. Therefore, this article aims to bridge that gap by conducting a systematic scoping study. It analyzed the existing review articles on LWC in IoT to discover the extensiveness of the reviews and the topics covered. The results of the study suggested that many re
The aim of this work is to design an algorithm that combines steganography and cryptography to hide a text in an image in a way that prevents, as much as possible, any suspicion of the hidden text. The proposed system prepares the image data for the next step (DCT quantization) through the steganographic process and uses two levels of security, the RSA algorithm and a digital signature, then stores the image in JPEG format. In this case, the secret message is treated as plaintext with a digital signature, while the cover is a coloured image. The results of the algorithm are then evaluated against several criteria that demonstrate its sufficiency and effectiveness. Thus, the proposed algorit
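The DCT-quantization embedding step can be sketched for a single 8x8 block. This is an illustration only, not the paper's full system (RSA and the digital signature are omitted): one message bit is written into the least significant bit of a quantized mid-frequency DCT coefficient, and the function names, the quantization step `q`, and the coefficient position `pos` are all assumptions.

```python
import numpy as np

def dct_matrix(n: int = 8) -> np.ndarray:
    """Orthonormal DCT-II basis matrix."""
    k = np.arange(n)[:, None]
    m = np.arange(n)[None, :]
    C = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * m + 1) * k / (2 * n))
    C[0, :] = np.sqrt(1.0 / n)
    return C

def embed_bit(block: np.ndarray, bit: int, q: float = 16.0, pos=(4, 3)) -> np.ndarray:
    """Embed one bit in the LSB of a quantized mid-frequency DCT coefficient."""
    C = dct_matrix(block.shape[0])
    coeffs = C @ block @ C.T                  # forward 2-D DCT
    qc = np.round(coeffs / q).astype(int)     # quantization
    qc[pos] = (qc[pos] & ~1) | bit            # force LSB to the message bit
    return C.T @ (qc * q) @ C                 # dequantize + inverse DCT (float pixels)

def extract_bit(block: np.ndarray, q: float = 16.0, pos=(4, 3)) -> int:
    """Recover the embedded bit by re-quantizing the stego block's DCT."""
    C = dct_matrix(block.shape[0])
    return int(round((C @ block @ C.T)[pos] / q)) & 1
```

Real JPEG steganography must also survive rounding pixels back to integers; this sketch keeps the stego block in floating point to keep the round trip exact.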
This paper proposes a novel method for generating true random numbers (TRNs) using electromechanical switches. The proposed generator is implemented on an FPGA board. The system exploits the phenomenon of electromechanical switch bounce to produce a randomly fluctuating signal that triggers a counter, generating a binary random number. Compared with other true random number generation methods, the proposed approach offers a high degree of randomness with a simple circuit that can be built from off-the-shelf components. The system is implemented using a commercial relay circuit connected to an FPGA board that processes and records the generated random sequences. Applying statistical testing on the exp
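Raw bits sampled from a physical source such as switch bounce are typically biased, so TRNG designs commonly post-process them. As an illustration (not necessarily part of the paper's design), the classic Von Neumann extractor removes bias from independent raw bits; the function name `von_neumann_debias` is hypothetical.

```python
import random

def von_neumann_debias(bits):
    """Von Neumann extractor: look at non-overlapping pairs of raw bits,
    emit the first bit of each unequal pair (01 -> 0, 10 -> 1),
    and discard equal pairs (00, 11). Output is unbiased if pairs are independent."""
    out = []
    for a, b in zip(bits[0::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out
```

The cost is throughput: with bias p, only 2p(1 - p) of the pairs produce an output bit, but the surviving stream is balanced regardless of p.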
Blockchain is an innovative technology that has gained interest in all sectors in the era of digital transformation; it manages transactions and saves them in a database. With increasing financial transactions and a rapidly developing society with growing businesses, many people pursuing the dream of a better, financially independent life stray from large corporations and organizations to form startups and small businesses. Recently, the increasing demand on employees and institutes to prepare and manage contracts, papers, and verification processes, in addition to human mistakes, led to the emergence of the smart contract. The smart contract has been developed to save time and provide more confidence while dealing, as well a
Cyber-attacks keep growing, so stronger methods are needed to protect images. This paper presents DGEN, a Dynamic Generative Encryption Network that combines Generative Adversarial Networks with a key system that adapts to context. Unlike a fixed scheme such as AES, the method can potentially adjust itself as new threats appear, aiming to resist brute-force, statistical, and quantum attacks. The design adds randomness, uses learning, and derives keys that depend on each image, providing strong security and flexibility while keeping computational cost low. Tests were run on several public image sets, and the results show that DGEN outperforms AES, chaos-based schemes, and other GAN-based approaches. Entropy reached 7.99 bits per pix
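Two ideas from the paragraph above, image-dependent keys and the entropy metric, can be sketched without any GAN machinery. This is a drastically simplified stand-in for DGEN: a per-image key is derived by hashing the image with a master key, a hash-counter XOR cipher stands in for the learned network, and the entropy function measures the result; all function names are hypothetical.

```python
import hashlib
import math
from collections import Counter

def image_dependent_key(image_bytes: bytes, master_key: bytes) -> bytes:
    """Derive a per-image key by hashing the image together with a master key
    (a simplified stand-in for learned, context-dependent key generation)."""
    return hashlib.sha256(master_key + hashlib.sha256(image_bytes).digest()).digest()

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    """XOR the data with a hash-counter keystream (illustrative cipher only)."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def entropy_bits_per_byte(data: bytes) -> float:
    """Shannon entropy of the byte histogram; 8.0 bits/byte is the maximum."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())
```

A highly structured image has low byte entropy, while a good cipher output approaches the 8 bits/byte ceiling, which is what an entropy figure near 7.99 is reporting.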
This work presents the simulation of a Low-Density Parity-Check (LDPC) coding scheme in a multiuser Multi-Carrier Code Division Multiple Access (MC-CDMA) system over Additive White Gaussian Noise (AWGN) and multipath fading channels. Iterative decoding was used in the simulation, since it gives maximum efficiency with ten iterations. The modulation schemes used are phase shift keying (BPSK, QPSK, and 16-PSK), along with Orthogonal Frequency Division Multiplexing (OFDM). Twelve pilot carriers were used in the estimator to compensate for channel effects. The channel model used is the Long Term Evolution (LTE) channel with Technical Specification TS 25.101 v2.10 and 5 MHz bandwidth, including the chan
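The uncoded-BPSK-over-AWGN baseline that such simulations are measured against can be sketched as a short Monte-Carlo loop. This is not the paper's LDPC/MC-CDMA chain, only the simplest reference point: map bits to antipodal symbols, add calibrated Gaussian noise, and count threshold-detector errors; the function name `bpsk_awgn_ber` is hypothetical.

```python
import numpy as np

def bpsk_awgn_ber(ebno_db: float, n_bits: int = 200_000, seed: int = 0) -> float:
    """Monte-Carlo bit error rate of uncoded BPSK over an AWGN channel.
    Noise variance per real dimension is N0/2 = 1 / (2 * Eb/N0) with Eb = 1."""
    rng = np.random.default_rng(seed)
    bits = rng.integers(0, 2, n_bits)
    symbols = 1 - 2 * bits                       # 0 -> +1, 1 -> -1
    ebno = 10 ** (ebno_db / 10)
    noise = rng.standard_normal(n_bits) / np.sqrt(2 * ebno)
    decided = (symbols + noise) < 0              # threshold detector: negative -> bit 1
    return float(np.mean(decided != bits))
```

The simulated curve should track the theoretical Q(sqrt(2 Eb/N0)); the coding gain of the LDPC scheme is then read off as the horizontal gap to this baseline.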
DNA methylation is one of the main epigenetic mechanisms in cancer development and progression. Aberrant DNA methylation of CpG islands within promoter regions contributes to the dysregulation of various tumor suppressors and oncogenes, leading to the appearance of malignant features, including rapid proliferation, metastasis, stemness, and drug resistance. The discovery of two important protein families, the DNA methyltransferases (DNMTs) and the Ten-eleven translocation (TET) dioxygenases, which are responsible for the deregulated transcription of genes that play pivotal roles in tumorigenesis, led to further understanding of DNA methylation-related pathways. But how these enzymes can target specific genes in different malignancies;
The aims of this thesis are to study the topological space; we introduce a new kind of perfect mappings, namely j-perfect mappings and j-ω-perfect mappings, and we study the relationship between them. Certain theorems and characterizations concerning these concepts are established. On the other hand, we studied weak and strong forms of ω-perfect mappings, namely -ω-perfect mappings, weakly -ω-perfect mappings, and strongly -ω-perfect mappings, and investigated their fundamental properties, including the relationship between weakly -ω-perfect mappings and strongly -ω-perfect mappings. As well, some new generalizations of some definitions wh