Cryptography is one means of transferring data securely from sender to receiver. To raise the level of data security further, DNA was introduced into cryptography: DNA can store and transfer data efficiently, and it has become an effective medium for such aims and for implementing computation. A new cryptography system is proposed, consisting of two phases: an encryption phase and a decryption phase. The encryption phase comprises six steps: converting the plaintext characters to their equivalent ASCII values, converting those values to binary, converting the binary values to DNA characters, converting the DNA characters to their equivalent complementary DNA sequences, converting the DNA sequences to RNA sequences, and finally converting the RNA sequences to amino acids; the resulting amino-acid sequence is the ciphertext sent to the receiver. The decryption phase also comprises six steps, which are the encryption steps in reverse order: converting the amino acids to RNA sequences, converting the RNA sequences to DNA sequences, converting those to their equivalent complementary DNA, converting the DNA sequences to binary values and then to their equivalent ASCII values, and finally converting the ASCII values back to characters, yielding the plaintext. For evaluation, six text files of different sizes were used as test material, and performance was measured by encryption time and decryption time. The achieved results are good and fast: the encryption and decryption times for a file of size 1 KB are 2.578 ms and 2.625 ms respectively, while those for a file of size 20 KB are 268.422 ms and 245.469 ms respectively.
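The six-step pipeline can be sketched in Python as follows. This is a minimal illustration, not the paper's implementation: the 2-bits-per-base table is an assumption, and because the biological codon-to-amino-acid code maps 64 codons onto only 20 amino acids (and is therefore not invertible), a reversible 64-symbol codon table is used here as a stand-in for the amino-acid step. ASCII plaintext is assumed.

```python
# Illustrative tables (assumptions, not the paper's actual mappings).
B2D = {"00": "A", "01": "C", "10": "G", "11": "T"}   # 2 bits -> DNA base
D2B = {v: k for k, v in B2D.items()}
COMP = {"A": "T", "T": "A", "C": "G", "G": "C"}      # DNA complement

# Reversible stand-in for the codon -> amino-acid step (64 codons, 64 symbols).
CODONS = [a + b + c for a in "ACGU" for b in "ACGU" for c in "ACGU"]
SYMBOLS = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/"
C2S = dict(zip(CODONS, SYMBOLS))
S2C = dict(zip(SYMBOLS, CODONS))

def encrypt(plaintext):
    bits = "".join(f"{ord(c):08b}" for c in plaintext)               # steps 1-2
    dna = "".join(B2D[bits[i:i+2]] for i in range(0, len(bits), 2))  # step 3
    comp = "".join(COMP[b] for b in dna)                             # step 4
    rna = comp.replace("T", "U")                                     # step 5
    rna += "A" * (-len(rna) % 3)                                     # pad to whole codons
    return "".join(C2S[rna[i:i+3]] for i in range(0, len(rna), 3))   # step 6

def decrypt(cipher):
    rna = "".join(S2C[s] for s in cipher)                            # symbols -> RNA
    dna = rna.replace("U", "T")                                      # RNA -> DNA
    comp = "".join(COMP[b] for b in dna)                             # undo complement
    bits = "".join(D2B[b] for b in comp)                             # DNA -> bits
    bits = bits[: len(bits) - len(bits) % 8]                         # drop pad bits
    return "".join(chr(int(bits[i:i+8], 2)) for i in range(0, len(bits), 8))
```

Because every step is a bijection (padding aside), `decrypt(encrypt(text))` recovers the plaintext exactly.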
This article investigates the presentation of the Iraq wars in literature and media. The first section examines the case of returnees from the war and their experience, their trauma, and the final presentation of that experience. The article also investigates how trauma and fear are depicted to create an amplified image and state of fear that could, in turn, show Iraqi society as a traumatized society. Critics such as Suzie Grogan believe that the concept of trauma can expand to affect whole societies, rather than a single individual, after exposure to the trauma of involvement in wars and other major conflicts. This is reflected in Iraq, a country that was subjected to six comprehensive conflicts in its recent history, i.e. less than half a century.
The absurdity of Orientalist thought and its deviation in interpreting the Quranic text: a view and critique
Abstract
This research deals with the technique of opening the text through a critical study of Al-Sayyab's poem "City Without Rain". We chose this poem in order to draw a connection with Western critical theories. Our study aims to explain the act of opening texts by critics, and its methods and directions in modern Arabic poetry. It also aims to show the differences in point of view between critics and poets. The research relies on a selective and descriptive vision.
Image fusion is one of the most important techniques in digital image processing; it involves developing software to integrate multiple sets of data for the same location. It is one of the newer fields adopted to solve problems of the digital image and to produce high-quality images carrying more information for the purposes of interpretation, classification, segmentation, compression, and so on. In this research, problems faced by different digital images, such as multi-focus images, are addressed through a simulation process using the camera to fuse various digital images based on previously adopted fusion techniques such as arithmetic techniques (BT, CNT and MLT) and statistical techniques (LMM,
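As a generic illustration of the arithmetic fusion rules the abstract refers to (the exact BT, CNT and MLT formulas are not given here), two of the simplest pixel-wise rules can be sketched with NumPy; the function names are assumptions for illustration only.

```python
import numpy as np

def mean_fusion(a, b):
    # Pixel-wise arithmetic mean: the simplest arithmetic fusion rule.
    return ((a.astype(np.float64) + b.astype(np.float64)) / 2.0).astype(a.dtype)

def max_fusion(a, b):
    # Choose-max rule: keep the brighter pixel from either source image,
    # a common baseline for multi-focus fusion.
    return np.maximum(a, b)
```

Applied to two registered images of the same scene with different focus planes, such rules produce a single composite carrying information from both inputs.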
Image pattern classification is considered a significant step for image and video processing. Although various image pattern algorithms proposed so far achieve adequate classification, achieving higher accuracy while reducing the computation time remains challenging to date. A robust image pattern classification method is essential to obtain the desired accuracy; such a method can accurately classify image blocks into plain, edge, and texture (PET) classes using an efficient feature extraction mechanism. Moreover, to date, most existing studies have focused on evaluating their methods on specific orthogonal moments, which limits the understanding of their potential application to various Discrete Orthogonal Moments (DOMs). The
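The plain/edge/texture (PET) distinction can be illustrated with a toy block classifier. This is not the moment-based mechanism the abstract describes; the features (variance and gradient concentration) and both thresholds are illustrative assumptions.

```python
import numpy as np

def classify_block(block, t_plain=25.0, t_edge=0.5):
    # Classify an image block as "plain", "edge", or "texture" (PET).
    # Features and thresholds are illustrative, not the paper's method.
    f = block.astype(np.float64)
    if f.var() < t_plain:                 # low variance -> flat region
        return "plain"
    gy, gx = np.gradient(f)
    mag = np.hypot(gx, gy)
    # Edge blocks concentrate gradient energy in few pixels; texture spreads it.
    active = (mag > mag.mean()).mean()
    return "edge" if active < t_edge else "texture"
```

A uniform block is labeled plain, and a block split by a sharp intensity step is labeled edge, since its gradient energy sits on a thin band of pixels.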
In this paper, a wireless network is planned; the network is based on the IEEE 802.16e standard (WiMAX). The targets of this paper are maximized coverage, good service, and low operational fees. The WiMAX network is planned through three approaches. In the first approach, network coverage is maximized by extending cell coverage and selecting the best sites (with a bandwidth (BW) of 5 MHz, 20 MHz per sector, and four sectors per cell). In the second approach, interference is analyzed in CNIR mode. In the third approach, Quality of Service (QoS) is tested and evaluated. ATDI ICS software (Interference Cancellation System) is used to perform the planning. The results show that the planned area covers 90.49% of Baghdad City and serves 1000 mob
Starting from 4,4′-dimercaptobiphenyl, a variety of phenolic Schiff base derivatives (methylolic, etheric, epoxy) have been synthesized. All proposed structures were supported by FTIR, 1H-NMR, 13C-NMR and elemental analysis; all analyses were performed at the Center of Consultation at Jordan University.
In this paper, an algorithm is introduced through which we can embed more data than regular spatial-domain methods allow. The secret data is compressed using Huffman coding, and this compressed data is then embedded using the Laplacian sharpening method. We used Laplacian filters to determine effective hiding places; then, based on a threshold value, we selected the places with the highest responses from these filters for embedding the watermark. Our aim in this work is to increase the capacity of the embedded information by using Huffman coding while at the same time increasing the security of the algorithm by hiding data in the places with the highest edge values, where changes are less noticeable.
The perform
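The edge-guided embedding stage described above can be sketched as follows. The Huffman compression stage is omitted here (a plain bit list stands in for the compressed payload), and the LSB-replacement rule, the top-N position selection, and the sharing of positions as a key are illustrative assumptions rather than the paper's exact scheme.

```python
import numpy as np

def laplacian_magnitude(img):
    # |Laplacian| response via the 4-neighbour kernel; borders left at zero.
    f = img.astype(np.float64)
    lap = np.zeros_like(f)
    lap[1:-1, 1:-1] = (f[:-2, 1:-1] + f[2:, 1:-1] +
                       f[1:-1, :-2] + f[1:-1, 2:] - 4.0 * f[1:-1, 1:-1])
    return np.abs(lap)

def embed(img, bits):
    # Write bits into the LSBs of the strongest-edge pixels.
    # The chosen positions are returned as a shared key (an assumption).
    order = np.argsort(laplacian_magnitude(img), axis=None)[::-1][:len(bits)]
    stego = img.copy()
    flat = stego.ravel()                      # view: writes land in `stego`
    for pos, bit in zip(order, bits):
        flat[pos] = (flat[pos] & 0xFE) | bit  # replace least significant bit
    return stego, order

def extract(stego, order):
    flat = stego.ravel()
    return [int(flat[pos] & 1) for pos in order]
```

Restricting the writes to high-Laplacian (strong-edge) pixels keeps the changes where they are visually least noticeable, which is the security argument the abstract makes.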