The worldwide spread of the Internet, together with the rapidly growing number of users exchanging important information over it, highlights the need for new methods to protect that information from corruption or modification by intruders. This paper proposes a new method to ensure that the text of a given document cannot be modified by intruders without detection. The method consists of three steps. The first step borrows some concepts from the security system of the Quran to detect certain types of change in a given text: a key for each paragraph is extracted from the letters of that paragraph whose positions are multiples of a given prime number. This step cannot detect changes that merely reorder the letters or words of a paragraph without altering the letters themselves, so the second step applies an error-detection method, Hamming codes, to locate the changes in the received text. In the third step, RSA is used as the primary encryption method to encrypt the keys produced by the first and second steps, preventing intruders from breaking the security of the method.
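A minimal sketch of how the first two steps might look. All names, the choice of prime, and the toy Hamming(7,4) code are illustrative assumptions, not the paper's implementation:

```python
# Illustrative sketch only: names, the chosen prime, and the Hamming(7,4)
# toy code below are assumptions, not the paper's actual construction.

def paragraph_key(paragraph, prime=7):
    """Step 1: the key is the sequence of letters whose 1-based position
    in the paragraph (counting letters only) is a multiple of the prime."""
    letters = [c for c in paragraph if c.isalpha()]
    return "".join(letters[i] for i in range(prime - 1, len(letters), prime))

def hamming74_encode(d):
    """Step 2 building block: encode 4 data bits into a Hamming(7,4) codeword."""
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def hamming74_error_pos(c):
    """Return 0 if the codeword is clean, else the 1-based position of a
    single flipped bit -- how Hamming codes locate where a change occurred."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    return s1 + 2 * s2 + 4 * s3

key = paragraph_key("The quick brown fox jumps over the lazy dog")   # "cfseg"
tampered = paragraph_key("The quixk brown fox jumps over the lazy dog")
print(key, tampered, key == tampered)
```

Note how a substitution at a key position changes the extracted key, while edits elsewhere (or pure reordering) can slip past step 1, which is why a locating code such as Hamming is needed as a second layer.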
The emergence of capitalism, alongside the appearance of modern and contemporary political systems, has allowed those systems to exert a semi-dominant hold over the most vital spaces of human community life through several key apparatuses. Chief among these is the free-market apparatus, which serves the privileges of the capitalist elite standing behind it, especially the financial elite. The pursuit of profit has thus become the principal objective of these elites, who are the real drivers of the globalization system. This was embodied in the appearance and spread of the fatal COVID-19 pandemic at the end of last year, which quickly spread to more than one state around the world, especially the developed countries.
Zoonyms are included in phraseological units, form metaphorical names for a person, and give him various emotional and evaluative characteristics. This article examines zoomorphic metaphors that characterize a person in the Russian and Arabic languages in the aspect of their comparative analysis: the comparative analysis of the metaphorical meanings of animalisms is an important method in cultural linguistics, since zoomorphic metaphors are a reflection of culture in a language.
One of the most powerful tools of stellar dynamics is the N-body simulation, in which the motion of N particles is followed under their mutual gravitational attraction. In this paper a gravitational N-body simulation is described that investigates Newtonian and non-Newtonian (modified Newtonian dynamics) interaction between the stars of spiral galaxies. It is shown that standard Newtonian interaction requires dark matter to reproduce the flat rotation curves of the systems under consideration, while the modified Newtonian dynamics (MOND) theory provides a flat rotation curve in good agreement with the observed rotation curves.
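The contrast the abstract describes can be illustrated with the standard deep-MOND relation v⁴ = G M a₀, which is radius-independent and hence flat. The enclosed mass and radii below are assumed toy values, not the paper's model:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
A0 = 1.2e-10         # MOND acceleration scale a0, m s^-2 (standard value)
M_SUN = 1.989e30     # solar mass, kg
KPC = 3.086e19       # kiloparsec, m
M = 1e11 * M_SUN     # assumed enclosed stellar mass of a toy spiral galaxy

def v_newton(r):
    """Newtonian circular speed for enclosed mass M: v^2 = G M / r
    (falls off as 1/sqrt(r) -- the 'missing dark matter' signature)."""
    return math.sqrt(G * M / r)

def v_mond_deep():
    """Deep-MOND limit (accelerations << a0): v^4 = G M a0,
    independent of radius, i.e. a flat rotation curve."""
    return (G * M * A0) ** 0.25

for r_kpc in (5, 15, 30):
    print(f"r = {r_kpc:2d} kpc: Newton {v_newton(r_kpc * KPC) / 1e3:6.1f} km/s, "
          f"deep MOND {v_mond_deep() / 1e3:6.1f} km/s")
```

With these assumed numbers the Newtonian speed declines with radius while the deep-MOND speed stays constant near 200 km/s, mirroring the flat observed curves the abstract mentions.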
Calcium montmorillonite (bentonite) [Ca-MMT] was treated via a cation-exchange reaction using benzalkonium chloride, a quaternary ammonium surfactant, to produce an organoclay for use in preparing polymer composites. Functionalizing the filler surface is an important factor in achieving good interaction between the filler and the polymer matrix. The basal spacing and functional groups of the organoclay were characterized by X-ray diffraction (XRD) and Fourier-transform infrared (FTIR) spectroscopy, respectively. The XRD results showed that the basal spacing of the clay treated with benzalkonium chloride increased to 15.172 Å, an increase of about 77.9% in the basal spacing.
The availability of different processing levels for satellite images makes it important to measure their suitability for classification tasks. This study investigates the impact of the Landsat data processing level on the accuracy of land-cover classification using a support vector machine (SVM) classifier. The classification accuracy of Landsat 8 (LS8) and Landsat 9 (LS9) data varies notably across processing levels. For LS9, Collection 2 Level 2 (C2L2) achieved the highest accuracy (86.55%) with the polynomial kernel of the SVM classifier, surpassing the Fast Line-of-Sight Atmospheric Analysis of Spectral Hypercubes (FLAASH) product (85.31%) and Collection 2 Level 1 (C2L1) (84.93%). The LS8 data exhibit similar behavior.
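The polynomial kernel named in the abstract can be illustrated in isolation. This sketch (not the paper's pipeline) checks the defining property that a degree-2 polynomial kernel equals a dot product in an explicit monomial feature space:

```python
import math

def poly_kernel(x, y, degree=2, coef0=0.0):
    """Polynomial kernel K(x, y) = (x . y + coef0)^degree, the kernel
    family used by the SVM classifier in the study."""
    return (sum(a * b for a, b in zip(x, y)) + coef0) ** degree

def phi2(x):
    """Explicit degree-2 feature map for 2-D input (coef0 = 0):
    (x1, x2) -> (x1^2, x2^2, sqrt(2)*x1*x2)."""
    x1, x2 = x
    return [x1 * x1, x2 * x2, math.sqrt(2) * x1 * x2]

x, y = [1.0, 2.0], [3.0, -1.0]
k = poly_kernel(x, y)
explicit = sum(a * b for a, b in zip(phi2(x), phi2(y)))
print(k, explicit)  # both equal (1*3 + 2*(-1))^2 = 1.0
```

This equivalence is why the kernel lets an SVM fit polynomial decision boundaries without ever materializing the expanded feature vectors.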
Schiff bases were prepared and characterized by several methods, including NMR spectroscopy; the results of the different diagnostic methods confirmed the structural formulas of the prepared compounds.
New derivatives of pyromellitamic diacids and pyromellitic diimides have been prepared by the reaction of one mole of pyromellitic dianhydride with two moles of aromatic amines. These derivatives were characterized by elemental analysis, FT-IR spectroscopy, and melting-point determination.
Autism is a lifelong developmental deficit that affects how people perceive the world and interact with others. An estimated one in more than 100 people has autism, and it affects almost four times as many boys as girls. The commonly used tools for gathering autism datasets are fMRI, EEG, and, more recently, eye tracking. A preliminary study on the eye-tracking trajectories of patients showed that a rudimentary statistical analysis (principal component analysis) provides interesting results on statistical parameters such as the time spent in a region of interest. Another study, applying tools from Euclidean and non-Euclidean geometry to patients' eye trajectories, also showed interesting results.
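A minimal, self-contained sketch of the principal component analysis mentioned above, applied to a 2-D gaze trajectory. The data and closed-form 2×2 eigenvalue computation are illustrative, not taken from the studies:

```python
def pca_2d(points):
    """Closed-form PCA for 2-D gaze samples: returns the variances along
    the two principal axes (eigenvalues of the 2x2 covariance matrix)."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    tr = sxx + syy                     # trace of the covariance matrix
    det = sxx * syy - sxy * sxy        # determinant
    disc = ((tr / 2) ** 2 - det) ** 0.5
    return tr / 2 + disc, tr / 2 - disc

# A perfectly collinear toy trajectory: all variance lies on one axis.
trajectory = [(0, 0), (1, 2), (2, 4), (3, 6)]
lam1, lam2 = pca_2d(trajectory)
print(lam1, lam2)  # -> 6.25 0.0
```

A near-zero second eigenvalue indicates an essentially one-dimensional gaze path, the kind of low-dimensional structure such analyses look for.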
We propose a system to detect human faces in BMP color images using two methods, RGB and YCbCr, to determine which is the better of the two, and we also examine the effect of applying a low-pass filter, contrast adjustment, and brightness adjustment to the image. For face detection, we locate the forehead in the binary image with a scan that starts in the middle of the image and proceeds by finding a run of continuous white pixels after continuous black pixels; the maximum width of the white run is measured by scanning left and right vertically (sampled width w), and the scan stops when the new width is half the previous one.
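A hypothetical reconstruction of the width-halving scan described above, on a toy binary image; the function name, the exact stopping rule, and the test image are assumptions:

```python
def forehead_width_scan(binary):
    """Scan down the middle column of a binary image (1 = white/skin).
    At each row with white at the centre, measure the contiguous white
    run's width; stop when the width drops to half or less of the
    previous row's width (assumed reading of the abstract's rule)."""
    mid = len(binary[0]) // 2
    prev_width = None
    for r, row in enumerate(binary):
        if row[mid] != 1:
            continue
        left = mid
        while left > 0 and row[left - 1] == 1:
            left -= 1
        right = mid
        while right < len(row) - 1 and row[right + 1] == 1:
            right += 1
        width = right - left + 1
        if prev_width is not None and width <= prev_width // 2:
            return r, prev_width  # stopping row and the forehead width
        prev_width = width
    return None

# Three wide "forehead" rows followed by one much narrower row.
image = [[0] + [1] * 8 + [0] for _ in range(3)] + [[0, 0, 0, 0, 1, 1, 1, 0, 0, 0]]
print(forehead_width_scan(image))  # -> (3, 8)
```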