In the field of data security, preserving sensitive information while it travels through public channels is a central challenge. Steganography, which conceals data within carrier objects such as text, offers one way to address it. Text, owing to its extensive usage and constrained bandwidth, stands out as an optimal medium for this purpose. Despite the richness of the Arabic language in its linguistic features, only a small number of studies have explored Arabic text steganography. Arabic text, characterized by its distinctive script, has attracted notable attention as a promising steganographic domain: its complex character shapes, diacritical marks, and ligatures can all be exploited to hide information. In this work, we propose a new text steganography method based on Arabic language characteristics, with two levels of security: Arabic encoding and word shifting. First, a new Arabic encoding mapping table is built to convert an English plaintext into Arabic characters; a word-shifting process then adds an authentication phase for the sent message and a further level of security to the resulting ciphertext. The proposed method achieved processing times of 0.15 ms, 1.0033 ms, 2.331 ms, and 5.22 ms for file sizes of 1 KB, 3 KB, 5 KB, and 10 KB, respectively.
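The two-level idea can be sketched as follows. This is a minimal illustration only: the mapping table, the shift key, and all function names are assumptions for demonstration, not the authors' actual encoding table or shifting scheme.

```python
# Level 1: a hypothetical (partial) English-to-Arabic encoding table.
ENCODE = {"a": "ا", "b": "ب", "t": "ت", "c": "ث"}
DECODE = {v: k for k, v in ENCODE.items()}

def encode(plaintext: str) -> str:
    """Map each English letter to an Arabic character."""
    return "".join(ENCODE[ch] for ch in plaintext)

# Level 2: a simple word-shifting step, standing in for the
# authentication phase (rotate word order by a secret key).
def shift_words(words: list[str], key: int) -> list[str]:
    k = key % len(words)
    return words[k:] + words[:k]

def unshift_words(words: list[str], key: int) -> list[str]:
    k = key % len(words)
    return words[-k:] + words[:-k] if k else list(words)

msg = ["cab", "bat", "tab"]
cipher = [encode(w) for w in msg]          # level 1: re-encode
stego = shift_words(cipher, 2)             # level 2: shift word order
recovered = ["".join(DECODE[ch] for ch in w)
             for w in unshift_words(stego, 2)]
assert recovered == msg                    # round-trip succeeds
```

A receiver who knows both the table and the key can invert each level in reverse order; an observer sees only plausible Arabic text.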
Cu(II) was determined using a quick and uncomplicated procedure in which it reacts with a freshly synthesized ligand to form an orange complex with an absorbance peak at 481.5 nm in acidic solution. The optimal conditions for complex formation were studied with respect to ligand concentration, medium, addition sequence, temperature, and formation time. The resulting scatter plot extends from 0.1–9 ppm, with a linear range of 0.1–7 ppm. The relative standard deviation (RSD%) for n = 8 is less than 0.5, recovery (R%) is within acceptable values, the correlation coefficient (r) equals 0.9986, the coefficient of determination (r²) equals 0.9973, and percentage capita
Two EM techniques, terrain conductivity and VLF-Radiohm resistivity (using the Geonics EM 34-3 and EM16R instruments, respectively), have been applied to evaluate their ability to delineate and measure the depth of shallow subsurface cavities near Haditha city.
Thirty-one survey traverses were carried out to distinguish the subsurface cavities in the investigated area. Both EM techniques proved to be successful tools in the study area.
In data transmission, a change in a single bit of the received data may lead to misunderstanding or even disaster. Each bit of the sent information has high priority, especially for information such as the receiver's address. Detecting every single-bit change is therefore a key issue in the data transmission field.
The ordinary single-parity method detects an odd number of errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, showed better results but still failed to cope with an increasing number of errors.
Two novel methods were suggested to detect bit-change errors when transmitting data over a noisy medium. Those methods were: 2D-Checksum me
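The parity limitation noted above is easy to demonstrate. The sketch below (function names are illustrative) shows even parity catching one flipped bit but silently passing two:

```python
def parity_bit(bits: list[int]) -> int:
    """Even parity: the bit that makes the total count of 1s even."""
    return sum(bits) % 2

def check(bits: list[int], parity: int) -> bool:
    """True if the received word passes the parity check."""
    return (sum(bits) + parity) % 2 == 0

data = [1, 0, 1, 1, 0, 0, 1, 0]
p = parity_bit(data)                      # p = 0 (four 1s already)

# One flipped bit (odd number of errors): detected.
one_error = data.copy()
one_error[3] ^= 1
assert not check(one_error, p)            # check fails -> error caught

# Two flipped bits (even number of errors): parity unchanged -> missed.
two_errors = data.copy()
two_errors[1] ^= 1
two_errors[6] ^= 1
assert check(two_errors, p)               # check passes -> error slips through
```

Any even number of flips leaves the 1s count with the same parity, which is exactly why stronger schemes (2D parity, checksums, CRCs) are needed.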
The present paper studies some characteristics of cometary ions using a photometric method based on CCD camera images, in which each pixel is represented on a grey scale from 0 to 255, where 0 is low light intensity and 255 is high light intensity. The variation of photometric intensity along any line of such an image yields an intensity curve.
From these equations the focus is on determining the temperature distribution, the velocity distribution, and the number-density distribution, which gives the number of particles per unit volume.
The results explain the interaction near the cometary nucleus, which is mainly affected by the new ions added to the density of the solar wind, th
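Extracting such an intensity curve from one row of an 8-bit image can be sketched as follows. The tiny hand-written "image" is a stand-in for a real CCD frame; the values simply run from 0 (dark) to 255 (bright) as described above.

```python
# A synthetic 8-bit image: each row is a list of grey levels 0-255.
image = [
    [0, 10, 40, 80, 40, 10, 0],      # row crossing a faint feature
    [5, 30, 120, 255, 120, 30, 5],   # row through the bright nucleus
]

def line_profile(img: list[list[int]], row: int) -> list[int]:
    """Return the intensity curve along one image row."""
    return list(img[row])

profile = line_profile(image, 1)
peak = max(profile)
assert peak == 255                   # brightest pixel saturates the scale
assert profile.index(peak) == 3      # peak sits at the nucleus position
```

Plotting such profiles row by row gives the curves from which the temperature, velocity, and number-density distributions are derived.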
Abstract:
This research aims to compare the Bayesian method and full maximum likelihood for estimating a hierarchical Poisson regression model.
The comparison was carried out by simulation using different sample sizes (n = 30, 60, 120) and different numbers of replications (r = 1000, 5000). The mean square error was adopted to compare the estimation methods and choose the best way to estimate the model. It was concluded that the hierarchical Poisson regression model estimated by full maximum likelihood with sample size (n = 30) best represents the maternal mortality data after it has been reliance value param
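The simulation design above (repeat r experiments of size n, score each estimator by mean square error) can be sketched in its simplest Poisson setting. This is not the authors' hierarchical model: the true mean, the Gamma prior parameters, and the conjugate posterior-mean estimator are all illustrative assumptions.

```python
import math
import random

def poisson(lam: float, rng: random.Random) -> int:
    """Draw one Poisson(lam) variate (Knuth's multiplication method)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def compare_mse(true_lam=4.0, n=30, r=1000, a=2.0, b=1.0, seed=1):
    """MSE of the MLE vs. a Gamma(a, b)-prior Bayes estimator."""
    rng = random.Random(seed)
    se_ml = se_bayes = 0.0
    for _ in range(r):
        x = [poisson(true_lam, rng) for _ in range(n)]
        ml = sum(x) / n                     # MLE: sample mean
        bayes = (a + sum(x)) / (b + n)      # Gamma-Poisson posterior mean
        se_ml += (ml - true_lam) ** 2
        se_bayes += (bayes - true_lam) ** 2
    return se_ml / r, se_bayes / r          # mean square errors

mse_ml, mse_bayes = compare_mse()
assert 0 < mse_ml < 1 and 0 < mse_bayes < 1
```

Whichever estimator yields the smaller MSE over the r replications is preferred, which is the same selection rule the abstract describes.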
The aim of the study was to evaluate the efficacy of a diode laser (λ = 940 nm) in the management of gingival hyperpigmentation compared with the conventional bur method. Materials and methods: Eighteen patients with gingival hyperpigmentation, aged 12–37 years, were selected for the study. The treatment site was the upper gingiva, using the diode laser for the right half and the conventional method for the left half. All patients were re-evaluated at the following intervals: 3 days, 7 days, 1 month, and 6 months post-operation. Pain and function were re-evaluated at each visit at 1 day, 3 days, and 1 week post-operation. Laser parameters included 1.5 W in continuous mode with an initiated tip (400 μm) placed in