One of the long-standing challenges in communication technology is the secure transmission of images. Millions of individuals use and share photos on the internet for both private and business purposes. One way to achieve safe image transfer over the network is to use encryption methods that transform the original image into an unintelligible, scrambled version. Cryptographic approaches based on chaotic logistic theory offer several new and promising options for developing secure image encryption methods. The main aim of this paper is to build a secure system for encrypting gray and color images. The proposed system consists of two stages: in the first stage, encryption, the keys are generated from the chaotic logistic map together with the image density to encrypt the gray and color images; the second stage, decryption, reverses the encryption process to recover the original image. The proposed method has been tested on standard, publicly available gray and color images. The test results show a peak signal-to-noise ratio (PSNR) of 7.7268, a unified average changing intensity (UACI) of 50.2011, and a number of pixels change rate (NPCR) of 100, while encryption and decryption take up to 0.6319 and 0.5305 seconds, respectively.
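As a rough illustration of the keystream idea, a logistic-map-driven XOR cipher might look like the sketch below. This is not the paper's exact key schedule (which also folds in the image density); the seed x0, parameter r, and byte quantization are illustrative assumptions.

```python
import numpy as np

def logistic_keystream(x0, r, n):
    # Iterate the logistic map x_{k+1} = r * x_k * (1 - x_k)
    # and quantize each state to one keystream byte.
    ks = np.empty(n, dtype=np.uint8)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        ks[i] = int(x * 256) % 256
    return ks

def xor_cipher(image, x0=0.3141, r=3.99):
    # XOR a uint8 image (gray 2D or color 3D) with the keystream.
    # Applying the same call with the same (x0, r) decrypts it.
    flat = image.reshape(-1)
    ks = logistic_keystream(x0, r, flat.size)
    return (flat ^ ks).reshape(image.shape)
```

Because XOR is its own inverse, the same routine serves both stages, which mirrors the encrypt/decrypt symmetry described in the abstract.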
Confocal microscope imaging has become popular in biotechnology labs. Confocal imaging technology utilizes fluorescence optics, in which laser light is focused onto a specific spot at a defined depth in the sample. A considerable number of images are produced regularly during the course of research, and these images require unbiased quantification methods to support meaningful analyses. Increasing efforts to tie reimbursement to outcomes will likely increase the need for objective data in analyzing confocal microscope images in the coming years. Visual quantification of confocal images with the naked eye is an essential but often underreported outcome measure, owing to the time required for manual counting and …
The gas-lift method is crucial for maintaining oil production, particularly in an established field where the natural energy of the reservoirs has been depleted. To maximize oil production, a major field's gas injection rate must be distributed as efficiently as possible across its gas-lift network system. Common gas-lift optimization techniques can lose their effectiveness and fail to reproduce the gas-lift optimum in a large network system because of multi-objective, multi-constrained, and restricted gas-injection-rate distribution problems. The main objective of this research is to determine whether the genetic algorithm (GA) technique can achieve the optimum distribution of the continuous gas-lift injection …
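To make the optimization concrete, here is a minimal, self-contained GA sketch for allocating a fixed gas budget across wells. The well-performance curves, the scaling-based constraint handling, and the GA settings are all illustrative assumptions, not the paper's network model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical lift-performance curves: oil rate rises with injected
# gas but with diminishing returns (coefficients are illustrative).
A = np.array([120.0, 90.0, 150.0, 110.0])   # asymptotic oil rates per well
B = np.array([0.08, 0.05, 0.10, 0.06])      # response steepness per well
TOTAL_GAS = 60.0                             # shared gas-injection budget

def total_oil(q):
    return float(np.sum(A * (1.0 - np.exp(-B * q))))

def repair(q):
    # Scale an allocation down if it exceeds the gas budget.
    s = q.sum()
    return q * (TOTAL_GAS / s) if s > TOTAL_GAS else q

def optimize(pop_size=60, gens=300, sigma=1.5):
    pop = rng.uniform(0.0, TOTAL_GAS / len(A), (pop_size, len(A)))
    for _ in range(gens):
        pop = np.array([repair(q) for q in pop])
        fit = np.array([total_oil(q) for q in pop])
        # Binary tournament selection.
        pairs = rng.integers(0, pop_size, (pop_size, 2))
        winners = np.where(fit[pairs[:, 0]] >= fit[pairs[:, 1]],
                           pairs[:, 0], pairs[:, 1])
        parents = pop[winners]
        # Uniform crossover with a shuffled mate, then Gaussian mutation.
        mates = parents[rng.permutation(pop_size)]
        mask = rng.random(parents.shape) < 0.5
        pop = np.clip(np.where(mask, parents, mates)
                      + rng.normal(0.0, sigma, parents.shape), 0.0, None)
    pop = np.array([repair(q) for q in pop])
    fit = np.array([total_oil(q) for q in pop])
    return pop[fit.argmax()], fit.max()

best_allocation, best_oil = optimize()
```

A real gas-lift study would replace `total_oil` with network-simulator output and add the multi-objective, multi-constraint handling the abstract refers to; the GA loop itself would stay essentially the same.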
Earth's climate is changing rapidly due to increasing human demands and rapid economic growth. These changes will affect the entire biosphere, mostly in negative ways. Predicting future changes will put us in a better position to minimize their catastrophic effects and to understand beforehand how humans can cope with them. In this research, global climate data set observations from 1961-1990 were used to predict the future climate change scenario for 2010-2039. The data were processed with Idrisi Andes software, and the final Köppen-Geiger map was created with ArcGIS software. Based on the Köppen climate classification, it was found that the equatorial, arid steppe, and snow zones will decrease by 3.9%, 2.96%, and …
FG Mohammed, HM Al-Dabbas, Science International, 2018.
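For context, a simplified sketch of the top-level Köppen-Geiger decision rules follows; this is a common textbook formulation with its usual thresholds, not the paper's full Idrisi Andes / ArcGIS workflow, and the many sub-class criteria are omitted.

```python
def koppen_main_class(t_min, t_max, t_ann, p_ann, summer_frac):
    # Simplified main-class Köppen-Geiger rules.
    # t_min / t_max: coldest / warmest monthly mean (deg C)
    # t_ann: annual mean temperature (deg C)
    # p_ann: annual precipitation (mm)
    # summer_frac: summer's share of annual precipitation (0..1)
    if t_max < 10.0:
        return "E (polar)"
    # Dryness threshold depends on precipitation seasonality.
    if summer_frac >= 2.0 / 3.0:
        p_th = 2.0 * t_ann + 28.0
    elif summer_frac <= 1.0 / 3.0:
        p_th = 2.0 * t_ann
    else:
        p_th = 2.0 * t_ann + 14.0
    if p_ann < 10.0 * p_th:
        return "B (arid)"
    if t_min >= 18.0:
        return "A (equatorial)"
    if t_min > -3.0:
        return "C (warm temperate)"
    return "D (snow)"
```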
Compaction of triticale grain at three moisture contents (8%, 12%, and 16% wet basis) was measured at five applied pressures (0, 7, 14, 34, and 55 kPa). Bulk density increased with increasing pressure for all moisture contents and depended significantly (p < 0.0001) on both moisture content and applied pressure. A Verhulst logistic equation was found to model the changes in bulk density of triticale grain with an R² of 0.986. The model showed similar behavior …
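The excerpt does not give the fitted coefficients, but a Verhulst (logistic) fit of bulk density against applied pressure, ρ(P) = ρ_max / (1 + a·e^(−kP)), can be sketched as follows; the data points and initial guesses are placeholders, not the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def verhulst(P, rho_max, a, k):
    # Logistic (Verhulst) form: bulk density rises toward the
    # asymptote rho_max as applied pressure P increases.
    return rho_max / (1.0 + a * np.exp(-k * P))

# Placeholder measurements (kPa, kg/m^3), not the paper's data.
pressure = np.array([0.0, 7.0, 14.0, 34.0, 55.0])
bulk_density = np.array([720.0, 728.0, 734.0, 743.0, 747.0])

params, _ = curve_fit(verhulst, pressure, bulk_density,
                      p0=[750.0, 0.05, 0.05])
rho_max, a, k = params
```

In practice one such fit would be run per moisture content, since the abstract reports that moisture content significantly shifts the density-pressure response.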
Abstract: The aim of this study was to evaluate the effect of the bone density value in Hounsfield units derived from cone beam computed tomography (CBCT), and of implant dimensions, on implant stability parameters, namely resonance frequency analysis and the insertion torque (IT) value. The study included 24 patients who received 42 dental implants (DI). The bone density of the planned implant site was measured preoperatively using CBCT. Implant stability was measured using the Osstell implant stability quotient (ISQ). The ISQ values were recorded immediately postoperatively and after 16 weeks. The IT value was categorized as ≤ 35 N/cm or > 35 N/cm. The mean (standard deviation) primary stability was 79.58 (5.27) ISQ, …
One wide-ranging category of open-source data is that of geospatial information websites. Despite the advantages of such open-source data, including ease of access and freedom from cost, its quality is a potential issue. This article tests the horizontal positional accuracy, and the possible integration, of four web-derived geospatial datasets: OpenStreetMap (OSM), Google Maps, Google Earth, and Wikimapia. The evaluation was carried out by comparing the tested data against reference field-survey data for fifty road intersections in Baghdad, Iraq. The results indicate that the free geospatial data can be used to enhance authoritative maps, especially small-scale maps.
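A typical way to quantify horizontal positional accuracy in this kind of study is the planimetric root-mean-square error (RMSE) against the survey reference; a minimal sketch, assuming both point sets are matched one-to-one and expressed in the same projected coordinate system:

```python
import numpy as np

def horizontal_rmse(test_xy, ref_xy):
    # Planimetric RMSE between web-derived points and reference survey
    # coordinates, given as N x 2 arrays (easting, northing) in meters.
    offsets = np.linalg.norm(np.asarray(test_xy) - np.asarray(ref_xy), axis=1)
    return float(np.sqrt(np.mean(offsets ** 2)))
```

Running this once per dataset (OSM, Google Maps, Google Earth, Wikimapia) against the fifty surveyed intersections yields directly comparable accuracy figures.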
In this research, kernel (nonparametric density) estimation methods were used to estimate the two-response logistic regression, comparing the Nadaraya-Watson method with the local scoring algorithm. The optimal smoothing parameter (bandwidth) λ was estimated by cross-validation and generalized cross-validation; the optimal bandwidth λ has a clear effect on the estimation process and plays a key role in smoothing the curve so that it approaches the real curve. The goal of using the kernel estimator is to modify the observations so as to obtain estimators whose characteristics are close to the properties of the real parameters. Based on medical data for patients with chronic …
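For a binary (two-response) outcome, the Nadaraya-Watson estimator is a locally weighted average that, applied to 0/1 labels, estimates P(Y = 1 | x). A minimal sketch with a Gaussian kernel and leave-one-out cross-validation for the bandwidth λ follows; the local scoring variant and the generalized cross-validation criterion are not shown.

```python
import numpy as np

def gaussian_kernel(u):
    # Standard normal density used as the smoothing kernel.
    return np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)

def nadaraya_watson(x0, x, y, bandwidth):
    # Locally weighted average of y; with 0/1 labels this estimates
    # P(Y = 1 | x = x0). Small epsilon guards against empty weights.
    w = gaussian_kernel((x - x0) / bandwidth)
    return np.sum(w * y) / (np.sum(w) + 1e-12)

def loo_cv(x, y, bandwidth):
    # Leave-one-out cross-validation score for choosing lambda:
    # predict each point from all the others and average the errors.
    n = len(x)
    idx = np.arange(n)
    errs = [
        (y[i] - nadaraya_watson(x[i], x[idx != i], y[idx != i], bandwidth)) ** 2
        for i in range(n)
    ]
    return float(np.mean(errs))
```

Evaluating `loo_cv` over a grid of candidate bandwidths and taking the minimizer is the cross-validation selection of λ the abstract describes.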