Advances in Information and Communication Technology (ICT) over the previous decades have significantly changed how people transmit and store their information over the Internet and networks. Consequently, one of the main challenges is keeping this information safe against attacks. Many researchers and institutions have realized the importance and benefits of cryptography in achieving efficient and effective secure communication. This work adopts a novel technique for a secure data cryptosystem based on chaos theory. The proposed algorithm generates a 2-dimensional key matrix with the same dimensions as the original image, filled with random numbers obtained from the 1-dimensional logistic chaotic map for given control parameters; the fractional parts of these numbers are then converted through a function into a set of non-repeating numbers, which yields a vast number of unpredictable possibilities (the factorial of rows times columns). Double layers of row and column permutation are applied to the values for a specified number of stages. Then, XOR is performed between the key matrix and the original image, which provides an effective means of data encryption for any type of file (text, image, audio, video, etc.). The results proved the proposed encryption technique very promising when tested on more than 500 image samples according to security measurements: the histograms of cipher images are much flatter than those of the original images, the average Mean Square Error is very high (10115.4) and the Peak Signal-to-Noise Ratio very low (8.17), the correlation is near zero, and the entropy is close to 8 (7.9975).
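The keystream-plus-XOR pipeline described above can be sketched as follows. This is a minimal NumPy illustration of the idea (logistic map, non-repeating ranks, row/column permutation layers, then XOR), not the authors' exact implementation; the control parameters `r = 3.99` and `x0 = 0.4` and the rank-based permutation rule are assumptions for illustration.

```python
import numpy as np

def logistic_keystream(r, x0, n):
    """Iterate the 1-D logistic map x_{k+1} = r * x_k * (1 - x_k)."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        xs[i] = x
    return xs

def chaotic_key(rows, cols, r=3.99, x0=0.4):
    """Build a 2-D key matrix with the image's dimensions from chaotic values."""
    stream = logistic_keystream(r, x0, rows * cols)
    # Rank the fractional chaotic values: argsort maps them to a set of
    # non-repeating integers (one of (rows*cols)! possible permutations).
    perm = np.argsort(stream)
    key = (perm % 256).astype(np.uint8).reshape(rows, cols)
    # Two permutation layers: reorder rows, then columns, by chaotic rank.
    key = key[np.argsort(key.sum(axis=1)), :]
    key = key[:, np.argsort(key.sum(axis=0))]
    return key

key = chaotic_key(4, 4)
img = np.arange(16, dtype=np.uint8).reshape(4, 4)  # toy "image"
cipher = img ^ key   # XOR layer encrypts ...
plain = cipher ^ key  # ... and XORing with the same key decrypts
```

Because XOR is its own inverse, decryption only requires regenerating the same key matrix from the shared control parameters.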
Semantic segmentation is an active research topic in medical image analysis because it aims to delineate objects in medical images. In recent years, approaches based on deep learning have shown more reliable performance than traditional approaches in medical image segmentation. The U-Net network is one of the most successful end-to-end convolutional neural networks (CNNs) proposed for medical image segmentation. This paper proposes a multiscale residual dilated convolution neural network (MSRD-UNet) based on U-Net. MSRD-UNet replaces the traditional convolution block with a novel deeper block that fuses multi-layer features using dilated and residual convolutions. In addition, the squeeze-and-excitation attention mechanism (SE) and the s
In this paper, a modified derivation has been introduced to analyze the construction of the C-space. The benefit of using the C-space is to make path planning safer and easier. After obtaining the C-space construction and map for a two-link planar robot arm, which includes all possible collision situations between the robot parts and the obstacle(s), the A* algorithm, usually used to find a heuristic path in the Cartesian W-space, has been used to find a heuristic path on the C-space map. Several modifications are needed to apply the methodology to a manipulator with more than two degrees of freedom. The results of the C-space map derived by the modified analysis prove the accuracy of the overall C-space mapping and cons
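Running A* over a C-space map can be sketched as below. This is a standard grid A* with a Manhattan heuristic, treating the map as a binary grid where 1 marks a configuration in collision; it is not the paper's modified formulation, and the example grid, start, and goal are illustrative.

```python
import heapq

def a_star(grid, start, goal):
    """A* over a binary C-space map: 0 = free configuration, 1 = collision."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), 0, start, None)]  # (f, g, node, parent)
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:          # already expanded with a better cost
            continue
        came_from[cur] = parent
        if cur == goal:               # walk parents back to reconstruct the path
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc), cur))
    return None  # goal unreachable in this C-space

cspace = [[0, 0, 0],
          [1, 1, 0],   # obstacle region mapped into C-space
          [0, 0, 0]]
path = a_star(cspace, (0, 0), (2, 0))
```

The same search applies unchanged to higher-dimensional C-space grids once the neighbour generation is extended per joint.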
Optical Mark Recognition (OMR) is the technology of electronically extracting intended data from marked fields, such as square and bubble fields, on printed forms. OMR technology is particularly useful for applications in which large numbers of hand-filled forms need to be processed quickly and with a high degree of accuracy. The technique is particularly popular with schools and universities for reading in multiple-choice exam papers. This paper proposes OMR based on Modify Multi-Connect Architecture (MMCA) associative memory, which works in two phases: a training phase and a recognition phase. The proposed method was also able to detect more than one or no selected choice. Among 800 test samples with 8 types of grid answer sheets and tota
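The abstract's ability to flag unanswered and over-marked questions can be illustrated with a simple threshold rule. This sketch stands in for the decision step only and is not the MMCA associative memory itself; the `fill_ratios` input and the 0.5 threshold are assumptions.

```python
def read_question(fill_ratios, threshold=0.5):
    """Classify one multiple-choice question from per-bubble fill ratios.

    fill_ratios: fraction of dark pixels in each choice's bubble region
    (the upstream image segmentation is assumed). Returns the chosen
    index, or "none"/"multiple" for unanswered and over-marked questions,
    mirroring the detection the abstract describes.
    """
    marked = [i for i, r in enumerate(fill_ratios) if r >= threshold]
    if not marked:
        return "none"
    if len(marked) > 1:
        return "multiple"
    return marked[0]
```

A full pipeline would feed these ratios from scanned answer-sheet regions and pass ambiguous cases to the recognition phase.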
The objective of the present study is to verify the actual carious lesion depth by the laser fluorescence technique using a 650 nm CW diode laser in comparison with histopathological investigation. Five permanent molar teeth were extracted from adult individuals for different reasons (tooth impaction, periodontal diseases, and pulp infections); their ages ranged from 20 to 25 years. Different carious teeth with varying clinical stages of caries progression were examined. An experimental laser fluorescence set-up was built to perform the work regarding in vitro detection and quantification of occlusal dental caries and the determination of its actual clinical carious lesion depth by a 650 nm CW diode laser (excitat
In this work, functionally graded materials were synthesized by the centrifugal technique at different volume fractions (0.5, 1, 1.5, and 2% Vf) with a rotation speed of 1200 rpm and a constant rotation time, T = 6 min. The mechanical properties were characterized to study the graded and non-graded nanocomposites and the pure epoxy material. The mechanical tests showed that both the graded and non-graded composites with added alumina (Al2O3) nanoparticles outperformed pure epoxy. The maximum difference in impact strength occurred in the FGM loaded from the nano-alumina-rich side, where the maximum value was at 1% Vf, 133.33% higher than the epoxy side of the sample. The flexural strength and Young's modulus of the fu
The Hartha Formation is an overburden horizon in the X-oilfield that generates a great deal of Non-Productive Time (NPT) associated with drilling mud losses. This study was conducted to investigate the loss events in this formation and to provide geological interpretations based on datasets from nine wells in the field of interest. The interpretation was based on different analyses, including wireline logs, cuttings descriptions, image logs, and analog data. Seismic and coherency data were also used to formulate the geological interpretations and calibrate them against the loss events of the Hartha Fm.
The results identified the upper part of the Hartha Fm. as an interval capable of creating potentia
Recommender systems are tools for making sense of the huge amount of data available in the Internet world. Collaborative filtering (CF) is one of the knowledge discovery methods most successfully used in recommendation systems. Memory-based collaborative filtering relies on facts about existing users to predict new items for the target user. Similarity measures are the core operations in collaborative filtering, and prediction accuracy depends mostly on the similarity calculations. In this study, a combination of weighted parameters and traditional similarity measures is used to calculate relationships among users over the MovieLens data set rating matrix. The advantages and disadvantages of each measure are spotted. From the study, a n
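The memory-based CF step above can be sketched as follows. The cosine measure is one of the traditional similarities the study refers to; `weighted_sim`'s co-rating blend and the `alpha = 0.7` mix are hypothetical stand-ins for the paper's weighted parameters, shown only to illustrate the combination idea.

```python
import math

def cosine_sim(a, b):
    """Cosine similarity between two users' rating dicts, over co-rated items."""
    common = [i for i in a if i in b]
    if not common:
        return 0.0
    num = sum(a[i] * b[i] for i in common)
    den = (math.sqrt(sum(a[i] ** 2 for i in common))
           * math.sqrt(sum(b[i] ** 2 for i in common)))
    return num / den if den else 0.0

def weighted_sim(a, b, alpha=0.7):
    """Blend a traditional measure with a co-rating weight (hypothetical scheme)."""
    common = sum(1 for i in a if i in b)
    weight = common / min(len(a), len(b)) if a and b else 0.0
    return alpha * cosine_sim(a, b) + (1 - alpha) * weight

def predict(target, neighbours, item):
    """Similarity-weighted average of neighbour ratings for `item`."""
    num = den = 0.0
    for nb in neighbours:
        if item in nb:
            s = weighted_sim(target, nb)
            num += s * nb[item]
            den += abs(s)
    return num / den if den else None

u1 = {1: 5, 2: 3}                # target user: ratings by item id
u2 = {1: 5, 2: 3, 3: 4}          # close neighbour
u3 = {1: 1, 2: 1, 3: 2}          # dissimilar neighbour
p = predict(u1, [u2, u3], 3)     # predicted rating of item 3 for u1
```

On a real rating matrix such as MovieLens, the same functions run over the set of nearest neighbours rather than all users.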
The present work describes guggul as a novel carrier for some anti-inflammatory drugs. Guggulusomes containing different concentrations of guggul with aceclofenac were prepared by the sonication method and characterized for vesicle shape, size, size distribution, pH, viscosity, spreadability, homogeneity, accelerated stability, and in-vitro drug permeation through mouse skin. The vesicles exhibited an entrapment efficiency of 93.2 ± 12%, a vesicle size of 0.769 ± 3 μm, and a zeta potential of -6.21 mV. In vitro drug release was analyzed using Franz diffusion cells. The cumulative release of the guggulusome gel (G2) was 75.8% in 18 hrs, which is greater than that of all the other gel formulations. The stability profile of prepare
The study aims to build a water quality index that fits Iraqi aquatic systems and reflects the environmental reality of Iraqi water. The developed Iraqi Water Quality Index (IQWQI) includes physical and chemical components. To build the IQWQI, the Delphi method was used to communicate with local and global experts on water quality indices for their opinions regarding the best and most important parameters to use in building the index, and to establish the weight of each parameter. Of the data obtained in this study, 70% were used for building the model and 30% for evaluating it. Multiple scenarios were applied to the model inputs to study the effects of increasing the number of parameters. The model was built 4 by 4 until it reached 17 parame
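The expert-weighted aggregation behind such indices can be illustrated as follows. The weighted-arithmetic form, the parameter list, and the Delphi weights shown are assumptions for illustration only, not the actual IQWQI formula.

```python
def weighted_index(sub_indices, weights):
    """Weighted-arithmetic aggregation commonly used in water quality indices.

    sub_indices: per-parameter quality ratings q_i on a 0-100 scale
    weights: expert-assigned weights w_i (e.g. from a Delphi survey)
    Returns sum(w_i * q_i) / sum(w_i).
    """
    assert len(sub_indices) == len(weights)
    total_w = sum(weights)
    return sum(q * w for q, w in zip(sub_indices, weights)) / total_w

# Hypothetical parameters and Delphi weights, for illustration only.
q = {"pH": 85, "DO": 70, "TDS": 60, "Turbidity": 90}
w = {"pH": 3, "DO": 5, "TDS": 4, "Turbidity": 2}
iqwqi = weighted_index(list(q.values()), list(w.values()))
```

Adding a parameter means adding one (q_i, w_i) pair, which is how such a model can grow step by step toward its full parameter set.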