An oil spill is the release of petroleum products into the marine environment or onto land from pipelines, vessels, oil rigs, or tankers, whether caused naturally or by human action, and it results in severe environmental damage and financial loss. Satellite imagery is one of the most powerful tools currently used to capture vital information about the Earth's surface, but the complexity and sheer volume of the data make it challenging and time-consuming for humans to process. With the advancement of deep learning techniques, these processes can now be automated to extract vital information from real-time satellite images. This paper applies three deep learning algorithms for satellite image classification: ResNet50, VGG19, and InceptionV4. They were trained and tested on an open-source satellite image dataset to analyze their efficiency and performance, comparing classification accuracy, precision, recall, and F1-score. The results show that InceptionV4 achieves the best classification accuracy of 97% across the cloudy, desert, green-area, and water classes, followed by VGG19 with approximately 96% and ResNet50 with 93%. The findings indicate that the InceptionV4 algorithm is suitable for classifying oil spill and no-spill satellite images on a validated dataset.
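The accuracy, precision, recall, and F1-score compared in the abstract can be computed directly from predicted and true class labels. The sketch below uses hypothetical labels for the four classes mentioned; it is not the paper's evaluation code.

```python
def per_class_metrics(y_true, y_pred, label):
    """Precision, recall, and F1 for a single class label."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == label and p == label)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != label and p == label)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == label and p != label)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Hypothetical labels for a six-image evaluation set.
truth = ["cloudy", "desert", "green", "water", "water", "desert"]
preds = ["cloudy", "desert", "green", "water", "desert", "desert"]

accuracy = sum(t == p for t, p in zip(truth, preds)) / len(truth)
p, r, f1 = per_class_metrics(truth, preds, "desert")
```

Per-class scores like these, averaged over the four classes, give the macro figures a paper of this kind typically reports.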
Semantic segmentation is an active research topic in medical image analysis because it aims to delineate objects in medical images. In recent years, approaches based on deep learning have shown more reliable performance than traditional approaches in medical image segmentation. The U-Net network is one of the most successful end-to-end convolutional neural networks (CNNs) proposed for medical image segmentation. This paper proposes a multiscale residual dilated convolutional neural network (MSRD-UNet) based on U-Net. MSRD-UNet replaces the traditional convolution block with a novel, deeper block that fuses multi-layer features using dilated and residual convolutions. In addition, the squeeze-and-excitation (SE) attention mechanism and the s
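The two building blocks the abstract names, dilated convolution and residual connections, can be illustrated in one dimension without a deep learning framework. This is a conceptual sketch only, not the MSRD-UNet block itself, and the kernel and signal values are made up.

```python
def dilated_conv1d(signal, kernel, dilation):
    """Valid-mode 1-D convolution with a dilation factor: each kernel tap
    samples the input `dilation` steps apart, enlarging the receptive field
    without adding parameters."""
    span = (len(kernel) - 1) * dilation          # receptive field minus one
    return [
        sum(kernel[k] * signal[i + k * dilation] for k in range(len(kernel)))
        for i in range(len(signal) - span)
    ]

def residual_block(signal, kernel, dilation):
    """Residual connection: add the (center-cropped) input back onto the
    convolution output, so the block learns a correction to identity."""
    out = dilated_conv1d(signal, kernel, dilation)
    offset = (len(signal) - len(out)) // 2       # crop the identity path
    return [x + y for x, y in zip(out, signal[offset:offset + len(out)])]

x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = dilated_conv1d(x, [1.0, -1.0], dilation=2)   # differences two steps apart
rb = residual_block(x, [1.0, -1.0], dilation=2)
```

In the real network these operations run on 2-D feature maps with learned kernels; the dilation rate is what lets a block fuse multi-scale context cheaply.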
The study aimed to find the best classification of observations and variables into groups characterized by common qualities and characteristics within each group that distinguish it from other groups, for the purpose of distinguishing the Iraqi provinces that suffer from deprivation and of identifying the status of those provinces early, allowing interested parties and regulators to intervene and take appropriate corrective action in a timely manner. Cluster analysis was used to reach the best classification of the provinces that suffer from problems, where the provinces were classified based on the variables (Edu
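Cluster analysis of the kind described can be sketched with a minimal k-means on a single indicator. The deprivation scores below are invented for illustration; an actual study would cluster on several socio-economic variables at once.

```python
def kmeans_1d(values, k, iters=20):
    """A minimal k-means on one numeric indicator (requires k >= 2)."""
    # Seed centers from evenly spaced sorted values (simple deterministic init).
    srt = sorted(values)
    centers = [srt[i * (len(srt) - 1) // (k - 1)] for i in range(k)]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            # Assign each value to its nearest center.
            j = min(range(k), key=lambda c: abs(v - centers[c]))
            clusters[j].append(v)
        # Recompute each center as the mean of its cluster.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# Hypothetical deprivation scores for six provinces.
scores = [0.1, 0.15, 0.2, 0.7, 0.8, 0.9]
centers, clusters = kmeans_1d(scores, k=2)
```

The two resulting groups separate the low-deprivation from the high-deprivation provinces, which is the kind of partition the study uses to flag where intervention is needed.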
Abstract
The current research aims to analyze the questions of the literary criticism textbook for the preparatory stage according to Bloom's taxonomy. The research population consists of (34) exercises and (45) questions. The researcher used the question-analysis method and prepared a preliminary list of criteria intended to measure the exercises, selected on the basis of Bloom's taxonomy and the extant literature on the topic. The criteria were presented to a jury of experts and specialists in curricula and methods of teaching the Arabic language, and they obtained complete agreement; thus, the list was adapted to become a reliable instrument in this
Recommender systems are tools for making sense of the huge amount of data available on the internet. Collaborative filtering (CF) is one of the knowledge-discovery methods used most successfully in recommender systems. Memory-based collaborative filtering focuses on using facts about existing users to predict new items for the target user. Similarity measures are the core operations in collaborative filtering, and prediction accuracy depends largely on the similarity calculations. In this study, a combination of weighted parameters and traditional similarity measures is used to calculate relationships among users over the MovieLens dataset rating matrix. The advantages and disadvantages of each measure are highlighted. From the study, a n
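A minimal sketch of the core operation the abstract describes: a traditional similarity measure (cosine over co-rated items), plus one simple way a weighting parameter might be combined with it. The rating vectors and the weighting scheme are illustrative assumptions, not the study's actual formulation.

```python
from math import sqrt

def cosine_sim(u, v):
    """Cosine similarity over co-rated items only (0 marks 'not rated')."""
    pairs = [(a, b) for a, b in zip(u, v) if a and b]
    if not pairs:
        return 0.0
    num = sum(a * b for a, b in pairs)
    den = sqrt(sum(a * a for a, _ in pairs)) * sqrt(sum(b * b for _, b in pairs))
    return num / den

def weighted_sim(u, v):
    """Scale the classic measure by the fraction of co-rated items, so users
    with little overlap are trusted less (one illustrative weighting)."""
    co = sum(1 for a, b in zip(u, v) if a and b)
    return (co / len(u)) * cosine_sim(u, v)

alice = [5, 3, 0, 4]   # hypothetical ratings on four movies; 0 = unrated
bob   = [5, 3, 0, 4]
s = cosine_sim(alice, bob)
w = weighted_sim(alice, bob)
```

Plain cosine reports the two users as identical; the co-rating weight discounts the score because they overlap on only three of four items, which is the kind of trade-off the study compares across measures.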
The denoising of a natural image corrupted by Gaussian noise is a classic problem in signal and image processing. Much work has been done in the field of wavelet thresholding, but most of it has focused on statistical modeling of wavelet coefficients and the optimal choice of thresholds. This paper describes a new method for the suppression of noise in images that fuses stationary wavelet denoising with an adaptive Wiener filter. The Wiener filter is applied to the approximation coefficients of the reconstructed image only, while thresholding is applied to the detail coefficients of the transform; the final denoised image is obtained by combining the two results. The proposed method was applied by usin
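The thresholding half of that pipeline can be sketched with a one-level Haar transform on a 1-D signal. The paper uses a stationary wavelet transform plus a Wiener filter on the approximation band; this sketch shows only soft thresholding of the detail coefficients, with made-up signal values.

```python
def haar_denoise(signal, threshold):
    """One-level Haar decomposition, soft-threshold the detail coefficients,
    then invert the transform. Assumes an even-length signal."""
    approx = [(a + b) / 2 for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) / 2 for a, b in zip(signal[::2], signal[1::2])]
    # Soft thresholding: shrink small (noise-dominated) details toward zero.
    detail = [max(abs(d) - threshold, 0.0) * (1 if d >= 0 else -1)
              for d in detail]
    out = []
    for a, d in zip(approx, detail):
        out += [a + d, a - d]           # inverse Haar step
    return out

noisy = [1.0, 1.2, 1.1, 0.9, 5.0, 5.1, 4.9, 5.2]
clean = haar_denoise(noisy, threshold=0.15)
```

Small pairwise fluctuations are flattened while the large step between the two signal levels survives; with `threshold=0.0` the function reconstructs its input exactly, confirming the transform pair is correct.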
This research deals with the attitude of the oil press towards the oil industry in the world and the extent of its concern with the stages of the oil industry relating to the abundance of oil and natural gas, as it is a strategic and complementary international industry. The researcher uses the survey method for content analysis of the lead articles and press news over two years (2011-2012). The results of the study are as follows:
1- The oil press is concerned with the development and stages of the Arab oil industry, serving the interests of OAPEC in the first place.
2- It is concerned with exploring, extracting, and marketing oil in the first place, and then with refining operations in refineries and petrochemical plants in the second place, an
In this research, a proposed technique is used to enhance the performance of the frame difference technique for extracting moving objects from a video file. One of the most significant causes of performance degradation is the presence of noise, which may cause incorrect identification of moving objects, so it was necessary to find a way to diminish this noise effect. Traditional average and median spatial filters can be used to handle such situations, but this work focuses on the spectral domain, using Fourier and wavelet transforms to decrease the noise effect. Experiments and statistical features (entropy, standard deviation) showed that these transforms can overcome such problems in an elegant way.
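The baseline frame difference step can be sketched as a thresholded absolute difference between consecutive frames. The two tiny grayscale frames below are invented; the paper's contribution (Fourier/wavelet pre-filtering) would run before this step to suppress the noise that otherwise produces spurious motion pixels.

```python
def frame_difference(prev, curr, threshold):
    """Binary motion mask: 1 where the absolute pixel difference between
    consecutive frames exceeds the threshold, else 0."""
    return [
        [1 if abs(c - p) > threshold else 0 for p, c in zip(prow, crow)]
        for prow, crow in zip(prev, curr)
    ]

# Two hypothetical 3x4 grayscale frames; the bright object moves one pixel right.
frame1 = [[10, 200, 10, 10],
          [10, 200, 10, 10],
          [10,  10, 10, 10]]
frame2 = [[10, 10, 200, 10],
          [10, 10, 200, 10],
          [10, 10,  10, 10]]
mask = frame_difference(frame1, frame2, threshold=30)
```

The mask marks both the vacated and the newly occupied columns; noisy pixels that fluctuate past the threshold would appear as isolated false positives, which is exactly the failure mode the pre-filtering targets.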
In this study, dynamic encryption techniques are explored as an image cipher method to generate S-boxes similar to AES S-boxes with the help of a private key belonging to the user, enabling images to be encrypted and decrypted using those S-boxes. The study consists of two stages: the dynamic S-box generation method and the encryption-decryption method. S-boxes should have a non-linear structure, and for this reason the Knuth-Durstenfeld shuffle algorithm, one of the pseudo-random techniques, is used to generate S-boxes dynamically. The biggest advantage of this approach is that the inverse S-box is produced together with the S-box; compared to methods in the literature, the need to store the S-box is eliminated. Also, the fabr
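A minimal sketch of key-seeded S-box generation with the Durstenfeld (Fisher-Yates) shuffle. The key-to-seed mapping here is an illustrative assumption; the paper derives the shuffle order from the user's private key in its own way.

```python
import random

def generate_sbox(key: bytes):
    """Key-seeded Durstenfeld shuffle of 0..255, yielding a bijective S-box
    and its inverse in the same pass."""
    rng = random.Random(key)                 # hypothetical key-to-seed mapping
    sbox = list(range(256))
    for i in range(255, 0, -1):              # Durstenfeld: swap down from the end
        j = rng.randrange(i + 1)
        sbox[i], sbox[j] = sbox[j], sbox[i]
    inv = [0] * 256
    for i, s in enumerate(sbox):
        inv[s] = i                           # inverse S-box comes for free
    return sbox, inv

sbox, inv = generate_sbox(b"user-private-key")
```

Because the shuffle is a deterministic function of the key, both parties can regenerate the S-box and its inverse on demand, which is why no S-box storage is needed.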
Securing multimedia data has grown in importance over the last few decades in order to safeguard multimedia content from unwanted users. Generally speaking, a number of methods have been employed to hide important visual data from eavesdroppers, one of which is chaotic encryption. This review article examines chaotic encryption methods currently in use, highlighting their benefits and drawbacks in terms of their applicability to image security.
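The common core of many chaotic image ciphers is a keystream derived from a chaotic map, XORed with the pixel bytes. The sketch below uses the logistic map with illustrative parameters; real schemes add permutation stages and more careful key handling.

```python
def logistic_keystream(x0, n, r=3.99):
    """Generate n keystream bytes from the logistic map x -> r*x*(1-x),
    which behaves chaotically for r near 4. Parameters are illustrative."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)       # quantize the state to a byte
    return out

def xor_cipher(data, key_x0):
    """XOR bytes with the chaotic keystream; the same call decrypts."""
    ks = logistic_keystream(key_x0, len(data))
    return bytes(b ^ k for b, k in zip(data, ks))

pixels = bytes([12, 200, 7, 99, 150])        # hypothetical image bytes
enc = xor_cipher(pixels, key_x0=0.3141)
dec = xor_cipher(enc, key_x0=0.3141)
```

The sensitivity of the map to its initial value `x0` is what makes the key space effective, and it is also a known weakness: floating-point implementations can diverge across platforms, one of the drawbacks such reviews typically discuss.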
Change detection is a technique for ascertaining the changes in specific features within a certain time interval. The use of remotely sensed images to detect changes in land use and land cover is widely preferred over conventional survey techniques because this method is very efficient for assessing the change or degradation trends of a region. In this research, two remotely sensed images of Baghdad city, gathered by Landsat-7 ETM+ and Landsat-8 for the two time periods 2000 and 2014, have been used to detect the most important changes. Registration and rectification of the two original images were the first preprocessing steps applied in this paper. Change detection using subtractive NDVI has been computed, subtrac
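The subtractive NDVI step can be sketched directly from its definition, NDVI = (NIR - Red) / (NIR + Red), differenced between the two dates. The reflectance values below are hypothetical; real inputs would be the calibrated red and near-infrared bands of the two Landsat scenes.

```python
def ndvi(red, nir):
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel."""
    return [
        [(n - r) / (n + r) if (n + r) else 0.0 for r, n in zip(rrow, nrow)]
        for rrow, nrow in zip(red, nir)
    ]

def ndvi_change(red_t1, nir_t1, red_t2, nir_t2):
    """Subtractive change map: NDVI(t2) - NDVI(t1). Negative values suggest
    vegetation loss between the two dates."""
    a, b = ndvi(red_t1, nir_t1), ndvi(red_t2, nir_t2)
    return [[y - x for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

# Hypothetical reflectances for a 1x2 scene: a vegetated pixel and bare soil.
change = ndvi_change([[0.1, 0.3]], [[0.5, 0.3]],   # first date: red, NIR
                     [[0.3, 0.3]], [[0.3, 0.3]])   # second date: red, NIR
```

The first pixel's strongly negative change indicates vegetation loss, while the unchanged bare-soil pixel stays near zero; thresholding such a map yields the change classes the study reports.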