This article describes how to predict different types of multiple reflections in pre-stack seismic data. The characteristics of multiple reflections can be expressed as combinations of the characteristics of primary reflections. Multiple velocities are always lower in magnitude than primary velocities, and this is the basis for separating them during Normal Move Out (NMO) correction. The muting procedure is applied in the time-velocity analysis domain, and the semblance plot is used to diagnose the presence of multiples and to judge the muting dimensions. This processing procedure is used to eliminate internal multiples from real 2D seismic data from southern Iraq in two stages: the first is conventional NMO correction with velocity auto-picking and stacking, and the second is muting. Many Common Depth Point gathers are tested to select the proper muting dimensions; the auto-pick on the muted semblance is then applied to the whole 2D seismic dataset, and the NMO-corrected data are stacked. Differences calculated between the two stages greatly help to locate the eliminated multiples within the sedimentary succession. This reduces the risk of interpreting those events as primary reflectors, especially within deep thin layers. The Madagascar open-source package is used for these processing steps; it is efficient, accurate, and makes it easy to modify any part of the Python code used in the two processing stages.
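The velocity contrast that this separation relies on can be sketched numerically. The following is an illustrative Python sketch only (it is not part of the described Madagascar workflow, and the times, offset, and velocities are hypothetical): it evaluates the hyperbolic moveout t(x) = sqrt(t0^2 + x^2/v^2) for a primary and a multiple with the same zero-offset time, showing that the lower-velocity multiple retains residual moveout after NMO correction with the primary velocity, which is what makes it separable in the semblance domain.

```python
import math

def nmo_time(t0, offset, velocity):
    """Hyperbolic moveout: t(x) = sqrt(t0^2 + x^2 / v^2)."""
    return math.sqrt(t0**2 + (offset / velocity)**2)

# Hypothetical values (not taken from the surveyed data): a primary and a
# multiple arriving at the same zero-offset time, with the multiple
# carrying the lower stacking velocity that the abstract describes.
t0 = 1.5             # zero-offset two-way time, s
offset = 2000.0      # source-receiver offset, m
v_primary = 2500.0   # stacking velocity of the primary, m/s
v_multiple = 1800.0  # lower velocity typical of a multiple, m/s

t_primary = nmo_time(t0, offset, v_primary)
t_multiple = nmo_time(t0, offset, v_multiple)

# After NMO correction with the primary velocity the primary flattens,
# while the multiple keeps residual moveout and can be muted in the
# time-velocity (semblance) domain.
residual = t_multiple - t_primary
print(round(t_primary, 3), round(t_multiple, 3), round(residual, 3))
```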
Using remote sensing technology and modeling methodologies to monitor changes in land surface temperature (LST) and urban heat islands (UHI) has become an essential reference for decisions on sustainable land use. This study estimates LST and UHI in Salah al-Din Province to contribute to land management, urban planning, and climate resilience in the region. In response to the environmental changes of recent years, Landsat satellite imagery from 2014 to 2024 was used to estimate the LST and UHI indices for the province, ArcGIS 10.7 was used to calculate the indices, and the Normalized Difference Vegetation Index (NDVI) was calculated because it is closely related to extracting LST.
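The NDVI used in workflows like this one is the standard ratio (NIR − Red) / (NIR + Red). The sketch below shows that calculation in Python/NumPy; the 2×2 reflectance arrays are toy values standing in for the actual Landsat bands (for Landsat 8, NIR is band 5 and Red is band 4), not data from the study.

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red), values fall in [-1, 1]."""
    nir = np.asarray(nir, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    # Small epsilon guards against divide-by-zero on dark pixels.
    return (nir - red) / (nir + red + 1e-10)

# Toy reflectance values; real inputs come from the satellite imagery.
nir = np.array([[0.5, 0.4], [0.3, 0.6]])
red = np.array([[0.1, 0.2], [0.3, 0.1]])
print(ndvi(nir, red))
```

Dense vegetation (high NIR, low Red) pushes NDVI toward 1, which is why it correlates with the cooler, low-LST areas in an urban heat island map.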
Samples of compact magnesia and alumina were evaporated using a CO2 laser. The processed powders were characterized by both scanning and transmission electron microscopy. The results indicated that the particle sizes of both powders were greatly reduced, to 0.003 nm for MgO and 0.07 nm for Al2O3, with increased sphericity of shape.
Microalgae have been increasingly used for wastewater treatment due to their capacity to assimilate nutrients. Samples of wastewater were taken from the Erbil wastewater channel near Dhahibha village in northern Iraq. The microalga Coelastrella sp. was applied at three doses (0.2, 1, and 2 g/L) in this 21-day experiment, and samples were analyzed periodically (every 3 days) for physicochemical parameters such as pH, EC, phosphate, nitrate, and BOD5, in addition to chlorophyll a concentration. The results showed that the highest dose, 2 g/L, was the most effective for removing nutrients, confirmed by significant differences (p ≤ 0.05) between all doses. The highest removal percentage was
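Removal percentages of the kind reported here are conventionally computed as the relative drop from the initial concentration. A minimal sketch of that formula, with hypothetical concentrations rather than the study's measured values:

```python
def removal_percent(c_initial, c_final):
    """Removal efficiency, % = (C0 - Ct) / C0 * 100."""
    return (c_initial - c_final) / c_initial * 100.0

# Hypothetical nitrate concentrations (mg/L), only to illustrate the
# formula; the measured values are reported in the study itself.
print(removal_percent(10.0, 2.5))  # 75.0
```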
A skip list data structure is essentially a simulation of a binary search tree. Skip list algorithms are simpler and faster, and they use less space; the data structure conceptually uses parallel sorted linked lists. Searching in a skip list is faster than searching in a regular sorted linked list. Because a skip list is a two-dimensional data structure, it is implemented using a two-dimensional network of nodes with four pointers. The search, insert, and delete operations each take O(log n) expected time. The skip list can be modified to implement the order-statistic operations RANK and SEARCH BY RANK while maintaining the same expected time. Keywords: skip list, parallel linked list, randomized algorithm, rank.
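The parallel-sorted-lists idea can be made concrete with a small sketch. The Python below is a minimal, illustrative skip list (not the implementation described in the abstract) using the classic randomized-level scheme with promotion probability 1/2: search descends from the highest level, moving right while the next key is smaller, which is what yields the O(log n) expected time.

```python
import random

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)  # one forward pointer per level

class SkipList:
    MAX_LEVEL = 16
    P = 0.5  # probability of promoting a node one level up

    def __init__(self):
        self.head = Node(None, self.MAX_LEVEL)  # sentinel head node
        self.level = 0

    def _random_level(self):
        lvl = 0
        while random.random() < self.P and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def search(self, key):
        node = self.head
        # Descend level by level, moving right while the next key < target.
        for i in range(self.level, -1, -1):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node is not None and node.key == key

    def insert(self, key):
        # Record the rightmost node visited at each level ("update" path).
        update = [self.head] * (self.MAX_LEVEL + 1)
        node = self.head
        for i in range(self.level, -1, -1):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = Node(key, lvl)
        for i in range(lvl + 1):  # splice the new node into each level
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new

sl = SkipList()
for k in [3, 7, 1, 9, 5]:
    sl.insert(k)
print(sl.search(7), sl.search(4))  # True False
```

Supporting RANK and SEARCH BY RANK, as the abstract mentions, is typically done by storing a "span" (number of level-0 links skipped) on each forward pointer, without changing the expected time.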
The aesthetic content of data visualization is one of the contemporary areas through which data scientists and designers have been able to link data to humans. Even after successful attempts to model data visualization were reached, it was not clear how aesthetic content contributed as an input to humanizing these models. The goal of the current research is therefore to use the descriptive analytical approach to identify the aesthetic contents of data visualization, which the researchers interpret through pragmatic philosophy and Kantian philosophy, and to analyze a sample of data visualization models to reveal their aesthetic entry points and explain how to humanize them. The two researchers reached seve
Cloud computing provides a huge amount of space for data storage, but as the number of users and the size of their data grow, the cloud storage environment faces serious problems such as saving storage space, managing this large volume of data, and ensuring the security and privacy of data. One important method of saving space in cloud storage is data deduplication, a compression technique that keeps only one copy of the data and eliminates the extra copies. To offer security and privacy for sensitive data while supporting deduplication, this work identifies attacks that exploit hybrid-cloud deduplication, allowing an attacker to gain access to the files of other users based on very small hash signatures of
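The core deduplication mechanism the abstract describes can be sketched as a content-addressed store: each blob is keyed by a hash of its content, so identical uploads are stored once. The Python below is a toy illustration (class and method names are invented for this sketch, not from the paper); it uses a full SHA-256 digest, whereas the attacks the abstract identifies arise when very small hash signatures are accepted as proof of possessing a file.

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: identical blobs are stored once."""

    def __init__(self):
        self.blobs = {}  # full SHA-256 digest -> data
        self.files = {}  # filename -> digest

    def put(self, name, data):
        digest = hashlib.sha256(data).hexdigest()
        if digest not in self.blobs:  # store only the first copy
            self.blobs[digest] = data
        self.files[name] = digest     # later uploads just add a reference

    def get(self, name):
        return self.blobs[self.files[name]]

store = DedupStore()
store.put("a.txt", b"same payload")
store.put("b.txt", b"same payload")  # deduplicated: no new blob stored
print(len(store.blobs))  # 1
```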
Information systems and data exchange between government institutions are growing rapidly around the world, and with them, the threats to information within government departments are growing. In recent years, research into the development and construction of secure information systems in government institutions has proven very effective. Based on information-system principles, this study proposes a model for providing and evaluating security for all departments of government institutions. The requirements of any information system begin with the organization's surroundings and objectives. Most prior techniques did not take into account the organizational component on which the information system runs, despite the relevance of