Multiple elimination (de-multiple) is a seismic processing step that removes multiple reflections so that the true primary reflectors can be delineated. Applying normal moveout (NMO) to flatten the primaries and then transforming the data to the frequency-wavenumber (f-k) domain separates the events: the flattened primaries align along the zero axis of the f-k domain, while all other events (multiples and random noise) are distributed elsewhere. A dip filter that passes the aligned energy and rejects the rest then separates primaries from multiples once the data are transformed back from the f-k domain to the time-distance domain; we therefore refer to this technique as the NMO-f-k domain method for multiple elimination. The method is first tested on a synthetic reflection event to verify its validity, and then applied to real 2D seismic field data (X-profile) from southern Iraq. The results confirm that internal multiples exist in deep reflection data in Iraq and must be removed so that the interpretation of the true reflectors is valid. The final stacked seismic section processed with the NMO-f-k technique shows clearer and sharper reflectors than the conventional NMO stack. All processing steps of this study were carried out with the open-source Madagascar reproducible package, which proved efficient, accurate, and easy to use for the NMO, f-k, and dip-filter programs. The aim of the current study is to separate internal multiples and noise from real 2D seismic data.
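The f-k dip-filtering step described above can be sketched as follows. This is a minimal illustration, not Madagascar's implementation: the function name, the velocity-fan pass criterion, and all parameter values are assumptions. After NMO, flattened primaries map to the zero-wavenumber axis, so a mask that keeps near-zero-dip energy passes them and rejects steeply dipping residual multiples.

```python
import numpy as np

def fk_dip_filter(gather, dt, dx, vmin):
    """Pass near-zero-dip (NMO-flattened) energy in the f-k domain and
    reject events whose apparent velocity falls below vmin (a sketch;
    vmin and the hard fan edge are illustrative choices)."""
    nt, nx = gather.shape
    # 2D FFT: time axis -> frequency f, offset axis -> wavenumber k
    spec = np.fft.fft2(gather)
    f = np.fft.fftfreq(nt, dt)[:, None]   # Hz
    k = np.fft.fftfreq(nx, dx)[None, :]   # cycles per metre
    # Keep components with apparent velocity |f/k| >= vmin; flattened
    # primaries sit at k = 0 and are always passed.
    mask = np.abs(f) >= vmin * np.abs(k)
    return np.real(np.fft.ifft2(spec * mask))
```

A perfectly flat event (zero dip) lies entirely on the k = 0 axis, where the mask is always true, so it passes through the filter unchanged.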
The Adaptive Optics (AO) technique was developed to correct for atmospheric seeing. The purpose of this study is to use MATLAB to investigate the performance of an AO system with one of the most recent AO simulation tools, Object-Oriented MATLAB Adaptive Optics (OOMAO). This was achieved by studying the variables that affect image-quality correction, such as observation wavelength bands, atmospheric parameters, telescope parameters, deformable-mirror parameters, wavefront-sensor parameters, and noise parameters. The results present a detailed analysis of the factors that influence the image-correction process, as well as the impact of the AO components on that process.
Image segmentation can be defined as partitioning a digital image into meaningful regions (segments), each containing image elements that share certain attributes distinguishing them from the pixels that constitute other parts. Two phases of image processing were followed by the researcher in this paper. In the first phase, the images were pre-processed before segmentation using the statistical confidence intervals proposed by Acho & Buenestado in 2018 for estimating unknown observations. In the second phase, the images were segmented using Bernsen's thresholding technique. The researcher concluded that, in the case of utilizing
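Bernsen's thresholding technique mentioned above can be sketched as follows. This is an illustrative implementation of the classic algorithm, not the paper's code: the window size, contrast limit, and global fallback threshold are assumed values. Each pixel is compared against the mid-range of its local window; windows with too little contrast fall back to a global threshold.

```python
import numpy as np

def bernsen_threshold(img, w=15, contrast_min=15, global_t=128):
    """Bernsen local thresholding (sketch). The local threshold is the
    mid-range (min+max)/2 of a w x w window; low-contrast windows are
    classified with a global threshold instead. Parameter values are
    illustrative assumptions."""
    h, wd = img.shape
    r = w // 2
    padded = np.pad(img, r, mode="edge")
    out = np.zeros((h, wd), dtype=np.uint8)
    for i in range(h):
        for j in range(wd):
            win = padded[i:i + w, j:j + w]
            lo, hi = int(win.min()), int(win.max())
            if hi - lo < contrast_min:
                # Homogeneous neighbourhood: use the global threshold.
                out[i, j] = 255 if img[i, j] >= global_t else 0
            else:
                out[i, j] = 255 if img[i, j] >= (lo + hi) // 2 else 0
    return out
```

On an image split into a dark half and a bright half, pixels near the boundary are classified by the local mid-range, while pixels deep inside each half (zero local contrast) are classified by the global threshold.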
Fractal image compression offers desirable properties such as fast image decoding and very good rate-distortion curves, but it suffers from a high encoding time. Fractal image compression requires partitioning the image into range blocks. In this work, we introduce an improved partitioning process based on a merge approach, since some ranges are connected to others. This paper presents a method to reduce the encoding time of this technique by reducing the number of range blocks based on computing statistical measures between them. Experimental results on standard images show that the proposed method decreases the encoding time while keeping the visual quality acceptable.
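The idea of reducing range blocks through statistical measures can be sketched as follows. This is only an illustration under assumed criteria, not the paper's exact method: the mean/standard-deviation tolerances and the right-neighbour merge rule are hypothetical choices. Blocks whose statistics are close are grouped so fewer independent range blocks need a full domain search.

```python
import numpy as np

def merge_ranges(img, block=8, mean_tol=4.0, std_tol=4.0):
    """Sketch of merge-based range reduction: adjacent range blocks with
    similar mean and standard deviation are grouped, so only one
    representative per group is encoded. Tolerances and the
    horizontal-merge rule are illustrative assumptions."""
    h, w = img.shape
    stats = {}
    for i in range(0, h, block):
        for j in range(0, w, block):
            b = img[i:i + block, j:j + block]
            stats[(i, j)] = (b.mean(), b.std())
    groups, merged = [], set()
    for (i, j), (m, s) in stats.items():
        if (i, j) in merged:
            continue
        group = [(i, j)]
        nb = (i, j + block)  # candidate: right-hand neighbour
        if nb in stats and nb not in merged:
            m2, s2 = stats[nb]
            if abs(m - m2) < mean_tol and abs(s - s2) < std_tol:
                group.append(nb)
                merged.add(nb)
        groups.append(group)
    return groups
```

On a uniform image every neighbouring pair matches, so the number of groups (and hence of encoded ranges) drops below the raw block count.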
Fractal image compression represents an image using affine transformations. The main concern for researchers in the discipline of fractal image compression (FIC) is to decrease the encoding time needed to compress image data. The basic principle is that each portion of the image is similar to other portions of the same image. Many models have been developed for this process. The presence of fractals was initially noticed and handled using the Iterated Function System (IFS), which is used for encoding images. In this paper, fractal image compression and its variants are reviewed alongside other techniques. A summarized review of contributions is provided to assess the fulfillment of fractal image compression.
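The core encoding step that all the reviewed FIC variants share can be sketched as follows. This is a minimal, generic illustration (function name and interface are assumptions): for one range block, search the pool of domain blocks for the affine grey-level map r ≈ s·d + o with the smallest squared error, solving for contrast s and brightness o by least squares.

```python
import numpy as np

def encode_block(range_block, domains):
    """Minimal sketch of the core FIC step: find, for one range block,
    the domain block and affine grey-level map r ≈ s*d + o that
    minimise the squared error."""
    best = None
    r = range_block.ravel().astype(float)
    for idx, d in enumerate(domains):
        d = d.ravel().astype(float)
        # Least-squares contrast s and brightness o for r ≈ s*d + o.
        A = np.vstack([d, np.ones_like(d)]).T
        (s, o), _, _, _ = np.linalg.lstsq(A, r, rcond=None)
        err = np.sum((s * d + o - r) ** 2)
        if best is None or err < best[3]:
            best = (idx, s, o, err)
    return best  # (domain index, contrast, brightness, error)
```

The exhaustive loop over domains is exactly what makes encoding slow, which motivates the search-reduction schemes the review covers.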
Information security is a crucial factor when communicating sensitive information between two parties, and steganography is one of the techniques most widely used for this purpose. This paper aims to enhance the capacity and robustness of information hiding by compressing the image data to a small size while maintaining high quality, so that the secret information remains invisible and only the sender and recipient can recognize the transmission. Three techniques are employed to conceal color and gray images: the Wavelet Color Process Technique (WCPT), the Wavelet Gray Process Technique (WGPT), and the Hybrid Gray Process Technique (HGPT). A comparison between the first and second techniques according to quality metrics, Root-Mean-Square Error (RMSE), Compression-
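The wavelet-compression building block underlying the techniques above can be sketched as follows. This is not the paper's WCPT/WGPT pipeline (which is not specified here) but a generic single-level 2D Haar transform with its inverse, plus the RMSE metric used for the comparison; all names are illustrative.

```python
import numpy as np

def haar2d(img):
    """One level of the 2D Haar wavelet transform (rows, then columns)."""
    a = (img[:, 0::2] + img[:, 1::2]) / 2.0   # row averages
    d = (img[:, 0::2] - img[:, 1::2]) / 2.0   # row details
    rows = np.hstack([a, d])
    a = (rows[0::2, :] + rows[1::2, :]) / 2.0  # column averages
    d = (rows[0::2, :] - rows[1::2, :]) / 2.0  # column details
    return np.vstack([a, d])

def ihaar2d(coef):
    """Exact inverse of haar2d (columns, then rows)."""
    h, w = coef.shape
    a, d = coef[:h // 2, :], coef[h // 2:, :]
    rows = np.zeros_like(coef)
    rows[0::2, :] = a + d
    rows[1::2, :] = a - d
    a, d = rows[:, :w // 2], rows[:, w // 2:]
    out = np.zeros_like(coef)
    out[:, 0::2] = a + d
    out[:, 1::2] = a - d
    return out

def rmse(x, y):
    """Root-mean-square error between two images of equal shape."""
    return float(np.sqrt(np.mean((x - y) ** 2)))
```

Compression comes from quantizing or discarding small detail coefficients before the inverse transform; RMSE then measures the resulting quality loss.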
This paper proposes a theoretical treatment of an underwater wireless optical communication (UWOC) system with different modulation schemes, using multiple-input multiple-output (MIMO) technology in coastal water. MIMO technology provides high-speed data rates over longer link distances. The technique is employed to assess the system in terms of BER, Q-factor, and data rate for coastal water types. The reliability of the system is examined for the 1Tx/1Rx, 2Tx/2Rx, 3Tx/3Rx, and 4Tx/4Rx configurations. The results show that the proposed MIMO technique achieves better performance than the other techniques in terms of BER. Theoretical results were also obtained to compare PIN and APD
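The BER and Q-factor figures of merit used above are linked by a standard relation for intensity-modulated links with Gaussian noise, which can be sketched as follows (the helper names are illustrative; the formula itself is the textbook one).

```python
import math

def ber_from_q(q):
    """Bit-error rate from the linear Q-factor for on-off keying with
    Gaussian noise: BER = 0.5 * erfc(Q / sqrt(2))."""
    return 0.5 * math.erfc(q / math.sqrt(2.0))

def q_from_db(q_db):
    """Convert a Q-factor quoted in dB (20*log10 Q) to linear scale."""
    return 10.0 ** (q_db / 20.0)
```

For example, a linear Q-factor of 6 corresponds to a BER of roughly 1e-9, the usual benchmark for an error-free optical link.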
Three-dimensional (3D) image and medical image processing, which are considered big-data analysis tasks, have attracted significant attention during the last few years. To this end, efficient 3D object recognition techniques could be beneficial to such image and medical image processing. To date, however, most of the proposed methods for 3D object recognition face major challenges in terms of high computational complexity. This is because computational complexity and execution time increase as the dimensions of the object increase, which is the case in 3D object recognition. Therefore, finding an efficient method that achieves high recognition accuracy with low computational complexity is essential.
In this paper, two adsorbents, an iron oxide nanomaterial (Fe3O4) and a nanocomposite (T-Fe3O4), were created from the bio-waste mass of tangerine peel. The two materials were used in adsorption tests to remove cefixime (CFX) from aqueous solution. Before the adsorption application, both adsorbents were characterized by various techniques, including XRD, FTIR, VSM, TEM, and FESEM. The mesoporous nano-crystalline structure of the Fe3O4 and T-Fe3O4 nanocomposite, with a diameter of less than 100 nm, was confirmed. The adsorption performance of the obtained adsorbents was evaluated for CFX removal by adjusting several operating parameters to optimize the removal. The optimal conditions for CFX removal were found to be an initial concentration of 40 and 50 m
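Adsorption studies of this kind are typically evaluated with two standard quantities, which can be sketched as follows. These are the textbook formulas, not the paper's specific data; the example values below are purely illustrative.

```python
def removal_percent(c0, ce):
    """Removal efficiency (%) from initial and equilibrium
    concentrations C0 and Ce (same units, e.g. mg/L):
    R = 100 * (C0 - Ce) / C0."""
    return 100.0 * (c0 - ce) / c0

def adsorption_capacity(c0, ce, volume_l, mass_g):
    """Equilibrium adsorption capacity q_e (mg/g) for solution volume
    V (L) and adsorbent mass m (g): q_e = (C0 - Ce) * V / m."""
    return (c0 - ce) * volume_l / mass_g
```

For instance, if an illustrative 40 mg/L CFX solution equilibrates at 4 mg/L, the removal is 90 %, and with 0.1 L of solution on 0.5 g of adsorbent the capacity is 7.2 mg/g.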