Multiple elimination (de-multiple) is one of the seismic processing steps used to remove multiple energy and delineate the correct primary reflectors. Applying normal moveout (NMO) to flatten the primaries, then transforming the data to the frequency-wavenumber (f-k) domain, provides a way to eliminate multiples: the flattened primaries align with the zero-wavenumber axis of the f-k domain, while other events (multiples and random noise) are distributed elsewhere. A dip filter that passes the aligned energy and rejects the rest separates primaries from multiples once the data are transformed back from the f-k domain to the time-distance domain. The technique is therefore referred to here as the normal moveout-frequency-wavenumber (NMO-f-k) domain method for multiple elimination. The method is tested on a synthetic reflection event to verify its validity, and then applied to real 2D field seismic data (X-profile) from southern Iraq. The results confirm that internal multiples exist in the deep reflection data in Iraq and must be removed so that interpretation of the true reflectors is valid. The final stacked section processed with the NMO-f-k technique shows clearer and sharper reflectors than the conventional NMO stack. The open-source Madagascar reproducible package was used for all processing steps of this study; it proved efficient, accurate, and easy to use for the NMO, f-k transform, and dip-filter programs. The aim of the current study is to separate internal multiples and noise from real 2D seismic data.
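The core idea of the NMO-f-k workflow can be sketched numerically: after NMO, flat (primary) events concentrate near zero wavenumber in the f-k domain, so a mask passing only small |k| rejects dipping residual energy. This is a minimal illustration with NumPy, not the Madagascar implementation; the gather, `k_pass` width, and function name are hypothetical.

```python
import numpy as np

def fk_dip_filter(gather, k_pass=2):
    """Illustrative f-k dip filter.

    gather: 2D array (time samples x traces), assumed already NMO-corrected.
    k_pass: half-width (in wavenumber-index units) of the band kept
            around k = 0; events flat across traces survive, dipping
            events (residual multiples, noise) are attenuated.
    """
    fk = np.fft.fft2(gather)                       # to the f-k domain
    k = np.fft.fftfreq(gather.shape[1]) * gather.shape[1]
    mask = np.abs(k) <= k_pass                     # pass near-zero wavenumbers
    fk_filtered = fk * mask[np.newaxis, :]         # reject dipping energy
    return np.real(np.fft.ifft2(fk_filtered))      # back to time-distance
```

A perfectly flat event (the same wavelet on every trace) has all of its energy at k = 0 and passes through this filter essentially unchanged, which is the property the method exploits.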
This research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
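The Downhill Simplex (Nelder-Mead) approach amounts to minimizing the negative log-likelihood without derivatives. As a hedged sketch, the example below fits a plain two-parameter Weibull by this route with SciPy; the paper's four-parameter exponential Weibull-Poisson pdf would be substituted into `neg_loglik`, and the synthetic data, seed, and starting values are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

# Synthetic sample from a known Weibull (shape 1.5, scale 2.0) so the
# recovered estimates can be checked against the truth.
rng = np.random.default_rng(0)
data = weibull_min.rvs(c=1.5, scale=2.0, size=500, random_state=rng)

def neg_loglik(theta):
    shape, scale = theta
    if shape <= 0 or scale <= 0:
        return np.inf                    # keep the simplex inside the valid region
    return -np.sum(weibull_min.logpdf(data, c=shape, scale=scale))

# "Downhill Simplex" is SciPy's method="Nelder-Mead".
res = minimize(neg_loglik, x0=[1.0, 1.0], method="Nelder-Mead")
print(res.x)   # estimates near the true (1.5, 2.0)
```

Because the simplex uses only function values, it tolerates the rough likelihood surfaces that contaminated data produce, which is consistent with the abstract's finding that it outperforms derivative-based maximum likelihood there.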
Data scarcity is a major challenge when training deep learning (DL) models. DL demands large amounts of data to achieve exceptional performance, yet many applications have too little data to train DL frameworks. Manual labeling is usually needed to provide labeled data, which typically involves human annotators with extensive background knowledge; this annotation process is costly, time-consuming, and error-prone. Every DL framework must be fed a significant amount of labeled data to learn representations automatically, and in general more data yields a better DL model, although performance is also application-dependent. This issue is the main barrier for
Many of the proposed methods introduce perforated fins in the straight direction to improve the thermal performance of a heat sink. Here, an innovative form of perforated fin (with inclination angles) was considered. The present rectangular pin fins contain elliptical perforations, in two models and two cases. The signum function is used to model the opposing and variable behavior of the heat transfer area. To find the general solution, the degenerate hypergeometric equation was used as a new derivation method and then solved by Kummer's series. Two validation methods (previous work and Ansys 16.0 Steady-State Thermal) are considered. The strong agreement of the validation results (0.3
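The degenerate (confluent) hypergeometric equation mentioned above has Kummer's function M(a, b, x) as a solution, obtained as the power series with Pochhammer-symbol coefficients. As an illustrative check (not the paper's fin solution), a truncated Kummer series can be compared against SciPy's built-in `hyp1f1`; the parameter values below are arbitrary.

```python
import numpy as np
from scipy.special import hyp1f1, poch, factorial

def kummer_series(a, b, x, terms=30):
    """Truncated Kummer series M(a, b, x) = sum_n (a)_n x^n / ((b)_n n!)."""
    n = np.arange(terms)
    return np.sum(poch(a, n) * x**n / (poch(b, n) * factorial(n)))

val = kummer_series(0.5, 1.5, 0.8)
# For moderate x the series converges rapidly and matches
# scipy.special.hyp1f1(0.5, 1.5, 0.8) to high precision.
```

In the fin problem, the general temperature solution would be written in terms of such M(a, b, x) functions with a, b, x determined by the fin geometry and perforation model.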
Corrosion-induced damage in reinforced concrete structures such as bridges, parking garages, and buildings, and the related cost of maintaining them in a serviceable condition, are a source of major concern for the owners of these structures.
Fly ash produced by the south Baghdad power plant was used as a corrosion inhibitor at different concentrations (20, 25, and 30% by weight of the cement content).
The concrete batch ratio under study was 1:1.5:3 (cement : sand : gravel), a mix commonly used in Iraq. All raw materials were locally produced.
Concrete slabs of (250 x 250 x 70) mm were cast using plywood molds. Two steel bars were embedded in the central po
The present work utilizes polyacrylic acid (PAA) beads to remove Alizarin yellow R (AYR) and Alizarin red S (ARS) from solution. The adsorption isotherms were investigated, together with the factors that affect them, such as temperature, ionic strength, shaking, and wet PAA. The adsorption isotherm of ARS was found to obey the Freundlich equation, while that of AYR obeys the Langmuir equation. The adsorption process on PAA was investigated at various temperatures; the data show a positive correlation between ARS and AYR adsorption on PAA and temperature (an endothermic process). The computation of the thermodynamic functions (ΔH, ΔG, and ΔS) is based on the foregoi
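Deciding between the Langmuir and Freundlich models, as done above per dye, typically comes down to fitting both equations to the equilibrium data and comparing residuals. This is a hedged sketch with synthetic data (generated from a Langmuir curve, so Langmuir should win); the concentrations and constants are illustrative, not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qmax, KL):
    """Langmuir isotherm: qe = qmax * KL * Ce / (1 + KL * Ce)."""
    return qmax * KL * Ce / (1.0 + KL * Ce)

def freundlich(Ce, KF, n):
    """Freundlich isotherm: qe = KF * Ce**(1/n)."""
    return KF * Ce ** (1.0 / n)

Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0])   # equilibrium conc. (mg/L), synthetic
qe = langmuir(Ce, qmax=50.0, KL=0.05)          # synthetic Langmuir-shaped data

pL, _ = curve_fit(langmuir, Ce, qe, p0=[30.0, 0.1])
pF, _ = curve_fit(freundlich, Ce, qe, p0=[1.0, 2.0])

# Compare residual sums of squares; the smaller RSS identifies the
# better-fitting isotherm for a given dye.
rss_L = np.sum((qe - langmuir(Ce, *pL)) ** 2)
rss_F = np.sum((qe - freundlich(Ce, *pF)) ** 2)
```

With real AYR and ARS data, the same comparison would reproduce the study's conclusion of Langmuir for AYR and Freundlich for ARS.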
In this research, we studied multiple linear regression models with two variables in the presence of autocorrelation in the error term, with the errors following the general logistic distribution. The autoregression model is involved in studying and analyzing the relationship between the variables, and through this relationship forecasting of the variables' values is carried out. A simulation technique is used to compare estimation methods by the mean squared error criterion; the estimation methods used are Generalized Least Squares, M-robust, and Laplace, for sample sizes of 20, 40, 60, 80, 100, and 120. The M-robust method was demonstrated to be the best metho
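The simulation design can be sketched in miniature: generate regression data with autocorrelated, logistic-distributed errors, fit competing estimators over many replications, and rank them by the MSE of the coefficient estimates. The sketch below compares plain OLS against a Cochrane-Orcutt-style GLS quasi-differencing transform only (the study additionally compares M-robust and Laplace estimators); the AR(1) coefficient, true betas, and seed are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
beta = np.array([1.0, 2.0, -1.0])   # intercept and two slopes (hypothetical)
rho, n, reps = 0.7, 60, 500         # n = 60 is one of the study's sample sizes

def fit_ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

mse_ols = mse_gls = 0.0
for _ in range(reps):
    X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
    e = rng.logistic(size=n)        # logistic innovations
    for t in range(1, n):           # impose AR(1) autocorrelation on the errors
        e[t] += rho * e[t - 1]
    y = X @ beta + e
    # GLS: quasi-difference rows to whiten the AR(1) error (rho assumed known)
    Xg, yg = X[1:] - rho * X[:-1], y[1:] - rho * y[:-1]
    mse_ols += np.sum((fit_ols(X, y)[1:] - beta[1:]) ** 2) / reps
    mse_gls += np.sum((fit_ols(Xg, yg)[1:] - beta[1:]) ** 2) / reps
# mse_gls comes out below mse_ols: whitening the error improves the
# slope estimates when the disturbances are autocorrelated.
```

In the full study the same replication-and-MSE machinery is applied to all three estimators and across the listed sample sizes and contamination settings.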
In this paper, game theory was applied to the transport sector in Iraq. This sector comprises two axes, public transport and private transport, each including several modes of transport (sea, air, land, rail, and port transport) as well as the travel and tourism sector, which public transport lacks. A competitive-advantage (payoff) matrix was formed for the transport sector, and after applying the MinMax-MaxMin principle to the matrix at all of its stages, an equilibrium point was found to exist at every stage except the last, where no equilibrium point was available. Therefore, the linear programming method was
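The MinMax-MaxMin test described above checks a payoff matrix for a pure-strategy equilibrium (saddle point): the row player's maximin is compared with the column player's minimax, and an equilibrium exists exactly when they coincide. A minimal sketch with a hypothetical payoff matrix (not the paper's data):

```python
import numpy as np

# Hypothetical 3x3 payoff matrix for the row player.
payoff = np.array([[4, 6, 5],
                   [3, 2, 4],
                   [5, 7, 6]])

maximin = payoff.min(axis=1).max()   # row player: best of the row worst-cases
minimax = payoff.max(axis=0).min()   # column player: best of the column worst-cases

has_saddle = bool(maximin == minimax)
# When the two values differ (as at the study's final stage), no pure
# equilibrium exists and the game is solved for mixed strategies,
# e.g. by linear programming.
```

Here both values equal 5, so this particular matrix has a saddle point; a matrix where they differ is precisely the case that forces the linear-programming solution mentioned in the abstract.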