Amplitude variation with offset (AVO) analysis is an efficient tool for hydrocarbon detection and for identifying elastic rock properties and fluid types. In the present study it was applied to reprocessed pre-stack 2D seismic data (1992, Caulerpa) from the north-west Bonaparte Basin, Australia, and the AVO response along the 2D pre-stack seismic data over the Laminaria High, NW shelf of Australia, was investigated. Three hypotheses were proposed to explain the AVO behaviour of the amplitude anomalies, testing three different factors: fluid substitution, porosity, and thickness (wedge model). The AVO models with their synthetic gathers were analysed using log information to determine which of these is the controlling parameter in the AVO analysis. AVO cross-plots from the real pre-stack seismic data reveal AVO class IV (a negative intercept whose magnitude decreases with offset). This result matches the modelled fluid-substitution result for the seismic synthetics. It is concluded that fluid substitution is the controlling parameter in the AVO analysis, and therefore the high-amplitude anomaly on the seabed and on target horizon 9 results from changes in fluid content and lithology along the target horizons, whereas changing the porosity has little effect on the amplitude variation with offset in the AVO cross-plot. Finally, results from the wedge models show that a small change in thickness causes a change in amplitude; however, this change in thickness gives a different AVO characteristic and a mismatch with the AVO result of the real 2D pre-stack seismic data. Therefore, a thin layer of constant thickness with changing fluids is the more likely cause of the high-amplitude anomalies.
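The class IV response described above (negative intercept, amplitude magnitude decreasing with offset) can be illustrated with the standard two-term Shuey approximation, R(θ) ≈ A + B sin²θ. The sketch below is generic and illustrative only; the layer velocities and densities are hypothetical values chosen to produce a class IV signature, not values from this study.

```python
import math

def shuey_intercept_gradient(vp1, vs1, rho1, vp2, vs2, rho2):
    """Two-term Shuey approximation R(theta) ~ A + B*sin^2(theta)
    for a reflection from layer 1 (top) to layer 2 (bottom)."""
    vp, vs, rho = (vp1 + vp2) / 2, (vs1 + vs2) / 2, (rho1 + rho2) / 2
    dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
    A = 0.5 * (dvp / vp + drho / rho)  # intercept (normal-incidence reflectivity)
    B = 0.5 * dvp / vp - 2 * (vs / vp) ** 2 * (drho / rho + 2 * dvs / vs)  # gradient
    return A, B

# hypothetical hard layer over a much softer layer (velocities m/s, density g/cc)
A, B = shuey_intercept_gradient(3500, 2000, 2.6, 2500, 1200, 2.0)

# class IV signature: A < 0 with B > 0, so |R| shrinks as the angle grows
r0 = A                                          # reflectivity at normal incidence
r30 = A + B * math.sin(math.radians(30)) ** 2   # reflectivity at 30 degrees
```

Plotting A against B for many gathers gives the intercept-gradient cross-plot used to separate the AVO classes.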
Delays occur commonly in construction projects, and assessing their impact is sometimes a contentious issue. Several delay analysis methods are available, but no single method can be applied universally in all situations. Selecting the proper analysis method depends on a variety of factors, including the information available, the time of analysis, the capabilities of the methodology, and the time, funds, and effort allocated to the analysis. This paper presents a computerized schedule analysis program that uses the daily windows analysis method, which is recognized as one of the most credible methods and is one of the few techniques more likely than others to be accepted by courts. A simple case study has been implemented.
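As a rough illustration of the daily windows idea, each day of the as-built schedule is treated as its own analysis window, and any critical-path delay occurring in that window is attributed to the party responsible on that day. The event list and party names below are hypothetical, not taken from the paper's case study.

```python
# Hypothetical one-week record: for each daily window, the party (if any)
# found responsible for delaying the critical path on that day.
daily_events = [None, "contractor", None, "owner", "owner", None, "contractor"]

attribution = {}
for day, party in enumerate(daily_events, start=1):
    if party is not None:  # a delay day is charged to the responsible party
        attribution[party] = attribution.get(party, 0) + 1

total_delay = sum(attribution.values())  # total critical-path delay in days
```

Because responsibility is assessed window by window against the schedule as it stood that day, concurrent and sequential delays are apportioned as they arise rather than argued over after the fact.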
Orthogonal polynomials and their moments serve as pivotal elements across various fields. Discrete Krawtchouk polynomials (DKraPs) are a versatile family of orthogonal polynomials widely used in fields such as probability theory, signal processing, digital communications, and image processing. Various recurrence algorithms have been proposed to address the numerical instability that arises for large orders and signal sizes. DKraP coefficients have typically been computed with sequential algorithms, which are computationally expensive for large order values and polynomial sizes. To this end, this paper introduces a computationally efficient solution that utilizes parallel computation.
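For context, the values K_n(x; p, N) are commonly generated by a three-term recurrence in the order n. The sketch below is a generic sequential recurrence (not the parallel algorithm of the paper) and illustrates why the cost grows with the order: each K_n needs the two preceding polynomials.

```python
def krawtchouk(n, x, p, N):
    """Evaluate the Krawtchouk polynomial K_n(x; p, N) with the three-term
    recurrence p(N-m) K_{m+1} = (p(N-m) + m(1-p) - x) K_m - m(1-p) K_{m-1},
    starting from K_0 = 1 and K_1 = 1 - x/(pN)."""
    k_prev = 1.0
    if n == 0:
        return k_prev
    k = 1.0 - x / (p * N)
    for m in range(1, n):  # sequential: K_{m+1} depends on K_m and K_{m-1}
        k_next = ((p * (N - m) + m * (1 - p) - x) * k
                  - m * (1 - p) * k_prev) / (p * (N - m))
        k_prev, k = k, k_next
    return k
```

In floating point this recurrence loses accuracy for large n and N, which is exactly the instability the recurrence-algorithm literature cited above tries to tame.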
In this paper, we study and investigate the quark-antiquark interaction mechanism through the annihilation process, in which photons are produced in association with the interacting quark and gluon. We investigate the effects of the critical temperature, the coupling strength, and the photon energy within the framework of the quantum chromodynamics model. We find that using a large critical temperature Tc = 134 allows us to dramatically increase the coupling strength of the quark interaction, and the photon rate is sensitive to the coupling strength, decreasing as the estimated coupling strength grows. We also discuss the effect of photon energy on the photon rate, for energies in the range 1.5 to 5 GeV. The photon rate increases
In this study, three methods, Williamson-Hall, size-strain plot, and Halder-Wagner, were used to analyse X-ray diffraction lines and determine the crystallite size and lattice strain of nickel oxide nanoparticles, and the results were then compared with two other methods. The calculated crystallite sizes are 0.42554 nm, 1.04462 nm, and 3.60880 nm, and the lattice strains are 0.56603, 1.11978, and 0.64606, respectively; these were compared with the Scherrer method results (crystallite size 0.29598 nm, strain 0.34245) and the modified Scherrer result (0.97497). Differences between the results calculated by these methods were observed in this study.
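The Williamson-Hall method mentioned above fits the peak broadening as a straight line, β cosθ = Kλ/D + 4ε sinθ, so the intercept gives the crystallite size D and the slope gives the strain ε. The sketch below uses hypothetical peak positions and widths, not the NiO measurements of the study.

```python
import math

K, lam = 0.9, 0.15406  # shape factor; Cu K-alpha wavelength in nm (assumed)

# hypothetical diffraction peaks: (two-theta in degrees, FWHM beta in radians)
peaks = [(37.2, 0.0090), (43.3, 0.0105), (62.9, 0.0138), (75.4, 0.0165)]

xs = [4 * math.sin(math.radians(tt / 2)) for tt, _ in peaks]  # 4 sin(theta)
ys = [b * math.cos(math.radians(tt / 2)) for tt, b in peaks]  # beta cos(theta)

# ordinary least-squares line y = intercept + slope * x
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
intercept = my - slope * mx

D = K * lam / intercept  # crystallite size in nm
strain = slope           # lattice microstrain (dimensionless)
```

The size-strain plot and Halder-Wagner methods rearrange the same β, θ data into different linearizations, which is why the three methods can yield different size and strain estimates from one pattern.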
Tor (The Onion Routing) network was designed to enable users to browse the Internet anonymously. It is known for its anonymity and privacy protection against the many agents who wish to observe users' locations or track their browsing habits. This anonymity stems from the encryption and decryption of Tor traffic: the client's traffic must be encrypted and decrypted before sending and receiving, which leads to delay and even interruption in the data flow. The exchange of cryptographic keys between network devices plays a pivotal and critical role in facilitating secure communication and ensuring the integrity of cryptographic procedures. This essential process is time-consuming, which causes delays
Objective: This research investigates real breast cancer data for Iraqi women, acquired manually from several Iraqi hospitals engaged in early detection of breast cancer. Data mining techniques are used to discover hidden knowledge, unexpected patterns, and new rules from the dataset, which comprises a large number of attributes. Methods: Data mining techniques handle the redundant or simply irrelevant attributes in order to discover interesting patterns. The dataset is processed via the Weka (Waikato Environment for Knowledge Analysis) platform. The OneR technique is used as a machine learning classifier to evaluate attribute worth according to the class value. Results: The evaluation is performed using
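The OneR technique named above builds, for each attribute, a one-level rule mapping each attribute value to its majority class, then keeps the attribute whose rule makes the fewest training errors. A minimal sketch follows; the toy records and attribute names are invented for illustration and are not the Iraqi hospital dataset.

```python
from collections import Counter, defaultdict

def one_r(rows, target):
    """OneR: for each attribute build a rule value -> majority class,
    then keep the attribute whose rule has the fewest training errors."""
    best = None
    for attr in rows[0]:
        if attr == target:
            continue
        buckets = defaultdict(Counter)  # attribute value -> class counts
        for row in rows:
            buckets[row[attr]][row[target]] += 1
        rule = {v: c.most_common(1)[0][0] for v, c in buckets.items()}
        errors = sum(r[target] != rule[r[attr]] for r in rows)
        if best is None or errors < best[2]:
            best = (attr, rule, errors)
    return best

# hypothetical toy records (not the real dataset)
data = [
    {"age": "young", "lump": "yes", "cancer": "yes"},
    {"age": "young", "lump": "no",  "cancer": "no"},
    {"age": "old",   "lump": "yes", "cancer": "yes"},
    {"age": "old",   "lump": "no",  "cancer": "no"},
]
attr, rule, errors = one_r(data, "cancer")
```

Ranking attributes by their OneR error rate is a simple way to judge attribute worth relative to the class value, which is how the classifier is used in the evaluation above.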
Information systems and data exchange between government institutions are growing rapidly around the world, and with them the threats to information within government departments. In recent years, research into the development and construction of secure information systems in government institutions has proven very productive. Based on information system principles, this study proposes a model for providing and evaluating security for all departments of government institutions. The requirements of any information system begin with the organization's surroundings and objectives. Most prior techniques did not take into account the organizational component on which the information system runs, despite the relevance of
A multivariate, multisite hydrological data forecasting model was derived and checked using a case study. The philosophy is to use simultaneously the cross-variable correlations, the cross-site correlations, and the time-lag correlations. The case study involves two variables, monthly rainfall and evaporation, at three sites: Sulaimania, Dokan, and Darbandikhan. The model form is similar to the first-order autoregressive model, but in matrix form. A matrix of the different relative correlations mentioned above and another of their relative residuals were derived and used as the model parameters. A mathematical filter was applied to both matrices to obtain their elements. The application of this model indicates i
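A matrix-form first-order model of this kind is often written z_t = A z_{t-1} + B e_t, where z_t stacks the standardized variables at all sites, A carries the lag-one cross-variable and cross-site correlations, and B shapes the residuals. The sketch below is generic; the 2x2 matrices and the random residuals are assumed for illustration, not the parameters fitted in the study.

```python
import random

def matvec(M, v):
    """Multiply a small matrix (list of rows) by a vector."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

A = [[0.5, 0.1], [0.2, 0.4]]  # assumed lag-one correlation matrix (2 variables)
B = [[0.8, 0.0], [0.1, 0.7]]  # assumed residual (noise-shaping) matrix

random.seed(0)
z = [0.0, 0.0]                # standardized rainfall and evaporation at one site
series = []
for _ in range(12):           # generate one synthetic year of monthly values
    e = [random.gauss(0, 1), random.gauss(0, 1)]
    z = [a + b for a, b in zip(matvec(A, z), matvec(B, e))]
    series.append(z)
```

With three sites and two variables, z_t would be a six-element vector and A and B 6x6 matrices, which is where the cross-site correlations enter.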