Amplitude variation with offset (AVO) analysis is an efficient tool for hydrocarbon detection and for identifying elastic rock properties and fluid types. It has been applied in the present study using reprocessed pre-stack 2D seismic data (1992, Caulerpa) from the north-west of the Bonaparte Basin, Australia. The AVO response along the 2D pre-stack seismic data in the Laminaria High, NW shelf of Australia, was also investigated. Three hypotheses were proposed to investigate the AVO behaviour of the amplitude anomalies, in which three factors were tested: fluid substitution, porosity and thickness (wedge model). The AVO models with the synthetic gathers were analysed using log information to determine which of these parameters controls the AVO response. AVO cross-plots from the real pre-stack seismic data reveal AVO class IV (a negative intercept whose amplitude magnitude decreases with offset). This result matches the modelled fluid-substitution result for the seismic synthetics. It is concluded that fluid substitution is the controlling parameter on the AVO analysis, and that the high amplitude anomaly on the seabed and on target horizon 9 results from changes in fluid content and lithology along the target horizons, whereas changing the porosity has little effect on the amplitude variation with offset in the AVO cross-plot. Finally, results from the wedge models show that a small change in thickness causes a change in amplitude; however, this change in thickness gives a different AVO characteristic and a mismatch with the AVO result of the real 2D pre-stack seismic data. A thin layer of constant thickness with changing fluids is therefore more likely to be the cause of the high amplitude anomalies.
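The class IV signature described above (negative intercept, amplitude magnitude decreasing with offset) is conventionally quantified by fitting an intercept and gradient to amplitude-versus-angle data. As a minimal sketch (not the processing workflow of the study; the angles and reflectivities below are invented), the two-term Shuey approximation R(θ) ≈ A + B·sin²θ can be fitted by least squares:

```python
import numpy as np

def avo_intercept_gradient(angles_deg, reflectivities):
    """Fit the two-term Shuey approximation R(theta) = A + B*sin^2(theta)
    by least squares, returning the intercept A and gradient B."""
    x = np.sin(np.radians(angles_deg)) ** 2
    # Design matrix for the linear model [1, sin^2(theta)]
    G = np.column_stack([np.ones_like(x), x])
    (A, B), *_ = np.linalg.lstsq(G, np.asarray(reflectivities, float), rcond=None)
    return A, B

# Synthetic class IV behaviour: negative intercept, positive gradient,
# so the amplitude magnitude decreases with offset/angle.
angles = np.array([0, 10, 20, 30])
refl = -0.15 + 0.20 * np.sin(np.radians(angles)) ** 2
A, B = avo_intercept_gradient(angles, refl)
print(round(A, 3), round(B, 3))  # intercept < 0, gradient > 0
```

A negative A with positive B places a sample in the class IV quadrant of the intercept-gradient cross-plot.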
Logistic regression is one of the statistical methods used to describe and estimate the relationship between a response variable (Y) and explanatory variables (X). The dependent variable here is a binary response taking two values: one when a specific event occurs and zero when it does not (e.g. injured/uninjured, married/unmarried). A large number of explanatory variables gives rise to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and the ridge regression method were used to estimate a binary-response logistic regression model, adopting the Jackna
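As a minimal illustration of the binary-response model described above (a sketch only; the ridge and resampling refinements mentioned in the abstract are not reproduced, and the data are simulated), a logistic regression can be fitted by maximum likelihood with plain gradient ascent:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=5000):
    """Maximum-likelihood fit of a logistic regression model
    P(Y=1|x) = 1/(1+exp(-(b0 + b.x))) by gradient ascent."""
    X = np.column_stack([np.ones(len(X)), X])  # add intercept column
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        beta += lr * X.T @ (y - p) / len(y)  # gradient of the log-likelihood
    return beta

# Toy binary response: the event becomes more likely as x grows
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 500)
y = (rng.uniform(size=500) < 1 / (1 + np.exp(-(0.5 + 2.0 * x)))).astype(float)
b0, b1 = fit_logistic(x.reshape(-1, 1), y)
print(b1 > 0)  # the positive slope is recovered
```

With many correlated explanatory variables, the likelihood surface flattens and these estimates become unstable, which is the multicollinearity problem the abstract refers to.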
Bioethanol produced from lignocellulosic feedstock is a renewable substitute for declining fossil fuels. Ultrasound-assisted alkaline pretreatment was investigated as a way to enhance the enzymatic digestibility of waste paper. The pretreatment was conducted over a wide range of conditions: waste paper concentrations of 1-5%, reaction times of 10-30 min and temperatures of 30-70°C. The optimum conditions were 4% substrate loading and a 25 min treatment time at 60°C, where the maximum reducing sugar obtained was 1.89 g/L. Hydrolysis was conducted with crude cellulolytic enzymes produced by Cellulomonas uda (PTCC 1259). The maximum amount of sugar released and the hydrolysis efficiency were 20.92 g/L and 78.4%, respectively. Sugars
In this study, gold nanoparticles were synthesized in a single-step biosynthetic method using an aqueous leaf extract of Thymus vulgaris L., which acts as both reducing and capping agent. The nanoparticles were characterized using UV-Visible spectra, X-ray diffraction (XRD) and FTIR. The as-prepared gold nanoparticles (GNPs) showed a surface plasmon resonance centered at 550 nm. The XRD pattern showed four strong intense peaks indicating the crystalline nature and face-centered cubic structure of the gold nanoparticles. The average crystallite size of the AuNPs was 14.93 nm. Field emission scanning electron microscopy (FESEM) was used to s
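Crystallite sizes such as the 14.93 nm reported above are commonly estimated from XRD peak broadening via the Scherrer equation. A small sketch, assuming Cu K-alpha radiation and illustrative peak values (not those of the paper):

```python
import math

def scherrer_size_nm(two_theta_deg, fwhm_deg, wavelength_nm=0.15406, K=0.9):
    """Scherrer estimate of crystallite size D = K*lambda/(beta*cos(theta)),
    with the peak position 2-theta and FWHM beta given in degrees.
    Cu K-alpha wavelength (0.15406 nm) and shape factor K = 0.9 are the
    conventional defaults."""
    theta = math.radians(two_theta_deg / 2)
    beta = math.radians(fwhm_deg)  # FWHM must be converted to radians
    return K * wavelength_nm / (beta * math.cos(theta))

# Illustrative numbers: the Au (111) reflection near 2-theta = 38.2 deg
# with an assumed 0.55 deg FWHM
print(round(scherrer_size_nm(38.2, 0.55), 1))  # ~15.3 nm
```

In practice the FWHM is first corrected for instrumental broadening, and the size is averaged over several reflections.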
The study of statistical distributions aims to obtain the best description of the phenomena underlying sets of variables, each of which follows the behaviour of one of those distributions. Estimation for these distributions is an essential part of studying the behaviour of a variable, and this research is an attempt to reach the best method for estimating the distribution of the data, namely the generalized linear failure rate distribution, by studying the theoretical sides using statistical estimation methods: maximum likelihood, the least squares method and a mixing method (the suggested method).
The research
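The generalized linear failure rate distribution mentioned above has density f(x; a, b) = (a + b*x) * exp(-(a*x + b*x^2/2)) for x > 0. As a sketch of the maximum likelihood step only (one of the estimation methods compared; the parameter values and simulated data are invented):

```python
import numpy as np
from scipy.optimize import minimize

def glfr_neg_loglik(params, x):
    """Negative log-likelihood of the generalized linear failure rate
    distribution f(x;a,b) = (a + b*x) * exp(-(a*x + b*x**2/2)), x > 0."""
    a, b = params
    if a <= 0 or b < 0:
        return np.inf
    return -(np.sum(np.log(a + b * x)) - np.sum(a * x + b * x**2 / 2))

# Simulate from the model by inverting the survival function
# S(x) = exp(-(a*x + b*x**2/2))
rng = np.random.default_rng(1)
a_true, b_true = 0.5, 1.0
u = rng.uniform(size=2000)
x = (-a_true + np.sqrt(a_true**2 - 2 * b_true * np.log(u))) / b_true

res = minimize(glfr_neg_loglik, x0=[1.0, 1.0], args=(x,), method="Nelder-Mead")
a_hat, b_hat = res.x
print(a_hat, b_hat)  # estimates close to the true (0.5, 1.0)
```

The least squares and mixing estimators compared in the study would replace the objective function above.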
This study aimed at investigating the effect of using the computer on the efficiency of a training programme for science teachers in Ajloun District, Jordan. It addressed the following questions:
1- What is the effect of using the computer program for the two groups (the experimental and the control group)?
2- Are there any statistically significant differences in the effect of using the computer program between the two groups?
3- Are there any statistically significant differences in the effect of using the computer program attributable to sex (male or female)?
The population of the study consisted of all the science teachers in the educational directorate of Ajloun district for the academic year 2009-2010, (120) teachers (male and female). The sample of the study
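The questions about statistically significant differences between two groups are typically answered with an independent-samples t-test. A sketch with invented scores, not the study's data:

```python
import numpy as np
from scipy import stats

# Hypothetical post-test scores for the experimental and control groups
rng = np.random.default_rng(42)
experimental = rng.normal(78, 8, 60)
control = rng.normal(70, 8, 60)

# Independent-samples t-test for a difference in group means
t_stat, p_value = stats.ttest_ind(experimental, control)
print(p_value < 0.05)  # True: the difference is statistically significant
```

Question 3 (differences by sex) would be tested the same way, splitting the scores by male/female instead of by treatment group.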
Steganography is defined as hiding confidential information in some other chosen medium without leaving any clear evidence of changing the medium's features. Most traditional hiding methods embed the message directly in the cover medium (text, image, audio or video). Some hiding techniques leave a negative effect on the cover image, so the change in the carrier medium can sometimes be detected by humans and machines. The purpose of the suggested hiding scheme is to make this change undetectable. The current research focuses on using a composite method, based on a spiral search, to prevent the detection of the hidden information by human and machine; the Structural Similarity Index Metric (SSIM) is used to assess the accuracy and quality
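The SSIM measure mentioned above can be illustrated on a plain least-significant-bit embedding (a simplified stand-in for the spiral-search method of the paper; the cover image and payload below are random, and the single-window SSIM replaces the usual sliding-window form for brevity):

```python
import numpy as np

def global_ssim(x, y, L=255):
    """Single-window SSIM between two equal-size 8-bit images
    (the standard formula applied once, without the sliding window)."""
    x, y = x.astype(float), y.astype(float)
    C1, C2 = (0.01 * L) ** 2, (0.03 * L) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + C1) * (2 * cov + C2)) / \
           ((mx**2 + my**2 + C1) * (vx + vy + C2))

# Hide one payload bit per pixel in the least significant bit
rng = np.random.default_rng(0)
cover = rng.integers(0, 256, (64, 64), dtype=np.uint8)
bits = rng.integers(0, 2, cover.size, dtype=np.uint8)
stego = (cover & 0xFE) | bits.reshape(cover.shape)

print(global_ssim(cover, stego) > 0.999)  # LSB changes are imperceptible
```

An SSIM close to 1 between cover and stego images is the quantitative sense in which the hiding is "undetectable" to quality metrics.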
Thirty local fungal isolates belonging to Aspergillus niger were screened for inulinase production on a synthetic solid medium, inulin hydrolysis appearing as a clear zone around the fungal colony. Semi-quantitative screening was performed to select the most efficient isolate for inulinase production; the most efficient isolate was AN20. The optimum conditions for enzyme production from the A. niger isolate were determined using a medium composed of sugar cane moistened with corn steep liquor 5:5 (v/w) at an initial pH of 5.0 for 96 hours at 30 °C. Enzyme productivity was tested for the yeast Kluyveromyces marxianus, the fungus A. niger AN20 and a mixed culture of A. niger and K. marxianus. The productivity of A. niger gave the highest
This study presents analytical methods for the determination of the drug amoxicillin trihydrate (Amox.) in some pharmaceutical preparations using the cobalt ion (Co(II)) as the complexing metal. The best conditions for complexation were a reaction time of 20 minutes, pH 1.5 and a reaction temperature of 70 °C. Benzyl alcohol was the best solvent for extracting the complex.
Keywords: Amoxicillin, Cobalt(II), Complex, Molar ratio.
Decision-making is the central concern of Operations Research in many real-life applications, and numerous studies focus on this topic. One drawback is that some of these studies are restricted and do not address the nature of the values involved in terms of imprecise data (ID). This paper therefore makes two contributions: first, decreasing the total costs by classifying subsets of costs; second, improving the optimal solution via the Hungarian assignment approach. The newly proposed method is called the fuzzy sub-Triangular form (FS-TF) under ID. The results obtained are excellent compared with previous methods, including the robust ranking technique, arithmetic operations, the magnitude ranking method and the centroid ranking method. This
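The assignment step can be illustrated by defuzzifying a fuzzy cost matrix and solving with the Hungarian method (a sketch of the centroid-ranking comparison approach, not the proposed FS-TF method; the triangular fuzzy costs are invented):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def centroid(tri):
    """Defuzzify a triangular fuzzy number (l, m, u) by its centroid."""
    return sum(tri) / 3.0

# Hypothetical triangular fuzzy costs (l, m, u) for a 3x3 assignment problem
fuzzy_cost = [
    [(3, 4, 5), (1, 2, 3), (7, 8, 9)],
    [(3, 4, 5), (2, 3, 4), (6, 7, 8)],
    [(2, 3, 4), (0, 1, 2), (5, 6, 7)],
]
crisp = np.array([[centroid(c) for c in row] for row in fuzzy_cost])
rows, cols = linear_sum_assignment(crisp)  # Hungarian-style optimal assignment
print(crisp[rows, cols].sum())  # minimum total defuzzified cost: 12.0
```

The ranking methods compared in the paper differ precisely in how the fuzzy entries are reduced to crisp values before this assignment step.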