Predicting vertical stress is useful for managing geomechanical problems because it enables the computation of formation pore pressure and the classification of fault regimes. This study provides an in-depth examination of vertical stress prediction using several approaches implemented in the Techlog 2015 software. Gardner's method yields incorrect vertical stress values because it does not start from the surface and relies only on sonic log data. The Amoco, Wendt non-acoustic, Traugott, and average techniques require only the density log as input and treat the observed density as a straight line, which is likewise incorrect for computing vertical stress. The results show that the extrapolated density method uses an average of the real density; its gradient performs much better at shallow depths in the vertical stress calculation, whereas the Miller density method fits the real density very well at great depths. Calculating vertical stress has been crucial for the past 40 years because pore pressure calculations and geomechanical model building use vertical stress as an input, and bulk density is the strongest predictor of vertical stress. According to these results, the Miller and extrapolated techniques may be the best two methods for determining vertical stress, although the gradient of the extrapolated method is better at shallow depths than that of the Miller method. The extrapolated density approach may produce satisfactory vertical stress results, while the Miller values are lower than those obtained by extrapolation, which may be due to the poor gradient of the Miller method at shallow depths. Gardner's approach incorrectly gives minimum values of about 4,000 psi at great depths, while the other methods provide similar numbers because they use constant bulk density values from the surface down to the target depth, which is also incorrect.
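Since every one of these methods ultimately integrates bulk density over depth to obtain the overburden stress, Sv(z) = ∫ ρ(z) g dz, a minimal sketch of that calculation may help. The function name, the synthetic density trend, and the handling of the unlogged shallow interval below are illustrative assumptions, not the Techlog 2015 implementation.

```python
import numpy as np

def vertical_stress_psi(depth_m, rhob_gcc, surface_depth=0.0):
    """Integrate a bulk density log over depth to estimate vertical (overburden) stress.

    depth_m  : depth samples in metres, increasing downward
    rhob_gcc : bulk density log in g/cc at those depths
    Returns the vertical stress in psi at each depth sample.
    """
    g = 9.81                                            # m/s^2
    rho_kgm3 = np.asarray(rhob_gcc) * 1000.0            # g/cc -> kg/m^3
    dz = np.diff(np.concatenate(([surface_depth], np.asarray(depth_m))))
    sv_pa = np.cumsum(rho_kgm3 * g * dz)                # rho * g * dz summed downward, in Pa
    return sv_pa * 1.45038e-4                           # Pa -> psi

# Synthetic example: the log starts at 50 m, so the first density sample is
# implicitly applied over the unlogged shallow interval -- exactly the region
# where the extrapolated and Miller density models differ.
depth = np.arange(50.0, 3000.0, 0.5)                    # m
rhob = 1.9 + 0.0002 * depth                             # invented density trend, g/cc
print(f"{vertical_stress_psi(depth, rhob)[-1]:.0f} psi at total depth")
```

The choice of density model for the shallow, unlogged interval (extrapolated, Miller, or a constant value from the surface) is precisely what separates the methods compared above.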
Two laboratory-scale reactors were operated, one under aerobic conditions and one under anaerobic conditions. Each reactor was packed with 8.5 kg of shredded synthetic solid waste (particle size less than 5 cm) prepared according to the average composition of domestic solid waste in the city of Kirkuk. Aerobic conditions were created in the aerobic reactor using an air compressor. The study shows that the aerobic reactor was more efficient in COD and BOD5 removal, achieving 97.88% and 91.25%, respectively, compared with 66.53% and 19.11% for the anaerobic reactor.
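For reference, removal efficiencies of this kind are conventionally computed from the influent and effluent concentrations; the symbols below are generic, since the measured COD and BOD5 concentration values themselves are not given in this excerpt.

```latex
\[
  \text{Removal efficiency (\%)} \;=\; \frac{C_{\text{in}} - C_{\text{out}}}{C_{\text{in}}} \times 100
\]
```

Here C_in and C_out denote the influent and effluent COD or BOD5 concentrations of each reactor.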
The study investigates the water quality of the Orontes River, which is considered one of the important water resources in Syria, as it is used for drinking, irrigation, swimming, and industrial needs. A database of 660 measurements of 13 parameter concentrations was used, taken from 11 monitoring points distributed along the Orontes River over a period of five years (2015-2019). To study the correlation between the parameters and their impact on water quality, statistical analysis was applied using the SPSS program. Cluster analysis was applied in order to classify the pollution areas along the river, and two groups were obtained (low pollution and high pollution), with the areas classified according to the sources of pollution.
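As an illustration of the clustering step described above, the sketch below applies hierarchical (Ward) clustering to a synthetic water-quality matrix and cuts it into two groups. The parameter columns, the invented values, and the use of SciPy instead of SPSS are all assumptions made only for this example.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Synthetic water-quality matrix: rows = monitoring points, columns = parameters
# (e.g. BOD5, COD, nitrate). All values are invented for illustration only.
rng = np.random.default_rng(0)
low_pollution = rng.normal(loc=[3.0, 10.0, 1.0], scale=0.5, size=(6, 3))
high_pollution = rng.normal(loc=[9.0, 40.0, 5.0], scale=0.8, size=(5, 3))
data = np.vstack([low_pollution, high_pollution])

# Standardize the parameters so no single unit dominates the distance measure,
# then apply agglomerative (Ward) clustering and cut the tree into two groups.
z = (data - data.mean(axis=0)) / data.std(axis=0)
tree = linkage(z, method="ward")
labels = fcluster(tree, t=2, criterion="maxclust")

for point, label in enumerate(labels, start=1):
    print(f"monitoring point {point}: cluster {label}")
```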
In regression testing, test case prioritization (TCP) is a technique for ordering all the available test cases. TCP techniques can improve fault-detection performance, which is measured by the Average Percentage of Faults Detected (APFD). History-based TCP is one of the TCP techniques that uses the history of past execution data to prioritize test cases. Assigning equal priority to several test cases is a common problem for most TCP techniques; however, this problem has not been explored in history-based TCP techniques. To solve it, most researchers resort to random ordering of the tied test cases. This study aims to investigate the equal-priority problem in history-based TCP techniques.
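Since APFD is the metric named above, a brief sketch of how it is conventionally computed may be useful; the fault matrix and test names below are hypothetical illustrations, not data from the study.

```python
def apfd(order, fault_matrix):
    """Average Percentage of Faults Detected for a given test ordering.

    order        : list of test-case identifiers in prioritized order
    fault_matrix : dict mapping test-case id -> set of faults it detects
    """
    n = len(order)
    faults = set().union(*fault_matrix.values())
    m = len(faults)
    # position (1-based) of the first test that reveals each fault
    first_detect = {}
    for pos, test in enumerate(order, start=1):
        for fault in fault_matrix.get(test, set()):
            first_detect.setdefault(fault, pos)
    tf_sum = sum(first_detect[f] for f in faults)
    return 1 - tf_sum / (n * m) + 1 / (2 * n)

# Illustrative fault matrix (hypothetical data, not from the study).
faults_by_test = {"t1": {"f1"}, "t2": {"f2", "f3"}, "t3": {"f1", "f3"}}
print(apfd(["t2", "t3", "t1"], faults_by_test))   # ordering that finds faults early
print(apfd(["t1", "t3", "t2"], faults_by_test))   # a worse ordering scores lower
```

An ordering that reveals faults earlier yields a higher APFD, which is why the tie-breaking of equal-priority test cases matters.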
Two EM techniques, terrain conductivity and VLF-Radiohm resistivity (using the Geonics EM 34-3 and EM16R instruments, respectively), have been applied to evaluate their ability to delineate and measure the depth of shallow subsurface cavities near Haditha city.
Thirty-one survey traverses were carried out to distinguish the subsurface cavities in the investigated area. Both EM techniques were found to be successful tools in the study area.
The research dealt with the issue of strategic information systems and its impact on the global marketing channel. As the research aims to know the strategic information systems used in the cement company. Which support the senior management in supporting decisions taken in the process of global orientation and moving to foreign markets by choosing the marketing channel that will suit the company. Product and competition with it. With other companies' products. The problem of the study focused on how to move to global markets, and any marketing channel that can be followed to move the cement product globally, what are the strategic information systems used in the company and how will it contribute to supporting senior management dec
Lattakia city faces many problems related to the mismanagement of solid waste, as disposal is limited to the uncontrolled Al-Bassa landfill, without treatment. Solid waste management therefore poses a particular challenge to decision-makers in choosing an appropriate tool that supports strategic decisions on municipal solid waste treatment methods and the evaluation of their management systems. Since humans are primarily responsible for the generation of waste, this study aims to measure the degree of environmental awareness in Lattakia Governorate from the point of view of the research sample and to discuss the effect of the studied variables (place of residence, educational level, gender, age, and professional status) on environmental awareness.
Background: This is a prospective study of three children who presented to the Orbital Clinic at Al Shahid Ghazi Al Hariri Hospital with painless proptosis and a suspicion of hydatid disease. Objectives: Orbital hydatid disease is a rare lesion, accounting for less than 1% of all lesions in the body (1, 2). In our study the orbital cysts presented as primary lesions, which is rare without involvement of other organs (3). Humans represent the intermediate host, and the most commonly affected organs are the liver and the lung (10-15%) (4). Methods: This is a prospective study of three children who presented to the Orbital Clinic at Al Shahid Ghazi Al Hariri Hospital with painless proptosis and a suspicion of hydatid disease.
In this study, two active galaxies (NGC 4725 and NGC 4639) were chosen to study their morphological and photometric properties using the IRAF ISOPHOTE/ELLIPSE task with the griz filters. The observations were obtained from the Sloan Digital Sky Survey (SDSS), which has now reached Data Release 14 (DR14), and the data reduction of all images (bias and flat field) was done by the SDSS pipeline. The surface photometric investigation covered the magnitudes, together with isophotal contour maps, surface brightness profiles, and a bulge/disk decomposition of the galaxy images; the disk position angle, ellipticity, and inclination of the galaxies were also determined. The colors of the galaxies and their chromatic distribution were studied as well.
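The isophote analysis described above can be reproduced in Python with photutils, whose ellipse-fitting routine is a port of the IRAF ISOPHOTE/ELLIPSE algorithm; the synthetic Sersic image and the parameter values below are assumptions for illustration, not the SDSS frames analysed in the study.

```python
import numpy as np
from astropy.modeling.models import Sersic2D
from photutils.isophote import Ellipse, EllipseGeometry

# Synthetic galaxy image (a single Sersic component) standing in for an SDSS frame.
y, x = np.mgrid[0:201, 0:201]
model = Sersic2D(amplitude=100.0, r_eff=25.0, n=2.5,
                 x_0=100.0, y_0=100.0, ellip=0.3, theta=0.6)
image = model(x, y)

# Initial guess for the isophote geometry, then iterative ellipse fitting
# (the same algorithm as the IRAF ISOPHOTE/ELLIPSE task).
geometry = EllipseGeometry(x0=100.0, y0=100.0, sma=20.0, eps=0.3, pa=0.6)
ellipse = Ellipse(image, geometry)
isolist = ellipse.fit_image()

# Surface brightness profile and geometry of the fitted isophotes.
for iso in isolist[::10]:
    print(f"sma={iso.sma:6.1f}px  intens={iso.intens:8.2f}  "
          f"eps={iso.eps:.2f}  pa={np.degrees(iso.pa):.1f} deg")
```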
A database is characterized as an organized and distributed collection of data that allows the user to access the stored data in a simple and convenient way. However, in the era of big data, traditional data-analytics methods may not be able to manage and process such large volumes of data. In order to develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time.
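To make the MapReduce idea concrete, the sketch below runs a toy map and reduce pass over EEG-like records in a single process. The channel names, the per-channel mean statistic, and the in-process shuffle are assumptions for illustration only; on a real Hadoop Streaming deployment the two functions would live in separate scripts reading standard input.

```python
from itertools import groupby
from operator import itemgetter

# Toy MapReduce over EEG-like records: each line is "channel,amplitude".
records = ["C3,12.5", "C4,9.1", "C3,11.9", "O1,7.4", "C4,10.3", "O1,8.0"]

def map_phase(lines):
    """Emit (channel, amplitude) key-value pairs."""
    for line in lines:
        channel, amplitude = line.split(",")
        yield channel, float(amplitude)

def reduce_phase(pairs):
    """Group the shuffled pairs by channel and compute a mean amplitude per key."""
    for channel, group in groupby(sorted(pairs, key=itemgetter(0)), key=itemgetter(0)):
        values = [v for _, v in group]
        yield channel, sum(values) / len(values)

for channel, mean_amp in reduce_phase(map_phase(records)):
    print(f"{channel}: mean amplitude {mean_amp:.2f}")
```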
The aim of the research is to investigate the potential effects of blockchain on the finance industry in general and on the business of financing in particular, as well as its shortcomings and difficulties. To answer the research questions, the researcher used an objective, narrative-analytical descriptive approach that included a qualitative analysis of blockchain technology; the authors studied were selected according to their industries and their reputation in the blockchain field. The research found that blockchain can improve the efficiency of various sections of the banking industry: it has the ability to improve cross-border wage transfers, financial reporting and compliance, as well as trade finance.
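As background to these efficiency claims, a minimal sketch of the hash-linking that makes a blockchain tamper-evident is shown below; it is a generic toy chain, not any banking platform discussed in the research, and the record fields are hypothetical.

```python
import hashlib
import json
import time

def block_hash(block):
    """Hash of a block's contents, excluding its own stored hash."""
    payload = {k: block[k] for k in ("index", "timestamp", "data", "prev_hash")}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def make_block(index, data, prev_hash):
    block = {"index": index, "timestamp": time.time(), "data": data, "prev_hash": prev_hash}
    block["hash"] = block_hash(block)
    return block

def verify(chain):
    """Recompute every hash and check that each block points at its predecessor."""
    return all(cur["prev_hash"] == prev["hash"] and block_hash(cur) == cur["hash"]
               for prev, cur in zip(chain, chain[1:]))

# A toy ledger of cross-border transfers (hypothetical values).
chain = [make_block(0, {"note": "genesis"}, "0" * 64)]
chain.append(make_block(1, {"from": "bank_a", "to": "bank_b", "amount": 100}, chain[-1]["hash"]))
chain.append(make_block(2, {"from": "bank_b", "to": "bank_c", "amount": 40}, chain[-1]["hash"]))

print(verify(chain))                      # True: the chain is intact
chain[1]["data"]["amount"] = 999          # tamper with an earlier record...
print(verify(chain))                      # False: the stored hash no longer matches
```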