The use of machine-learning techniques to address missing well-log data has attracted considerable interest recently, as the oil and gas sector pursues new approaches to improve data interpretation and reservoir characterization. For wells that have been in operation for many years, conventional measurement techniques frequently face availability challenges, including missing well-log data, cost considerations, and precision issues. The objective of this study is to enhance reservoir characterization by automating well-log generation with machine-learning techniques, among them multi-resolution graph-based clustering and the similarity threshold method. Our methodology shows a notable improvement in the accuracy and efficiency of well-log predictions. Standard well logs from a reference well were used to train the machine-learning models, and conventional wireline logs were used as input to estimate facies for unclassified wells lacking core data. R-squared analysis and goodness-of-fit tests provide a numerical assessment of model performance, strengthening the validation process. The multi-resolution graph-based clustering and similarity threshold approaches have demonstrated notable results, achieving an accuracy of nearly 98%. Applying these techniques to data from eighteen wells produced precise results, demonstrating the effectiveness of our approach in enhancing the reliability and quality of generated well logs.
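The abstract above validates its models with R-squared analysis. As an illustrative sketch only (not the authors' code; the sample values are invented), the coefficient of determination for a predicted log curve can be computed as:

```python
def r_squared(actual, predicted):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(actual) / len(actual)
    ss_tot = sum((y - mean) ** 2 for y in actual)                     # variance around the mean
    ss_res = sum((y - p) ** 2 for y, p in zip(actual, predicted))     # residual variance
    return 1.0 - ss_res / ss_tot

# Hypothetical log readings vs. model predictions (values invented for illustration)
measured  = [3.0, 5.0, 7.0, 9.0]
predicted = [2.8, 5.3, 7.1, 8.8]
print(round(r_squared(measured, predicted), 3))  # values near 1.0 indicate a good fit
```

A value of 1.0 means the prediction reproduces the measured log exactly; values near zero mean the model does no better than predicting the mean.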
Green areas are among the systems that play a vital role in achieving the environmental dimension of sustainability, alongside the socio-cultural and economic dimensions, through the hidden value of ecosystem services. However, many developing countries are characterized by low community environmental awareness, while the basic need for land for housing and other uses takes precedence over nature-protection strategies. In the absence of clear, long-term planning strategies, this has led to abuses and violations of urban land use. In Iraq, the situation became more pronounced due to the political, security, and social conditions that followed 2003. Hence, the research…
The aim of this research is to show the degree of implementation of ISO 26000 (the social responsibility standard), specifically clause six (consumer issues). The study was conducted at the Market Research and Consumer Protection Center (MRCPC) / University of Baghdad. The seven consumer issues of ISO 26000 were analyzed to determine the extent of their implementation at MRCPC, using a checklist as the principal instrument for collecting research data and information. Results were analyzed using percentages and mean averages. The research yielded several findings, the most important being that the center's degree of implementation of the consumer issues given in the standard was medium.
The aim of this research is to measure and analyze the gap between the actual situation and the requirements of the environmental management system according to ISO 14001:2015 at the Middle Refineries Company / refinery cycle, as well as to measure the availability of a clean production strategy and to test the relationship and impact between the availability of the standard's requirements and a clean production strategy in the company's actual situation.
The research problem was framed by two questions: To what extent are the requirements of the environmental management system according to ISO 14001:2015 applied in the Middle Refineries Company? And to what extent are the required clean production strategies available…
Abstract
The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum-likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, different initial parameter values, and different levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
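The abstract does not include the authors' implementation, and the four-parameter exponential Weibull-Poisson likelihood is involved. As a hedged sketch of the same idea, the following fits a plain two-parameter Weibull distribution by minimizing its negative log-likelihood with a from-scratch downhill simplex (Nelder-Mead); all names, starting values, and sample sizes are illustrative, not taken from the paper.

```python
import math
import random

def nelder_mead(f, x0, step=0.5, iters=300):
    """Minimal downhill simplex (Nelder-Mead) minimizer, illustrative only."""
    n = len(x0)
    simplex = [list(x0)]
    for i in range(n):                      # initial simplex: x0 plus one offset per axis
        p = list(x0)
        p[i] += step
        simplex.append(p)
    for _ in range(iters):
        simplex.sort(key=f)
        worst = simplex[-1]
        c = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]  # centroid of best n
        xr = [c[i] + (c[i] - worst[i]) for i in range(n)]            # reflection
        if f(simplex[0]) <= f(xr) < f(simplex[-2]):
            simplex[-1] = xr
        elif f(xr) < f(simplex[0]):                                  # expansion
            xe = [c[i] + 2.0 * (xr[i] - c[i]) for i in range(n)]
            simplex[-1] = xe if f(xe) < f(xr) else xr
        else:                                                        # contraction
            xc = [c[i] + 0.5 * (worst[i] - c[i]) for i in range(n)]
            if f(xc) < f(worst):
                simplex[-1] = xc
            else:                                                    # shrink toward best
                best = simplex[0]
                simplex = [best] + [
                    [best[i] + 0.5 * (p[i] - best[i]) for i in range(n)]
                    for p in simplex[1:]
                ]
    simplex.sort(key=f)
    return simplex[0]

def weibull_nll(log_params, data):
    """Negative log-likelihood of a 2-parameter Weibull; log-params keep k, lam > 0."""
    k, lam = math.exp(log_params[0]), math.exp(log_params[1])
    return -sum(math.log(k / lam) + (k - 1.0) * math.log(x / lam) - (x / lam) ** k
                for x in data)

# Simulated uncontaminated sample via inverse-CDF with true shape k=2, scale lam=3
random.seed(0)
data = [3.0 * (-math.log(1.0 - random.random())) ** 0.5 for _ in range(500)]
fit = nelder_mead(lambda p: weibull_nll(p, data), [0.0, 0.0])
k_hat, lam_hat = math.exp(fit[0]), math.exp(fit[1])
print(k_hat, lam_hat)  # estimates should land near the true values (2, 3)
```

Optimizing over log-parameters keeps the shape and scale positive without constrained optimization, which is one common way to apply the simplex method to likelihoods with positivity constraints.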
Background: Xanthomatosis is a disease in which large tendon tumors can occur, especially in the Achilles tendon. It is a rare and interesting orthopaedic condition.
Case Report: A case of a twenty-eight-year-old female patient with giant bilateral Achilles tendon xanthomas in which both tumors were resected.
There was no ulceration on either side. The patient was treated by total resection of the lesions and reconstruction using tendon transfers of the Peroneus brevis and Flexor hallucis longus. Postoperative treatment consisted of six weeks of lower-leg cast immobilization followed by partial weight bearing. After four months the patient was able to walk pain-free without any difficulties.
In the last two decades, networks have changed in response to rapidly evolving requirements. Current data center networks contain large numbers of hosts (tens of thousands) with special bandwidth needs, as cloud networking and multimedia content computing grow. Conventional Data Center Networks (DCNs) are strained by the increasing number of users and bandwidth requirements, which in turn impose many implementation limitations. Current networking devices, with their coupled control and forwarding planes, result in network architectures that are not suitable for dynamic computing and storage needs. Software-Defined Networking (SDN) was introduced to change this notion of traditional networks by decoupling the control and forwarding planes…
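The decoupling described above can be made concrete with a toy sketch (hypothetical classes, not a real SDN controller API such as OpenFlow): the switch only matches packets against a flow table, while a logically centralized controller computes routes and installs the rules.

```python
class Switch:
    """Forwarding plane: a dumb match-action table keyed on destination."""
    def __init__(self):
        self.flow_table = {}

    def install_rule(self, dst, out_port):
        self.flow_table[dst] = out_port

    def forward(self, dst):
        # Table miss -> punt to the controller, as in OpenFlow-style designs
        return self.flow_table.get(dst, "to-controller")

class Controller:
    """Control plane: routing policy lives here, not in each device."""
    def __init__(self, switch, topology):
        self.switch = switch
        self.topology = topology  # dst -> port; stands in for a real route computation

    def handle_miss(self, dst):
        port = self.topology[dst]
        self.switch.install_rule(dst, port)  # push state down to the data plane
        return port

sw = Switch()
ctrl = Controller(sw, topology={"10.0.0.2": 2})
first = sw.forward("10.0.0.2")   # miss: no rule installed yet
ctrl.handle_miss("10.0.0.2")     # controller computes and installs the rule
second = sw.forward("10.0.0.2")  # hit: handled entirely in the data plane
print(first, second)  # to-controller 2
```

The point of the sketch is that forwarding behavior can be changed network-wide by updating the controller's policy, without touching the switches' logic, which is what makes SDN suitable for dynamic computing and storage needs.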
ABSTRACT:
The study aims at expounding the correlation and effect between human resource development strategy and the quality of municipal service, within a theoretical framework and a practical study conducted at the Directorate of Municipalities in holy Karbala. During a pilot study, the researcher found that the Directorate of Municipalities does not pay enough attention to developing its human resources through one strategy or a number of strategies, nor to their effect on the quality of municipal service. Thus, a number of research questions were set concerning the existence of a clear perception in the Directorates of Municipalities regarding the strategies for developing both the human resources and the quality…
The study aimed to determine the effect of die speed and die-hole diameter on feed pellet quality. Pellet direct measurement (%), pellet length (%), pellet durability (%), and pellet water absorption (%) were measured. Three die speeds (280, 300, and 320 rpm) and three die-hole diameters (3, 4, and 5 mm) were used. The results showed that increasing the die speed from 280 to 300 and then to 320 rpm led to a significant decrease in direct measurement and pellet durability, while pellet water absorption increased; pellet length was not significantly affected. Increasing the die-hole diameter from 3 to 4 and then to 5 mm led to a significant decrease…