The investigation of machine learning techniques for addressing missing well-log data has garnered considerable interest recently, especially as the oil and gas sector pursues novel approaches to improve data interpretation and reservoir characterization. However, for wells that have been in operation for several years, conventional measurement techniques frequently face availability challenges, including missing well-log data, cost constraints, and precision issues. The objective of this study is to enhance reservoir characterization by automating well-log generation with machine-learning techniques, among them multi-resolution graph-based clustering (MRGC) and the similarity threshold method. By applying these techniques, our methodology shows a notable improvement in the precision and effectiveness of well-log predictions. Standard well logs from a reference well were used to train the machine-learning models, and conventional wireline logs served as input to estimate facies for unclassified wells lacking core data. R-squared analysis and goodness-of-fit tests provide a numerical assessment of model performance, strengthening the validation. The multi-resolution graph-based clustering and similarity threshold approaches achieved an accuracy of nearly 98%. Applying these techniques to data from eighteen wells produced precise results, demonstrating the effectiveness of our approach in improving the reliability and quality of well-log generation.
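The workflow described above (train on a cored reference well, predict for uncored wells, validate with R-squared) can be sketched as follows. The file name, column layout, and the generic regressor are illustrative assumptions only; the paper's MRGC and similarity-threshold methods are specialised facies algorithms not reproduced here.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Hypothetical file and column layout: gamma ray, resistivity, neutron
# porosity, and bulk density as inputs; sonic (DT) as the log to reconstruct.
data = np.loadtxt("reference_well_logs.csv", delimiter=",", skiprows=1)
X, y = data[:, :4], data[:, 4]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# A generic regressor stands in for the paper's MRGC / similarity-threshold
# methods; the point is the train-on-reference-well, validate-with-R^2 loop.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# R-squared on held-out depths, matching the paper's goodness-of-fit check.
print("R^2:", r2_score(y_test, model.predict(X_test)))
```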
With the development of cloud computing in recent years, data center networks have become a major topic in both industry and academia. However, traditional approaches based on manual configuration and dedicated hardware devices are burdensome, expensive, and cannot fully exploit the capability of the physical network infrastructure. Software-Defined Networking (SDN) has therefore been promoted as one of the most promising solutions for future Internet performance. SDN is notable for two features: the separation of the control plane from the data plane, and enabling network development through programmable capabilities instead of hardware solutions. This paper introduces an SDN-based optimized Resch
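The control/data-plane separation the abstract highlights can be illustrated with a minimal OpenFlow controller application. The sketch below uses the Ryu framework purely as an example of programmable, software-side forwarding decisions; it is not the optimization proposed in the paper, whose description is truncated above.

```python
from ryu.base import app_manager
from ryu.controller import ofp_event
from ryu.controller.handler import MAIN_DISPATCHER, set_ev_cls
from ryu.ofproto import ofproto_v1_3

class FloodSwitch(app_manager.RyuApp):
    """Control-plane logic in software; switches only forward packets."""
    OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

    @set_ev_cls(ofp_event.EventOFPPacketIn, MAIN_DISPATCHER)
    def packet_in_handler(self, ev):
        msg = ev.msg
        dp = msg.datapath
        ofp, parser = dp.ofproto, dp.ofproto_parser
        # Decision made in the controller: flood unknown traffic. A real
        # application would learn MACs and install flow rules in the data plane.
        actions = [parser.OFPActionOutput(ofp.OFPP_FLOOD)]
        data = msg.data if msg.buffer_id == ofp.OFP_NO_BUFFER else None
        out = parser.OFPPacketOut(datapath=dp, buffer_id=msg.buffer_id,
                                  in_port=msg.match['in_port'],
                                  actions=actions, data=data)
        dp.send_msg(out)
```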
One wide-ranging category of open-source data is that of geospatial information websites. Despite the advantages of such open-source data, including ease of access and zero cost, its quality is a potential concern. This article tests the horizontal positional accuracy, and the possible integration, of four web-derived geospatial datasets: OpenStreetMap (OSM), Google Maps, Google Earth, and Wikimapia. The evaluation was carried out by comparing the tested data against reference field-survey data for fifty road intersections in Baghdad, Iraq. The results indicate that free geospatial data can be used to enhance authoritative maps, especially small-scale maps.
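A minimal sketch of the underlying accuracy assessment, assuming projected (easting, northing) coordinates in metres (e.g., UTM zone 38N for Baghdad): the horizontal error at each surveyed intersection is the Euclidean distance between the web-derived and field-surveyed positions, summarised as an RMSE. The coordinate values below are placeholders.

```python
import math

# Hypothetical checkpoint pairs: (surveyed point, web-derived point)
# for each road intersection, as (easting, northing) in metres.
pairs = [
    ((444120.3, 3685207.8), (444122.1, 3685209.0)),
    ((445310.9, 3686001.2), (445309.5, 3686004.6)),
    # ... remaining intersections
]

# Horizontal positional error per checkpoint, then the RMSE over all
# checkpoints, in the usual NSSDA-style accuracy assessment.
errors = [math.hypot(wx - sx, wy - sy) for (sx, sy), (wx, wy) in pairs]
rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
print(f"RMSE of horizontal error: {rmse:.2f} m")
```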
Cloud computing represents the most important shift in computing and information technology (IT). However, security and privacy remain the main obstacles to its widespread adoption. In this research we review the security and privacy challenges that affect critical data in cloud computing and identify solutions used to address them. The questions that need answers concern: (a) user access management; (b) protecting the privacy of sensitive data; (c) identity anonymity to protect the identity of the user and the data file. To answer these questions, a systematic literature review was conducted, together with structured interviews with several security experts working on cloud computing security, to investigate the main objectives of propo
The Internet provides vital communications between millions of individuals. It is also increasingly used as a commerce tool; thus, security is of high importance for securing communications and protecting vital information. Cryptographic algorithms are essential in the field of security. Brute-force attacks are the major attacks on the Data Encryption Standard, which is the main reason an improved structure of the Data Encryption Standard algorithm is needed. This paper proposes a new, improved structure for the Data Encryption Standard to make it secure and immune to attacks. The improved structure was accomplished using standard Data Encryption Standard with a new way of two key gene
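Since the description of the paper's two-key generation scheme is truncated above, the sketch below shows only the baseline single-key DES it builds on, using the pycryptodome package; the key and message are placeholders.

```python
from Crypto.Cipher import DES  # pycryptodome: standard single-key DES
from Crypto.Random import get_random_bytes

key = get_random_bytes(8)       # 64-bit key (56 effective key bits)
cipher = DES.new(key, DES.MODE_ECB)

plaintext = b"8bytemsg"         # DES operates on 64-bit (8-byte) blocks
ciphertext = cipher.encrypt(plaintext)

# Round-trip check with a fresh cipher object under the same key.
assert DES.new(key, DES.MODE_ECB).decrypt(ciphertext) == plaintext
```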
This study aimed to analyze and measure the relationship between oil revenues and financial sustainability in Iraq. The study used the inductive and deductive approaches, accompanied by quantitative and analytical methods, based on two variables, oil revenues and the net general budget, with annual data covering the period 1990-2013. Among the most important findings: the time series of the study variables contain a unit root and are not stationary in levels, becoming stationary after mathematical transformation, namely taking the first difference of the natural logarithm of each series. The Johansen method points to a long-term relationship between oil revenues and ne
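The unit-root and cointegration steps the abstract describes can be sketched as follows, assuming a hypothetical CSV of the two annual series; the file and column names are illustrative.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller
from statsmodels.tsa.vector_ar.vecm import coint_johansen

# Hypothetical annual series, 1990-2013: oil revenues and net general budget.
df = pd.read_csv("iraq_fiscal_1990_2013.csv", index_col="year")

# ADF unit-root test on log levels, then on first differences of the logs,
# mirroring the paper's stabilisation via differencing the natural logarithm.
for col in ["oil_revenues", "net_budget"]:
    log_series = np.log(df[col])
    print(col, "level p-value:", adfuller(log_series)[1])
    print(col, "diff p-value:", adfuller(log_series.diff().dropna())[1])

# Johansen test for a long-run (cointegrating) relationship between the series.
result = coint_johansen(np.log(df[["oil_revenues", "net_budget"]]),
                        det_order=0, k_ar_diff=1)
print("Trace statistics:", result.lr1)
print("5% critical values:", result.cvt[:, 1])
```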
Laurylamine hydrochloride, CH3(CH2)11NH3Cl, was chosen from the cationic surfactants to produce secondary oil using the laboratory model shown in Fig. 1. The relationships between interfacial tension and temperature, salinity, and solution concentration were studied, as shown in Figs. 2, 3, and 4, respectively. The optimum values of these three variables are those that give the lowest interfacial tension. Saturation, permeability, and porosity were measured in the laboratory. The primary oil was displaced by water injection until no more oil could be obtained; laurylamine chloride was then injected for secondary oil recovery. The total oil recovery is 96.6%, i.e., 88.8% of the residual oil was recovered by this technique, as shown
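As a worked consistency check of the reported figures (assuming 96.6% is total recovery of the original oil in place and 88.8% is the fraction of residual oil recovered by the surfactant flood), the recovery achieved before laurylamine injection can be backed out:

```python
# Hypothetical reading of the reported numbers: if total recovery is 96.6%
# of original oil in place, and the surfactant flood recovered 88.8% of the
# oil remaining after the water flood, then
#   total = prior + 0.888 * (1 - prior)
prior = (0.966 - 0.888) / (1 - 0.888)
print(f"Implied recovery before surfactant injection: {prior:.1%}")  # ~69.6%
```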