A skip list is essentially a randomized simulation of a binary search tree. Skip list algorithms are simpler and faster than balanced-tree algorithms and use less space. Conceptually, the data structure uses parallel sorted linked lists. Searching in a skip list is faster than searching in a regular sorted linked list, because the higher-level lists let a search skip over many nodes at a time. Because a skip list is a two-dimensional data structure, it is implemented using a two-dimensional network of nodes with four pointers, and the search, insert, and delete operations take expected time of up to O(log n). The skip list can be modified to implement the order-statistic operations RANK and SEARCH BY RANK while maintaining the same expected time. Keywords: skip list, parallel linked list, randomized algorithm, rank.
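As an illustration of the structure described above, here is a minimal Python sketch of skip-list search and insert. It uses a per-node array of forward pointers (one per level) rather than the four-pointer node layout mentioned in the abstract, and promotes nodes with probability 1/2, which is what yields the expected O(log n) bounds; all names are illustrative.

```python
import random

class Node:
    """A skip-list node; forward[i] points to the next node at level i."""
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)

class SkipList:
    MAX_LEVEL = 16
    P = 0.5  # probability of promoting a node one level up

    def __init__(self):
        self.header = Node(None, self.MAX_LEVEL)
        self.level = 0

    def _random_level(self):
        lvl = 0
        while random.random() < self.P and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def search(self, key):
        x = self.header
        # Start at the top level; drop down whenever the next key overshoots.
        for i in range(self.level, -1, -1):
            while x.forward[i] is not None and x.forward[i].key < key:
                x = x.forward[i]
        x = x.forward[0]
        return x is not None and x.key == key

    def insert(self, key):
        update = [self.header] * (self.MAX_LEVEL + 1)
        x = self.header
        # Record, at each level, the last node before the insertion point.
        for i in range(self.level, -1, -1):
            while x.forward[i] is not None and x.forward[i].key < key:
                x = x.forward[i]
            update[i] = x
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        node = Node(key, lvl)
        for i in range(lvl + 1):
            node.forward[i] = update[i].forward[i]
            update[i].forward[i] = node

if __name__ == "__main__":
    sl = SkipList()
    for k in [3, 7, 1, 9, 5]:
        sl.insert(k)
    print(sl.search(7), sl.search(4))  # True False
```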
This study investigates the feasibility of a mobile robot navigating and discovering its location in unknown environments, followed by the creation of maps of these navigated environments for future use. First, a real mobile robot, the TurtleBot3 Burger, was used to perform the simultaneous localization and mapping (SLAM) technique in a complex environment with 12 obstacles of different sizes, based on the Rviz library, which is built on the Robot Operating System (ROS) running on Linux. The robot can be controlled and the process performed remotely by using an Amazon Elastic Compute Cloud (Amazon EC2) instance service. Then, the map was uploaded to the Amazon Simple Storage Service (Amazon S3) cloud. This provides a database
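A minimal sketch of the map-upload step, assuming the boto3 AWS SDK for Python; the bucket name, object keys, and file names are hypothetical. ROS's map_saver typically writes a .pgm occupancy grid plus a .yaml metadata file, and both would be uploaded.

```python
import boto3

# Hypothetical names: bucket "slam-maps-bucket" and the files written by
# the ROS map_saver node for the mapped environment.
s3 = boto3.client("s3")
for local, key in [
    ("maps/complex_env.pgm", "turtlebot3/complex_env.pgm"),    # occupancy grid
    ("maps/complex_env.yaml", "turtlebot3/complex_env.yaml"),  # map metadata
]:
    s3.upload_file(Filename=local, Bucket="slam-maps-bucket", Key=key)
```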
Potential-field data interpretation is significant for subsurface structure characterization. The current study is an attempt to explore the magnetic low lying between the Najaf and Diwaniyah cities in central Iraq. It aims to understand the subsurface structures that may give rise to this anomaly and to present a better subsurface structural image of the region. The study area is situated in the transition zone known as the Abu Jir Fault Zone. This tectonic boundary is an inherited basement weak zone extending in the NW-SE direction. Gravity and magnetic data processing and enhancement techniques, namely the Total Horizontal Gradient, Tilt Angle, Fast Sigmoid Edge Detection, Improved Logistic, and Theta Map filters, highlight source boundaries and the
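As an illustration of two of the enhancement filters named above, the sketch below computes the Total Horizontal Gradient, THG = sqrt((dF/dx)^2 + (dF/dy)^2), and the Tilt Angle, TA = arctan((dF/dz) / THG), for a gridded field. The Fourier-domain |k| operator used for the vertical derivative and the grid spacings are standard assumptions, not taken from the study.

```python
import numpy as np

def thg_and_tilt(field, dx=1.0, dy=1.0):
    """Total Horizontal Gradient and Tilt Angle of a gridded potential field.
    `field` is a 2-D array of magnetic (or gravity) values on a regular grid;
    the grid spacings dx, dy are illustrative defaults."""
    gy, gx = np.gradient(field, dy, dx)   # horizontal derivatives
    thg = np.hypot(gx, gy)                # Total Horizontal Gradient

    # Vertical derivative via the Fourier-domain |k| operator.
    ny, nx = field.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dy)
    k = np.hypot(*np.meshgrid(kx, ky))
    gz = np.real(np.fft.ifft2(np.fft.fft2(field) * k))

    tilt = np.arctan2(gz, thg)            # Tilt Angle in radians
    return thg, tilt
```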
Cloud computing (CC) is a fast-growing technology that offers computing, networking, and storage services that can be accessed and used over the internet. Cloud services save users money because they are pay-per-use, and they save time because they are on-demand and elastic, a unique aspect of cloud computing. However, several security issues must be addressed before users store data in the cloud. Because users have no direct control over data that has been outsourced to the cloud, particularly personal and sensitive data (health, finance, military, etc.), and do not know where it is stored, they must ensure that the cloud stores and maintains the outsourced data appropriately. The study's primary goals are to mak
In this research, a factorial experiment (4*4) applied in a randomized complete block design was studied, with a size of observations. The design of experiments is used to study the effect of treatments on experimental units and thus to obtain data representing the experiment's observations. Applying these treatments under different environmental and experimental conditions causes noise that affects the observed values and thus increases the mean square error of the experiment. To reduce this noise, multilevel wavelet shrinkage was used as a filter for the observations, by proposing an improved threshold that takes into account the different transformation levels based on the logarithm of the b
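A minimal sketch of multilevel wavelet shrinkage of the kind described: the observations are decomposed, each detail level is soft-thresholded with a level-dependent threshold, and the signal is reconstructed. The particular log-based threshold below is an illustrative stand-in, not the improved threshold proposed in the research.

```python
import numpy as np
import pywt

def denoise(obs, wavelet="db4", level=3):
    """Level-dependent soft-threshold wavelet shrinkage (illustrative)."""
    coeffs = pywt.wavedec(obs, wavelet, level=level)
    n = len(obs)
    # Noise scale estimated from the finest detail coefficients (MAD rule).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    out = [coeffs[0]]  # approximation coefficients are kept untouched
    for j, detail in enumerate(coeffs[1:], start=1):
        # Illustrative threshold: shrinks with the log of the level index.
        thr = sigma * np.sqrt(2 * np.log(n)) / np.log2(level - j + 2)
        out.append(pywt.threshold(detail, thr, mode="soft"))
    return pywt.waverec(out, wavelet)[:n]
```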
Cloud storage provides scalable, low-cost resources featuring economies of scale based on a cross-user architecture. As the amount of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. The most important cloud service is data storage. To protect the privacy of data owners, data are stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication, which is crucial for data storage. Traditional deduplication schemes cannot work on encrypted data, and existing solutions for encrypted data deduplication suffer from security weaknesses. This paper proposes a combined compressive sensing and video deduplication scheme to maximize
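The paper's compressive-sensing scheme is not reproduced here; as background on the problem it addresses, the sketch below shows the classical convergent-encryption baseline for deduplicating encrypted data: the key is derived from the content itself, so identical plaintexts produce identical ciphertexts that the server can deduplicate by tag. All names are illustrative.

```python
import hashlib
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def convergent_encrypt(data: bytes):
    """Message-locked (convergent) encryption baseline, not the paper's scheme."""
    key = hashlib.sha256(data).digest()   # content-derived 256-bit key
    # A fixed zero counter is acceptable here because each key is used to
    # encrypt exactly one message (the one it was derived from).
    enc = Cipher(algorithms.AES(key), modes.CTR(b"\x00" * 16)).encryptor()
    ciphertext = enc.update(data) + enc.finalize()
    tag = hashlib.sha256(ciphertext).hexdigest()  # server-side dedup index
    return key, ciphertext, tag
```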
Abstract:
Research Topic: Ruling on the sale of big data
Its objectives: to state what big data is, along with its importance, its sources, and its ruling.
The methodology is inductive, comparative, and critical.
One of the most important results: big data is valuable property that may not be infringed upon, and it is permissible to sell big data as long as it does not contain the data of users who have not consented to its sale.
Recommendation: follow-up studies dealing with the rulings on this issue.
Subject Terms
Judgment, Sale, Data, Mega, Sayings, Jurists
Water flooding is one of the most important methods used in enhanced oil production; it was the pioneering method in use, but the development of technology within the oil industry has taken this subject toward new forms of oil production and application in oil fields with all types of oils and oil reservoirs. Nowadays, most injection wells are being converted from vertical wells to re-entry full horizontal wells in order to gain the full advantages of horizontal wells.
This paper describes the potential benefits of using re-entry horizontal injection wells, as well as a combination of re-entry horizontal injection and production wells. The Al Qurainat productive sector was selected for the study; it is one of the four main productive sectors of Sout
This study presents the determination of the paleostress magnitudes and orientations of the Bekhme Structure in the Shaqlawa area, northeastern Iraq. Paleostress analysis of fault-slip measurements is performed using the right dihedral, Lisle diagram, and Mohr circle methods. Based on Mohr circles, Bott's law, and the vertical thickness, the magnitudes of the paleostress at the time of the tectonic activity were determined. First, Georient software was used to estimate the orientations of the paleostresses (σ1, σ2, and σ3). Second, using the rupture-friction law and taking into account the depth of the overburden, the vertical stress (σv) was calculated to determine the magnitudes of the paleostresses (σ1=4500 bars, σ2=1
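As a worked illustration of the overburden calculation mentioned above, the vertical stress follows σv = ρgh; the density and depth values below are assumed for illustration and are not taken from the study.

```python
# Illustrative overburden (vertical) stress: sigma_v = rho * g * h.
rho = 2550.0   # average overburden density, kg/m^3 (assumed)
g = 9.81       # gravitational acceleration, m/s^2
h = 5000.0     # overburden depth, m (assumed)

sigma_v_pa = rho * g * h           # vertical stress in pascals
sigma_v_bar = sigma_v_pa / 1e5     # 1 bar = 1e5 Pa
print(f"sigma_v = {sigma_v_bar:.0f} bars")  # ~1251 bars for these inputs
```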
The liver of the marsh harrier grossly appeared as a large, bi-lobed organ divided into left and right lobes, which are approximately equal in size and not divided into secondary lobes. Histologically, the liver of the marsh harrier was found to contain numerous lobules that are not well defined by the connective tissue of the septa, except that surrounding the portal triads. The parenchyma of the liver is composed of irregular branching cords of hepatocytes organized in double rows, alternating with tortuous sinusoids that are lined with flattened endothelial cells and large, irregularly outlined Kupffer cells. The hepatic cords are arranged in a radial pattern around the central vein of the liver lobule, while in the subcapsular region they run parallel to the ca