A skip list is, in effect, a probabilistic simulation of a balanced binary search tree. Skip list algorithms are simpler and faster and use less space. Conceptually, this data structure consists of parallel sorted linked lists. Searching in a skip list is faster than searching in a regular sorted linked list, because the upper lists let the search skip over many nodes at once. Because a skip list is a two-dimensional data structure, it is implemented as a two-dimensional network of nodes, each holding four pointers. The search, insert, and delete operations take expected O(log n) time. The skip list can be modified to implement the order-statistic operations RANK and SEARCH-BY-RANK while maintaining the same expected time.
Keywords: skip list, parallel linked list, randomized algorithm, rank.
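The search-and-insert mechanism described above can be sketched as follows. This is a minimal illustrative implementation, not the paper's own code; the node layout (one forward pointer per level rather than four pointers per node), the level cap, and the promotion probability are all assumptions made for brevity.

```python
import random

class Node:
    """A skip-list node with a key and one forward pointer per level."""
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)

class SkipList:
    MAX_LEVEL = 16   # illustrative cap on list height
    P = 0.5          # probability of promoting a node one level up

    def __init__(self):
        self.level = 0
        self.head = Node(None, self.MAX_LEVEL)

    def _random_level(self):
        lvl = 0
        while random.random() < self.P and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def search(self, key):
        node = self.head
        # Start in the sparsest list; drop down a level on overshoot.
        for i in range(self.level, -1, -1):
            while node.forward[i] is not None and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node is not None and node.key == key

    def insert(self, key):
        # Record, per level, the last node before the insertion point.
        update = [self.head] * (self.MAX_LEVEL + 1)
        node = self.head
        for i in range(self.level, -1, -1):
            while node.forward[i] is not None and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = Node(key, lvl)
        for i in range(lvl + 1):
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new
```

Because each node is promoted with probability 1/2, the expected number of levels is logarithmic, which is what yields the expected O(log n) search path.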
In this research, a 4×4 factorial experiment applied in a completely randomized block design was studied, with a fixed number of observations. The design of experiments is used to study the effect of treatments on experimental units and thus to obtain data representing the experiment's observations. Applying these treatments under different environmental and experimental conditions causes noise that affects the observed values and thereby increases the mean square error of the experiment. To reduce this noise, multiple wavelet shrinkage was used as a filter for the observations, by proposing an improved threshold that accounts for the different transformation levels based on the logarithm of the b
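The general idea of wavelet shrinkage as a noise filter can be sketched as below. This is a generic one-level Haar-transform soft-threshold example, not the paper's improved level-dependent threshold (which is truncated in the abstract); the threshold value here is an arbitrary illustration.

```python
import numpy as np

def haar_step(x):
    """One level of the Haar wavelet transform: (approximation, detail)."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def inverse_haar_step(approx, detail):
    """Invert one Haar step, reconstructing the original samples."""
    x = np.empty(approx.size * 2)
    x[0::2] = (approx + detail) / np.sqrt(2)
    x[1::2] = (approx - detail) / np.sqrt(2)
    return x

def soft_threshold(d, t):
    """Shrink detail coefficients toward zero; small ones become 0."""
    return np.sign(d) * np.maximum(np.abs(d) - t, 0.0)

def denoise(x, t):
    """Filter observations by thresholding the detail coefficients."""
    a, d = haar_step(x)
    return inverse_haar_step(a, soft_threshold(d, t))
```

Detail coefficients mostly carry the high-frequency noise, so shrinking them while keeping the approximation reduces the mean square error of the filtered observations.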
The intensity distribution of comet ISON C/2013 is studied by taking its histogram. This distribution reveals four distinct regions related to the background, tail, coma, and nucleus. A one-dimensional temperature-distribution fit is achieved using two mathematical equations related to the coordinates of the comet's center. The quiver plot of the comet's intensity gradient shows very clearly that the arrows point toward the maximum intensity of the comet.
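The gradient-quiver observation can be reproduced on a synthetic image as sketched below. The Gaussian "comet", its peak position, and the image size are all assumptions for illustration; `matplotlib.pyplot.quiver(gx, gy)` could then display the arrows described in the abstract.

```python
import numpy as np

def intensity_gradient(image):
    """Return the (row, column) components of the intensity gradient.
    np.gradient differentiates along axis 0 (y) first, then axis 1 (x)."""
    gy, gx = np.gradient(np.asarray(image, dtype=float))
    return gy, gx

# Synthetic "comet": a 2-D Gaussian peak standing in for the nucleus.
y, x = np.mgrid[0:64, 0:64]
cy, cx = 40, 24            # assumed peak position (illustrative)
image = np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / (2 * 8.0 ** 2))

gy, gx = intensity_gradient(image)
# Left of the peak the x-component is positive: the arrow points
# rightward, i.e. toward the maximum intensity, as the abstract notes.
```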
Abstract:
Research Topic: Ruling on the sale of big data
Its objectives: a statement of what big data is, its importance, its sources, and its governing rules.
Its methodology: inductive, comparative, and critical.
One of the most important results: big data may not be unlawfully appropriated, since it constitutes valuable property, and selling big data is permissible so long as it does not contain data of users who have not consented to its sale.
Recommendation: follow up studies dealing with the rulings on this issue.
Subject Terms
Ruling, Sale, Data, Big Data, Opinions, Jurists
Air pollution refers to the release of pollutants into the air that are detrimental to human health and to the planet as a whole. In this research, measurements of air-pollutant concentrations, such as Total Suspended Particles (TSP), Carbon Monoxide (CO), and Carbon Dioxide (CO2), together with meteorological parameters including temperature (T), relative humidity (RH), and wind speed and direction, were conducted in Baghdad city at 22 measuring stations located in different regions and classified as industrial, commercial, and residential stations. Using the Arc-GIS program (spatial analysis), different maps have been prepared for the distribution of different pollutant
The Sonic Scanner is a multifunctional instrument designed to log wells, assess elastic characteristics, and support reservoir characterisation. Furthermore, it facilitates the comprehension of rock mechanics, gas detection, and well positioning, while also furnishing data for geomechanical computations and sand management. The present work involved applying the Sonic Scanner to both basic and advanced processing of an oil well penetrating carbonate media. The study aimed to characterize the compressional, shear, and Stoneley slownesses, the rock mechanical properties, and the shear anisotropy of the formation. Except for intervals where significant washouts are encountered, the data quality of the Monopole, Dipole, and Stoneley modes is gen
Survival analysis is widely applied to data describing the lifetime of an item until the occurrence of an event of interest, such as death or another event under study. The purpose of this paper is to use a dynamic approach within the deep-learning neural-network method, in which a dynamic neural network is built to suit the nature of discrete survival data with time-varying effects. This neural network is trained with the Levenberg-Marquardt (L-M) algorithm, and the method is called the Proposed Dynamic Artificial Neural Network (PDANN). A comparison was then made with another method that depends entirely on the Bayesian methodology, called the Maximum A Posteriori (MAP) method. This method was carried out using numerical algorithms re
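The discrete-survival setting above can be sketched with a minimal per-interval hazard model. This is not the paper's PDANN: it is a single-layer logistic hazard fitted by plain gradient descent rather than Levenberg-Marquardt, and the person-period data layout is an assumption made for illustration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_discrete_hazard(X, event, epochs=2000, lr=0.1):
    """Fit h(x) = sigmoid(w.x + b), the per-interval hazard, by gradient
    descent on the Bernoulli negative log-likelihood.
    X: (n, p) person-period covariates; event: (n,) 0/1 event indicator.
    (The paper trains a dynamic network with Levenberg-Marquardt; plain
    gradient descent is substituted here for brevity.)"""
    n, p = X.shape
    w = np.zeros(p)
    b = 0.0
    for _ in range(epochs):
        h = sigmoid(X @ w + b)
        grad = h - event            # derivative of the loss w.r.t. the logit
        w -= lr * (X.T @ grad) / n
        b -= lr * grad.mean()
    return w, b
```

Each row is one subject-interval, so the Bernoulli likelihood of the event indicator is exactly the discrete-time survival likelihood; time-varying effects enter by including interval-dependent covariates in `X`.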
Shear and compressional wave velocities, coupled with other petrophysical data, are vital for determining the magnitudes of the dynamic moduli in geomechanical studies and hydrocarbon-reservoir characterization. However, owing to field practices and high running costs, shear-wave velocity may not be available in all wells. In this paper, a statistical multivariate regression method is presented to predict the shear-wave velocity for the Khasib formation in the Amara oil fields, located in south-east Iraq, using well-log compressional-wave velocity, neutron porosity, and density. The accuracy of the proposed correlation has been compared to other correlations. The results show that the presented model provides accurate
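The multivariate regression step can be sketched as below. The linear model form, variable names, and units are assumptions; the paper's actual correlation and its coefficients are not reproduced here.

```python
import numpy as np

def fit_vs_model(vp, phi_n, rho, vs):
    """Fit Vs = b0 + b1*Vp + b2*phi_n + b3*rho by least squares.
    Inputs are 1-D arrays of well-log samples; returns the coefficients."""
    X = np.column_stack([np.ones_like(vp), vp, phi_n, rho])
    beta, *_ = np.linalg.lstsq(X, vs, rcond=None)
    return beta

def predict_vs(beta, vp, phi_n, rho):
    """Predict shear-wave velocity where it was not logged."""
    return beta[0] + beta[1] * vp + beta[2] * phi_n + beta[3] * rho
```

Once `beta` is calibrated on wells where both velocities were logged, `predict_vs` supplies the missing shear-wave velocity in wells that have only the compressional, porosity, and density logs.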
Cloud computing provides a huge amount of space for data storage, but with the increase in the number of users and the size of their data, cloud-storage environments face serious problems such as saving storage space, managing this large volume of data, and ensuring the security and privacy of data. To save storage space in the cloud, one important method is data deduplication, a compression technique that allows only one copy of each piece of data to be saved and eliminates the extra copies. To offer security and privacy for sensitive data while supporting deduplication, in this work attacks that exploit hybrid-cloud deduplication have been identified, allowing an attacker to gain access to the files of other users based on very small hash signatures of
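The deduplication technique itself can be sketched as a content-addressed store. This is a generic illustration, not the paper's system: the class and method names are invented, and it shows only the dedup mechanism, not the identified attack or its defense. It does hint at the attack surface, though: if possession of a short hash alone granted access to the stored blob, an attacker could retrieve another user's file.

```python
import hashlib

class DedupStore:
    """Store each unique blob once, keyed by its SHA-256 digest
    (illustrative sketch of content-addressed deduplication)."""
    def __init__(self):
        self.blobs = {}   # digest -> data, stored exactly once
        self.files = {}   # filename -> digest of its content

    def put(self, name, data):
        digest = hashlib.sha256(data).hexdigest()
        if digest not in self.blobs:      # deduplication: skip known content
            self.blobs[digest] = data
        self.files[name] = digest
        return digest

    def get(self, name):
        return self.blobs[self.files[name]]
```

Uploading the same content under two names keeps a single physical copy, which is exactly the space saving the abstract describes.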