Articles
Anomaly Detection in Flight Data Using the Naïve Bayes Classifier

Publication Date
Sun Feb 10 2019
Journal Name
Journal Of The College Of Education For Women
IMPLEMENTATION OF THE SKIP LIST DATA STRUCTURE WITH ITS UPDATE OPERATIONS

A skip list data structure is essentially a randomized simulation of a binary search tree. Skip list algorithms are simpler, faster, and use less space; conceptually, the data structure uses parallel sorted linked lists. Searching in a skip list is more involved than searching in a regular sorted linked list, because a skip list is a two-dimensional data structure: it is implemented as a two-dimensional network of nodes, each with four pointers. The search, insert, and delete operations take expected O(log n) time. The skip list can be modified to implement the order-statistic operations RANK and SEARCH BY RANK while maintaining the same expected time. Keywords: skip list, parallel linked list, randomized algorithm, rank.
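As a minimal, illustrative sketch of the idea (the class names, MAX_LEVEL, and promotion probability below are assumptions, not the paper's implementation), a skip list with randomized node levels supports expected O(log n) search and insert:

```python
import random

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)  # one forward pointer per level

class SkipList:
    MAX_LEVEL = 16
    P = 0.5  # probability of promoting a node one level up

    def __init__(self):
        self.head = Node(None, self.MAX_LEVEL)
        self.level = 0

    def _random_level(self):
        lvl = 0
        while random.random() < self.P and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def search(self, key):
        node = self.head
        for i in range(self.level, -1, -1):  # drop down a level on overshoot
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node is not None and node.key == key

    def insert(self, key):
        update = [self.head] * (self.MAX_LEVEL + 1)  # last node seen per level
        node = self.head
        for i in range(self.level, -1, -1):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = Node(key, lvl)
        for i in range(lvl + 1):  # splice the new node into each of its levels
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new

sl = SkipList()
for k in (3, 7, 1, 9):
    sl.insert(k)
print(sl.search(7), sl.search(4))  # True False
```

Each node is promoted one level with probability P, so roughly half the nodes appear on each successive level; the search drops down a level whenever the next forward pointer would overshoot the key.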

Publication Date
Sat Aug 01 2015
Journal Name
2015 Ieee Conference On Computational Intelligence In Bioinformatics And Computational Biology (cibcb)
Granular computing approach for the design of medical data classification systems
Publication Date
Tue Mar 01 2022
Journal Name
International Journal Of Nonlinear Analysis And Applications
The suggested threshold to reduce data noise for a factorial experiment

In this research, a 4×4 factorial experiment was studied, applied in a randomized complete block design with a given number of observations. The design of experiments is used to study the effect of treatments on experimental units and thus to obtain data representing the experiment's observations. Applying these treatments under different environmental and experimental conditions causes noise that affects the observed values and thus increases the mean square error of the experiment. To reduce this noise, multiple wavelet shrinkage was used as a filter for the observations, by suggesting an improved threshold that takes the different transformation levels into account based on the logarithm of the …
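To make the filtering step concrete, here is a minimal one-level Haar wavelet shrinkage sketch. The universal threshold σ√(2 ln n) used below is a textbook stand-in, not the improved level-dependent threshold the paper proposes, and the signal values are made up:

```python
import math

def haar_forward(x):
    """One-level Haar transform (assumes even length)."""
    approx = [(a + b) / math.sqrt(2) for a, b in zip(x[::2], x[1::2])]
    detail = [(a - b) / math.sqrt(2) for a, b in zip(x[::2], x[1::2])]
    return approx, detail

def haar_inverse(approx, detail):
    out = []
    for a, d in zip(approx, detail):
        out.append((a + d) / math.sqrt(2))
        out.append((a - d) / math.sqrt(2))
    return out

def soft(value, t):
    """Soft thresholding: shrink toward zero by t."""
    return math.copysign(max(abs(value) - t, 0.0), value)

def denoise(x):
    approx, detail = haar_forward(x)
    # Noise scale estimated from the detail coefficients (RMS here;
    # the median absolute deviation is the more robust textbook choice).
    sigma = (sum(d * d for d in detail) / len(detail)) ** 0.5
    t = sigma * math.sqrt(2 * math.log(len(x)))  # universal threshold
    detail = [soft(d, t) for d in detail]
    return haar_inverse(approx, detail)

signal = [1.0, 1.1, 0.9, 1.0, 5.0, 5.1, 4.9, 5.0]
print([round(v, 2) for v in denoise(signal)])
```

Thresholding the detail coefficients suppresses high-frequency noise while the approximation coefficients preserve the broad shape of the observations.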
Publication Date
Mon Feb 18 2019
Journal Name
Iraqi Journal Of Physics
Data visualization and distinct features extraction of the comet Ison 2013

The distribution of the intensity of the comet Ison C/2013 is studied by taking its histogram. This distribution reveals four distinct regions related to the background, tail, coma, and nucleus. A one-dimensional temperature distribution fit is achieved using two mathematical equations related to the coordinates of the comet's center. The quiver plot of the comet's gradient shows very clearly that the arrows point toward the maximum intensity of the comet.
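The two processing steps, an intensity histogram and a gradient (quiver) field, can be sketched on a toy image; the synthetic radial blob below stands in for the comet frame and is purely illustrative:

```python
def histogram(img, bins=8):
    """Counts of pixel intensities in equal-width bins."""
    flat = [v for row in img for v in row]
    lo, hi = min(flat), max(flat)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for v in flat:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    return counts

def gradient(img):
    """Central-difference gradient (gy, gx) at interior pixels."""
    h, w = len(img), len(img[0])
    g = [[(0.0, 0.0)] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gy = (img[y + 1][x] - img[y - 1][x]) / 2
            gx = (img[y][x + 1] - img[y][x - 1]) / 2
            g[y][x] = (gy, gx)
    return g

# Synthetic blob: intensity peaks at the center (4, 4).
size = 9
img = [[-((x - 4) ** 2 + (y - 4) ** 2) for x in range(size)] for y in range(size)]
g = gradient(img)
print(histogram(img))
print(g[4][2])  # (0.0, 4.0): the gradient at (4, 2) points toward the center
```

On such an image the gradient vectors everywhere point toward the brightest pixel, which is exactly what the quiver plot of the comet frame shows.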

Publication Date
Tue Jan 03 2023
Journal Name
College Of Islamic Sciences
Ruling on Selling Big Data (An Authentic Fiqh Study)

Abstract:

Research topic: the ruling on the sale of big data.

Objectives: to state what big data is, its importance, its sources, and its ruling.

Methodology: inductive, comparative, and critical.

Among the most important results: big data is valuable property that may not be infringed upon, and selling it is permissible as long as it does not contain data of users who have not consented to its sale.

Recommendation: follow-up studies dealing with the rulings on this issue.

Subject Terms

Judgment, Sale, Data, Mega, Sayings, Jurists

 

Publication Date
Sun Mar 01 2020
Journal Name
Iop Conference Series: Materials Science And Engineering
Prepare Maps For Greenhouse Gases With Some Weather Elements For Baghdad City Using Data Observation And Arc-GIS Techniques
Abstract: Air pollution refers to the release of pollutants into the air that are detrimental to human health and to the planet as a whole. In this research, measurements of air pollutant concentrations, such as Total Suspended Particles (TSP), Carbon Monoxide (CO), and Carbon Dioxide (CO2), together with meteorological parameters including temperature (T), relative humidity (RH), and wind speed and direction, were conducted in Baghdad city by 22 measuring stations located in different regions and classified into industrial, commercial, and residential stations. Using the Arc-GIS program (Spatial Analysis), different maps have been prepared for the distribution of the different pollutant …
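Surface maps of scattered station readings are commonly built by spatial interpolation; the sketch below uses inverse-distance weighting (IDW), one of the interpolators available in Arc-GIS Spatial Analyst, with made-up station coordinates and TSP readings (the paper's actual map settings are not specified here):

```python
def idw(stations, x, y, power=2):
    """Inverse-distance-weighted estimate at point (x, y).

    stations: list of (sx, sy, value) tuples.
    """
    num = den = 0.0
    for sx, sy, value in stations:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0:
            return value  # query point coincides with a station
        w = 1.0 / d2 ** (power / 2)  # weight = 1 / distance**power
        num += w * value
        den += w
    return num / den

# Illustrative stations: (x km, y km, TSP in ug/m3)
stations = [(0.0, 0.0, 120.0), (10.0, 0.0, 80.0), (5.0, 8.0, 100.0)]
print(round(idw(stations, 5.0, 2.0), 1))
```

Evaluating `idw` over a grid of points yields the raster surface that a GIS package would render as a pollution map, with nearby stations dominating each cell's value.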
Publication Date
Tue Dec 09 2025
Journal Name
Petroleum And Coal
Extraction and Analysis of Compressional, Shear, and Stoneley Slowness, Rock Mechanical Properties, and Shear Anisotropy Using Sonic Scanner Data

The Sonic Scanner is a multifunctional instrument designed to log wells, assess elastic characteristics, and support reservoir characterisation. It also facilitates the understanding of rock mechanics, gas detection, and well positioning, while furnishing data for geomechanical computations and sand management. The present work applied the Sonic Scanner to both basic and advanced processing for an oil well penetrating carbonate media. The study aimed to characterise the compressional, shear, and Stoneley slowness, the rock mechanical properties, and the shear anisotropy of the formation. Except for intervals where significant washouts are encountered, the data quality of the Monopole, Dipole, and Stoneley modes is gen…
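The link from slownesses to rock mechanical properties follows standard dynamic-moduli relations; the sketch below uses the textbook isotropic formulas with illustrative slowness and density values, not data from the studied well:

```python
def dynamic_moduli(dtc_us_ft, dts_us_ft, rho_g_cc):
    """Dynamic elastic moduli from compressional/shear slowness and density.

    dtc_us_ft, dts_us_ft: slowness in microseconds per foot.
    rho_g_cc: bulk density in g/cc.
    """
    vp = 304800.0 / dtc_us_ft   # m/s (1 ft = 0.3048 m, slowness in us/ft)
    vs = 304800.0 / dts_us_ft   # m/s
    rho = rho_g_cc * 1000.0     # kg/m3
    g = rho * vs ** 2                                          # shear modulus, Pa
    nu = (vp ** 2 - 2 * vs ** 2) / (2 * (vp ** 2 - vs ** 2))   # Poisson's ratio
    e = 2 * g * (1 + nu)                                       # Young's modulus, Pa
    k = rho * (vp ** 2 - 4 * vs ** 2 / 3)                      # bulk modulus, Pa
    return {"G_GPa": g / 1e9, "E_GPa": e / 1e9, "K_GPa": k / 1e9, "nu": nu}

# Illustrative carbonate values: DTC = 60 us/ft, DTS = 110 us/ft, rho = 2.6 g/cc
m = dynamic_moduli(dtc_us_ft=60.0, dts_us_ft=110.0, rho_g_cc=2.6)
print({key: round(v, 2) for key, v in m.items()})
```

These are the dynamic (wave-derived) moduli; static moduli from core tests are typically lower and are calibrated against them in geomechanical studies.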
Publication Date
Fri Jan 01 2021
Journal Name
International Journal Of Agricultural And Statistical Sciences
DYNAMIC MODELING FOR DISCRETE SURVIVAL DATA BY USING ARTIFICIAL NEURAL NETWORKS AND ITERATIVELY WEIGHTED KALMAN FILTER SMOOTHING WITH COMPARISON

Survival analysis is widely applied to data describing the lifetime of an item until the occurrence of an event of interest, such as death or another event under study. The purpose of this paper is to use a dynamic approach within the deep learning neural network method, in which a dynamic neural network is built that suits the nature of discrete survival data and time-varying effects. This neural network is trained using the Levenberg-Marquardt (L-M) algorithm, and the method is called the Proposed Dynamic Artificial Neural Network (PDANN). A comparison was then made with another method that depends entirely on the Bayesian methodology, called the Maximum A Posteriori (MAP) method. This method was carried out using numerical algorithms re…
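For the discrete survival setting, the basic quantity is the discrete-time hazard; the life-table estimator below is a minimal stand-in for illustration, with made-up event and censoring times (the paper's PDANN and MAP methods are not reproduced here):

```python
def discrete_hazard(event_times, censor_times, horizon):
    """Life-table estimates of the discrete hazard and survival curves.

    h[t] = events at time t / subjects still at risk entering time t,
    S[t] = product over u <= t of (1 - h[u]).
    """
    hazards, survival = [], []
    surv = 1.0
    at_risk = len(event_times) + len(censor_times)
    for t in range(1, horizon + 1):
        d = event_times.count(t)                  # events observed at t
        h = d / at_risk if at_risk else 0.0
        hazards.append(h)
        surv *= (1 - h)
        survival.append(surv)
        at_risk -= d + censor_times.count(t)      # remove events and censored
    return hazards, survival

events = [1, 2, 2, 4]   # observed event times (made up)
censored = [3, 4]       # censoring times (made up)
h, s = discrete_hazard(events, censored, horizon=4)
print([round(x, 3) for x in h])  # [0.167, 0.4, 0.0, 0.5]
print([round(x, 3) for x in s])  # [0.833, 0.5, 0.5, 0.25]
```

A discrete-survival neural network replaces the raw counts with a model h(t | covariates), which is what allows time-varying effects to enter the hazard.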
Publication Date
Mon Jun 01 2020
Journal Name
Journal Of Engineering
An Accurate Estimation of Shear Wave Velocity Using Well Logging Data for Khasib Carbonate Reservoir - Amara Oil Field

Shear and compressional wave velocities, coupled with other petrophysical data, are vital in determining the magnitudes of the dynamic moduli in geomechanical studies and in hydrocarbon reservoir characterization. However, due to field practices and high running costs, the shear wave velocity may not be available in all wells. In this paper, a statistical multivariate regression method is presented to predict the shear wave velocity for the Khasib formation in the Amara oil field, located in the south-east of Iraq, using well-log compressional wave velocity, neutron porosity, and density. The accuracy of the proposed correlation has been compared to that of other correlations. The results show that the presented model provides accurate…
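The statistical multivariate regression can be sketched as ordinary least squares on the three predictor logs; the implementation below solves the normal equations directly, and the log values are synthetic, not Khasib field data:

```python
def fit(X, y):
    """Least-squares coefficients [b0, b1, ...] via the normal equations."""
    A = [[1.0] + row for row in X]          # prepend intercept column
    n, p = len(A), len(A[0])
    # Build (A^T A) beta = A^T y.
    ata = [[sum(A[k][i] * A[k][j] for k in range(n)) for j in range(p)]
           for i in range(p)]
    aty = [sum(A[k][i] * y[k] for k in range(n)) for i in range(p)]
    # Gaussian elimination with partial pivoting.
    for col in range(p):
        piv = max(range(col, p), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        aty[col], aty[piv] = aty[piv], aty[col]
        for r in range(col + 1, p):
            f = ata[r][col] / ata[col][col]
            for c in range(col, p):
                ata[r][c] -= f * ata[col][c]
            aty[r] -= f * aty[col]
    beta = [0.0] * p
    for i in range(p - 1, -1, -1):          # back substitution
        beta[i] = (aty[i] - sum(ata[i][j] * beta[j]
                                for j in range(i + 1, p))) / ata[i][i]
    return beta

# Synthetic logs: [Vp (km/s), neutron porosity (frac), density (g/cc)] -> Vs (km/s)
X = [[4.5, 0.12, 2.55], [5.0, 0.08, 2.62], [4.2, 0.18, 2.48],
     [5.3, 0.05, 2.68], [4.8, 0.10, 2.58]]
y = [2.4, 2.7, 2.2, 2.9, 2.55]
beta = fit(X, y)
pred = beta[0] + sum(b * v for b, v in zip(beta[1:], X[0]))
print(round(pred, 2))
```

In practice the fitted correlation is then benchmarked against published Vp-Vs correlations on wells where dipole shear logs do exist.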
Publication Date
Tue Nov 01 2016
Journal Name
Iosr Journal Of Computer Engineering
Implementation of new Secure Mechanism for Data Deduplication in Hybrid Cloud

Cloud computing provides a huge amount of storage space for data, but with the increasing number of users and the growing size of their data, the cloud storage environment faces serious problems such as saving storage space, managing this large volume of data, and preserving the security and privacy of the data. One of the most important methods of saving space in cloud storage is data deduplication, a compression technique that keeps only one copy of the data and eliminates the extra copies. To offer security and privacy for sensitive data while supporting deduplication, this work identifies attacks that exploit hybrid cloud deduplication, allowing an attacker to gain access to the files of other users based on very small hash signatures of…
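The deduplication idea, and the hash-signature weakness such attacks exploit, can be sketched with a content-addressed store; the class and ownership scheme below are illustrative assumptions, not the paper's mechanism:

```python
import hashlib

class DedupStore:
    """Stores each unique content exactly once, keyed by its SHA-256 digest."""

    def __init__(self):
        self.blocks = {}   # digest -> content (stored once)
        self.owners = {}   # digest -> set of user ids allowed to read it

    def upload(self, user, data: bytes):
        digest = hashlib.sha256(data).hexdigest()
        if digest not in self.blocks:
            self.blocks[digest] = data        # first copy is actually stored
        # Later uploads of identical content are deduplicated: only
        # ownership is recorded. If ownership were granted on the digest
        # alone, knowing a file's hash would suffice to claim the file --
        # the kind of exposure the identified attacks rely on.
        self.owners.setdefault(digest, set()).add(user)
        return digest

    def download(self, user, digest):
        if user not in self.owners.get(digest, set()):
            raise PermissionError("user does not own this file")
        return self.blocks[digest]

store = DedupStore()
d1 = store.upload("alice", b"report.pdf contents")
d2 = store.upload("bob", b"report.pdf contents")   # deduplicated
print(d1 == d2, len(store.blocks))  # True 1
```

Defenses in the literature add a proof-of-ownership step, requiring the uploader to demonstrate possession of the full file rather than just its short hash.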