Study of contrast between satellite image data and ground data

A SPOT panchromatic satellite image was employed to study the difference between ground and satellite data (digital numbers, DN, with values ranging from 0 to 255). The DN values must first be converted to absolute radiance values through special equations and then to spectral reflectance values. This study monitored the environmental effect of discharging sewage pollutants (industrial and domestic) into the Tigris River water in Mosul. These pollutants mainly affect physical characteristics, especially color and turbidity, which alter the spectral reflectance of the river water and can therefore be detected by remote sensing techniques. The contaminated areas within the river water, representing the differences in reflectance values, were isolated and marked. Field measurements made with a spectrometer showed acceptable agreement with the satellite data, allowing for the time difference between the two sets of measurements. The satellite imagery analysis program ERDAS version 8.4 was used to determine the spectral reflectance values in the satellite image, and a geographic information system (ARC/INFO) was used to produce a photo map of the study area showing the specific reflectance-measurement sites, both near the pollution sources and at various other locations along the river.
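The DN-to-radiance-to-reflectance conversion described above can be sketched as follows. This is a minimal illustration of the standard conversion formulas; the gain, offset, solar irradiance (esun), and sun elevation values here are hypothetical placeholders, not SPOT's actual calibration constants:

```python
import math

def dn_to_radiance(dn, gain, offset=0.0):
    # At-sensor spectral radiance (W m^-2 sr^-1 um^-1) from a raw digital number
    return dn / gain + offset

def radiance_to_reflectance(radiance, esun, sun_elev_deg, d=1.0):
    # Top-of-atmosphere reflectance; d is the Earth-Sun distance in AU
    zenith = math.radians(90.0 - sun_elev_deg)
    return (math.pi * radiance * d ** 2) / (esun * math.cos(zenith))

# Hypothetical calibration values, for illustration only
rad = dn_to_radiance(128, gain=1.7)
rho = radiance_to_reflectance(rad, esun=1858.0, sun_elev_deg=55.0)
```

Reflectance values computed this way fall between 0 and 1, which makes images from different dates and sensors directly comparable.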

Crossref
Publication Date
Fri Jan 01 2016
Journal Name
Statistics And Its Interface
Search for risk haplotype segments with GWAS data by use of finite mixture models

Region-based association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, rather than individual variants, with the disease. Such an analysis typically involves a list of unphased multiple-locus genotypes with potentially sparse frequencies in cases and controls. To tackle the problem of the sparse distribution, a two-stage approach was proposed in the literature: in the first stage, haplotypes are computationally inferred from genotypes, followed by a haplotype coclassification; in the second stage, the association analysis is performed on the inferred haplotype groups. If a haplotype is unevenly distributed between the case and control samples, this haplotype is labeled

Scopus Clarivate Crossref
Publication Date
Sat Jan 01 2022
Journal Name
Journal Of Petroleum Science And Engineering
Performance evaluation of analytical methods in linear flow data for hydraulically-fractured gas wells

Scopus (11)
Crossref (8)
Scopus Clarivate Crossref
Publication Date
Sat Nov 22 2014
Journal Name
Indian Journal Of Physics
Comparison between shell model and self-consistent mean field calculations for ground charge density distributions and elastic form factors of 12C and 16O nuclei

Scopus (4)
Crossref (2)
Scopus Clarivate Crossref
Publication Date
Fri Oct 01 2010
Journal Name
2010 IEEE Symposium on Industrial Electronics and Applications (ISIEA)
Distributed t-way test suite data generation using exhaustive search method with map and reduce framework

Scopus (3)
Crossref (2)
Scopus Crossref
Publication Date
Wed Oct 17 2018
Journal Name
Journal Of Economics And Administrative Sciences
New Robust Estimation in Compound Exponential Weibull-Poisson Distribution for both contaminated and non-contaminated Data

Abstract

The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (non-contaminated) data, while the second assumed contaminated data. Simulation experiments were conducted for different sample sizes, different initial parameter values, and different levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
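As an illustration of the Downhill Simplex (Nelder-Mead) approach to likelihood maximization, the sketch below fits a plain two-parameter Weibull distribution rather than the paper's four-parameter compound exponential Weibull-Poisson; it assumes SciPy is available and the simulated sample is purely illustrative:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

# Simulated "non-contaminated" sample with known shape=1.5, scale=2.0
rng = np.random.default_rng(0)
data = weibull_min.rvs(c=1.5, scale=2.0, size=500, random_state=rng)

def neg_log_lik(params):
    shape, scale = params
    if shape <= 0 or scale <= 0:   # keep the simplex inside the valid region
        return np.inf
    return -np.sum(weibull_min.logpdf(data, c=shape, scale=scale))

# Downhill Simplex needs no derivatives of the likelihood, which is why it
# suits distributions whose log-likelihood is awkward to differentiate
res = minimize(neg_log_lik, x0=[1.0, 1.0], method="Nelder-Mead")
shape_hat, scale_hat = res.x
```

The same pattern extends to the four-parameter compound case by widening `params` and swapping in that distribution's log-density.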

Crossref
Publication Date
Fri Apr 14 2023
Journal Name
Journal Of Big Data
A survey on deep learning tools dealing with data scarcity: definitions, challenges, solutions, tips, and applications
Abstract
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance, yet many applications have small or inadequate data for training DL frameworks. Manual labeling is usually needed to provide labeled data, typically involving human annotators with an extensive background of knowledge. This annotation process is costly, time-consuming, and error-prone. Every DL framework is usually fed a significant amount of labeled data to learn representations automatically. Ultimately, a larger amount of data generally yields a better DL model, and performance is also application dependent. This issue is the main barrier for
Scopus (754)
Crossref (742)
Scopus Clarivate Crossref
Publication Date
Thu Mar 19 2026
Journal Name
Journal Of Petroleum Research And Studies
Improving the Reliability of Well Log Data Recorded in Oil and Gas Wells through Quality Control Approaches: A Case Study from a Southern Iraqi Field

Quality control of well logs has always been an important objective in reservoir studies because of the key role well logs play as input data. This study aims to perform quality control on well log data from two wells of the Yamama Formation in a southern Iraqi field to ensure and enhance measurement accuracy. First, the before- and after-survey calibration data are used as an initial evaluation of the quality of the density log in well R-1. Then, depth matching is used to align the depths of all logs in each well. After that, a comparison between the main and repeat sections helps to check repeatability. Finally, all uncorrected logs are environmentally corrected to remove the effects of the borehole conditions
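The depth-matching step mentioned above can be illustrated with a simple cross-correlation search for the lag that best aligns one log with a reference log. This is a generic sketch, not the paper's procedure; the synthetic logs and the `depth_shift` helper are hypothetical:

```python
import numpy as np

def depth_shift(reference, log, max_lag=20):
    """Return the sample shift (roll) that best aligns `log` with `reference`,
    found by maximizing normalized cross-correlation over candidate lags."""
    ref = (reference - reference.mean()) / reference.std()
    sig = (log - log.mean()) / log.std()
    lags = list(range(-max_lag, max_lag + 1))
    scores = [np.correlate(ref, np.roll(sig, k))[0] for k in lags]
    return lags[int(np.argmax(scores))]

# Synthetic example: the second log is the first shifted down by 5 samples
rng = np.random.default_rng(0)
ref_log = rng.normal(size=200)
shifted = np.roll(ref_log, 5)
best = depth_shift(ref_log, shifted)   # negative roll undoes the +5 shift
```

In practice the lag would be converted to a depth offset using the logging sample interval before the log curves are re-indexed.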

Crossref
Publication Date
Wed Feb 06 2013
Journal Name
Eng. & Tech. Journal
A proposal to detect computer worms (malicious codes) using data mining classification algorithms

Malicious software (malware) performs malicious functions that compromise a computer system's security. Many methods have been developed to improve the security of computer system resources, among them firewalls, encryption, and Intrusion Detection Systems (IDS). An IDS can detect a new, unrecognized attack attempt and raise an early alarm to inform the system about the suspicious intrusion attempt. This paper proposes a hybrid IDS for intrusion detection, especially malware detection, that considers both network packet and host features. The hybrid IDS is designed using Data Mining (DM) classification methods for their ability to detect new, previously unseen intrusions accurately and automatically. It uses both anomaly and misuse detection
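As a generic illustration of DM classification for intrusion detection, the sketch below trains a small decision tree on synthetic traffic features. The feature set, thresholds, and data are hypothetical and assume scikit-learn is available; they are not the paper's actual features or model:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Hypothetical features per connection: [packets/sec, bytes/packet, failed logins]
rng = np.random.default_rng(1)
normal = np.column_stack([rng.normal(50, 10, 200),
                          rng.normal(500, 80, 200),
                          rng.poisson(0.2, 200)])
malicious = np.column_stack([rng.normal(400, 60, 200),
                             rng.normal(60, 20, 200),
                             rng.poisson(5, 200)])
X = np.vstack([normal, malicious])
y = np.array([0] * 200 + [1] * 200)   # 0 = benign, 1 = malware traffic

# A shallow tree keeps the learned rules interpretable for analysts
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
acc = clf.score(X, y)
```

A misuse detector would be trained on labeled attack signatures like this, while an anomaly detector would instead model only the benign class and flag deviations.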

Publication Date
Mon May 15 2017
Journal Name
Journal Of Theoretical And Applied Information Technology
Anomaly detection in text data that represented as a graph using dbscan algorithm

Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data into a graph concept frame (CFG). As is well known, the DBSCAN method groups data points of the same kind into clusters, while points outside the clusters are treated as noise or anomalies. The DBSCAN algorithm can thus detect abnormal points that lie beyond a certain set threshold (extremes). However, not all anomalies are of that kind, abnormal and unusual or far from a specific group; there is a type of data that does not occur repeatedly but is considered abnormal relative to the known group. The analysis showed DBSCAN using the
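The noise-labeling behavior of DBSCAN that the abstract relies on can be shown with a minimal sketch. It assumes scikit-learn and operates on plain 2-D points rather than the paper's graph representation; the data and parameters are illustrative only:

```python
import numpy as np
from sklearn.cluster import DBSCAN

# One dense cluster plus two far-away points that should be flagged as noise
rng = np.random.default_rng(2)
cluster = rng.normal(0.0, 0.3, size=(50, 2))
outliers = np.array([[5.0, 5.0], [-4.0, 6.0]])
X = np.vstack([cluster, outliers])

# DBSCAN assigns the label -1 to points that are not density-reachable
labels = DBSCAN(eps=0.8, min_samples=5).fit_predict(X)
anomalies = X[labels == -1]
```

Points labeled `-1` are exactly the "external behavior of the cluster" the abstract refers to; the paper's contribution is to extend this to anomalies that are not simply far from a known group.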

Scopus (4)
Scopus