Data visualization and distinct features extraction of the comet Ison 2013

The intensity distribution of the comet Ison C/2013 is studied by taking its histogram. This distribution reveals four distinct regions related to the background, tail, coma, and nucleus. A one-dimensional temperature distribution fit is achieved using two mathematical equations related to the coordinates of the comet's centre. The quiver plot of the comet's gradient shows very clearly that the arrows point towards the maximum intensity of the comet.
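The histogram and gradient-quiver steps described above can be sketched on a synthetic image. This is an illustrative sketch only: the Gaussian "comet", its centre, the grid size, and the bin count are assumptions, not the Ison data or the paper's equations.

```python
import numpy as np

# Synthetic comet-like image: a 2-D Gaussian intensity peak ("nucleus")
# on a flat background. All values here are illustrative.
y, x = np.mgrid[0:64, 0:64]
cx, cy = 40.0, 24.0                      # assumed centre of brightness
image = 10.0 + 200.0 * np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * 8.0 ** 2))

# Histogram of intensities: in a real frame, distinct modes would map
# to the background, tail, coma, and nucleus regions.
counts, edges = np.histogram(image, bins=32)

# Gradient field: the arrows (gy, gx) point towards increasing
# intensity, i.e. towards the nucleus, as in a quiver plot.
gy, gx = np.gradient(image)

# A pixel left of the nucleus should have a positive x-gradient,
# meaning its arrow points right, towards the maximum intensity.
py, px = 24, 20
print(gx[py, px] > 0)                    # True
```

In a plotting session, `matplotlib.pyplot.quiver(gx, gy)` would render these arrows converging on the brightest pixel.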

Publication Date
Fri Apr 12 2019
Journal Name
Journal Of Economics And Administrative Sciences
Accounting Mining Data Using Neural Networks (Case study)

Business organizations have faced many challenges in recent times, the most important of which is information technology, because it is widespread and easy to use. Its use has led to an increase in the amount of data that business organizations deal with in an unprecedented manner. The amount of data available through the internet is a problem that many parties seek to find solutions for: why is it available there in such huge, random quantities? Many forecasts indicated that by 2017 the number of devices connected to the internet would be an estimated three times the population of the Earth, and that in 2015 more than one and a half billion gigabytes of data were transferred every minute globally. Thus, the so-called data mining emerged as a …
Crossref (1)
Publication Date
Mon Sep 01 2008
Journal Name
Al-khwarizmi Engineering Journal
New Adaptive Data Transmission Scheme Over HF Radio

An acceptable bit error rate can be maintained by adapting design parameters such as the modulation, symbol rate, constellation size, and transmit power to the channel state.

An estimate of HF propagation effects can be used to design an adaptive data transmission system over an HF link. The proposed system combines the well-known Automatic Link Establishment (ALE) with a variable-rate transmission system. The standard ALE is modified to suit the required goal of selecting the best carrier frequency (channel) for a given transmission. This is based on measuring the SINAD (Signal-plus-Noise-and-Distortion to Noise-and-Distortion ratio), RSL (Received Signal Level), multipath phase distortion, and BER (Bit Error Rate) …
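The channel-selection step described above can be sketched as a scoring rule over the measured link metrics. This is an illustrative sketch only: the metric names follow the abstract (SINAD, RSL, BER), but the weights, the linear score, and the example channels are assumptions, not the modified ALE procedure.

```python
# Pick the best HF carrier frequency from per-channel link measurements.
# Higher SINAD (dB) and RSL (dBm) are better; higher BER is worse.
# The weights below are illustrative, not values from the paper.

def score(ch):
    return 1.0 * ch["sinad_db"] + 0.5 * ch["rsl_dbm"] - 1000.0 * ch["ber"]

def best_channel(channels):
    """Return the channel dict with the highest combined link-quality score."""
    return max(channels, key=score)

channels = [
    {"freq_khz": 5100, "sinad_db": 12.0, "rsl_dbm": -85.0, "ber": 1e-3},
    {"freq_khz": 7250, "sinad_db": 18.0, "rsl_dbm": -80.0, "ber": 1e-4},
    {"freq_khz": 9050, "sinad_db": 15.0, "rsl_dbm": -95.0, "ber": 1e-2},
]

print(best_channel(channels)["freq_khz"])  # 7250: best SINAD/RSL, lowest BER
```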
Publication Date
Sun Mar 30 2014
Journal Name
Iraqi Journal Of Chemical And Petroleum Engineering
Estimation Liquid Permeability Using Air Permeability Laboratory Data

Permeability data are of major importance and should be handled carefully in all reservoir simulation studies. The importance of permeability data increases in mature oil and gas fields because of its sensitivity to the requirements of some specific improved-recovery methods. However, the industry has a huge store of air permeability measurements against a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation for converting the air permeability data conventionally measured during laboratory core analysis into liquid permeability. This correlation provides a feasible estimate in cases of loose and poorly consolidated formations, or in …
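The study's own correlation is truncated above and is not reproduced here. As an illustration of the same air-to-liquid conversion idea, the classic Klinkenberg correction relates gas-measured permeability to the equivalent liquid permeability via k_air = k_liq (1 + b / p_mean); the slip factor and pressures below are made-up values for the sketch, not constants fitted in the paper.

```python
# Klinkenberg-style correction: air permeability measured at a mean pore
# pressure p_mean exceeds the liquid permeability because of gas
# slippage: k_air = k_liq * (1 + b / p_mean).
# The slip factor b and pressures below are illustrative values only.

def liquid_perm_md(k_air_md: float, b_psi: float, p_mean_psi: float) -> float:
    """Invert the Klinkenberg relation to estimate liquid permeability (mD)."""
    return k_air_md / (1.0 + b_psi / p_mean_psi)

# Example: 100 mD air permeability, slip factor 20 psi, mean pressure 50 psi.
print(liquid_perm_md(100.0, 20.0, 50.0))  # 100 / 1.4, i.e. about 71.4 mD
```

As expected, the estimated liquid permeability is always below the air-measured value, and the gap shrinks as the mean pressure rises.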
Publication Date
Thu Oct 01 2020
Journal Name
Bulletin Of Electrical Engineering And Informatics
Traffic management inside software-defined data centre networking

In recent years, data centre (DC) networks have improved their rapid data-exchange abilities. Software-defined networking (SDN) has been presented as an alternative to conventional networks that segregates the control plane from the data plane. SDN overcomes the limitations of traditional DC networks caused by the rapidly growing number of apps, websites, data storage needs, etc. Software-defined networking data centres (SDN-DC), based on the OpenFlow (OF) protocol, are used to achieve superior behaviour when executing traffic load-balancing (LB) jobs. The LB function divides the traffic-flow demands among the end devices to avoid link congestion. In short, SDN is proposed to manage more operative …
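The LB function described above can be sketched as least-loaded path selection. This is an illustrative sketch only: the topology, link-load table, and flow size are assumptions, and a real OpenFlow controller would read port statistics and install flow rules rather than mutate a dict.

```python
# Least-loaded path selection, as an SDN controller's LB module might do.
# Link loads are in Mbps; paths are tuples of link names. All values
# here are illustrative.

def pick_path(paths, link_load):
    """Choose the path whose most-loaded link (the bottleneck) is lightest."""
    return min(paths, key=lambda p: max(link_load[l] for l in p))

def route_flow(paths, link_load, flow_mbps):
    """Assign a flow to the least-loaded path and update the link loads."""
    path = pick_path(paths, link_load)
    for link in path:
        link_load[link] += flow_mbps
    return path

link_load = {"s1-s2": 40, "s2-s4": 10, "s1-s3": 20, "s3-s4": 25}
paths = [("s1-s2", "s2-s4"), ("s1-s3", "s3-s4")]

# Bottlenecks are 40 Mbps vs 25 Mbps, so the second path takes the flow.
print(route_flow(paths, link_load, 15))        # ('s1-s3', 's3-s4')
print(link_load["s1-s3"], link_load["s3-s4"])  # 35 40
```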
Scopus (17)
Crossref (13)
Publication Date
Sat Sep 01 2018
Journal Name
Journal Of Accounting And Financial Studies ( Jafs )
Director's leadership features and impact on productive efficiency: Study at Iraqi general insurance company

The rapid change in the organization's external environment has led to a state of intense competition, which has increased companies' fear of losing market share and suffering losses. This has driven organizations to seek a manager with leadership traits and characteristics, given the advantages these bring in organizing production, meeting demand, reducing costs, and developing performance, so as to obtain a competitive advantage that maintains or increases the organization's market share and profits.

The study seeks to achieve a number of objectives, the most important of which is identifying the …
Crossref (1)
Publication Date
Mon Aug 01 2016
Journal Name
2016 38th Annual International Conference Of The Ieee Engineering In Medicine And Biology Society (embc)
Myoelectric feature extraction using temporal-spatial descriptors for multifunction prosthetic hand control

Scopus (11)
Crossref (9)
Publication Date
Tue Nov 01 2022
Journal Name
2022 International Conference On Data Science And Intelligent Computing (icdsic)
An improved Bi-LSTM performance using Dt-WE for implicit aspect extraction

In aspect-based sentiment analysis (ABSA), implicit aspect extraction is a fine-grained task that aims to extract the hidden aspects in the in-context meaning of online reviews. Previous methods have shown that handcrafted rules interpolated into a neural network architecture are promising for this task. In this work, we reduce the need for crafted rules, which must be laboriously articulated for each new training domain or text dataset, by proposing a new architecture based on multi-label neural learning. The key idea is to capture the semantic regularities of the explicit and implicit aspects using word-embedding vectors and to interpolate these as a front layer in a Bidirectional Long Short-Term Memory (Bi-LSTM) network. First, we …
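The front of the described architecture, an embedding layer feeding a bidirectional LSTM with independent per-label outputs, can be sketched in plain NumPy. This is an illustrative sketch only: the dimensions, the random stand-in for pretrained embeddings, and the single sigmoid output layer are assumptions, not the paper's Dt-WE model.

```python
import numpy as np

# Minimal numpy sketch: word-embedding lookup -> forward and backward
# LSTM passes -> concatenated state -> multi-label sigmoid outputs.
# All sizes and the random "pretrained" embeddings are illustrative.

rng = np.random.default_rng(0)
vocab, emb_dim, hidden, n_labels, seq_len = 50, 8, 6, 3, 5

E = rng.normal(size=(vocab, emb_dim))       # stand-in for pretrained embeddings

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_pass(xs, W, U, b):
    """Run one LSTM direction over the sequence; return the last hidden state."""
    h, c = np.zeros(hidden), np.zeros(hidden)
    for x in xs:
        z = W @ x + U @ h + b               # stacked gate pre-activations
        i, f, o, g = np.split(z, 4)         # input, forget, output, candidate
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
        h = sigmoid(o) * np.tanh(c)
    return h

def make_params():
    return (rng.normal(size=(4 * hidden, emb_dim)),
            rng.normal(size=(4 * hidden, hidden)),
            np.zeros(4 * hidden))

fwd, bwd = make_params(), make_params()
W_out = rng.normal(size=(n_labels, 2 * hidden))

tokens = rng.integers(0, vocab, size=seq_len)   # a toy review as token ids
xs = E[tokens]                                  # embedding front layer
h = np.concatenate([lstm_pass(xs, *fwd), lstm_pass(xs[::-1], *bwd)])
probs = sigmoid(W_out @ h)                      # independent per-label probabilities
print(probs.shape)                              # (3,)
```

Multi-label learning differs from softmax classification here: each of the three outputs is an independent probability, so a review can surface several implicit aspects at once.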
Scopus (7)
Crossref (5)
Publication Date
Thu Oct 01 2020
Journal Name
Ieee Transactions On Artificial Intelligence
Recursive Multi-Signal Temporal Fusions With Attention Mechanism Improves EMG Feature Extraction

Scopus (38)
Crossref (35)
Publication Date
Sat May 31 2025
Journal Name
Iraqi Journal For Computers And Informatics
Discussion on techniques of data cleaning, user identification, and session identification phases of web usage mining from 2000 to 2022

The data preprocessing step is an important step in web usage mining because of the nature of log data, which are heterogeneous, unstructured, and noisy. Given the scalability and efficiency of algorithms in pattern discovery, a preprocessing step must be applied. In this study, the sequential methodologies utilized in the preprocessing of data from web server logs, with an emphasis on sub-phases, such as session identification, user identification, and data cleansing, are comprehensively evaluated and meticulously examined.
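Of the sub-phases surveyed above, session identification is the most mechanical and can be sketched directly. This is an illustrative sketch only: the 30-minute inactivity timeout is one common heuristic from the literature, and the toy request log is an assumption; real logs also require cleaning and user identification (e.g. by IP address and user agent) first.

```python
from datetime import datetime, timedelta

# Session identification sketch: group one user's page requests into
# sessions using an inactivity timeout. The 30-minute value is a common
# heuristic; the toy log below is illustrative.

TIMEOUT = timedelta(minutes=30)

def sessionize(timestamps):
    """Split a sorted list of request times into timeout-bounded sessions."""
    sessions, current = [], []
    for t in timestamps:
        if current and t - current[-1] > TIMEOUT:
            sessions.append(current)
            current = []
        current.append(t)
    if current:
        sessions.append(current)
    return sessions

requests = [datetime(2022, 1, 1, 9, 0), datetime(2022, 1, 1, 9, 10),
            datetime(2022, 1, 1, 10, 5), datetime(2022, 1, 1, 10, 20)]
print(len(sessionize(requests)))  # 2: the 55-minute gap splits the visits
```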

Publication Date
Wed Oct 17 2018
Journal Name
Journal Of Economics And Administrative Sciences
New Robust Estimation in Compound Exponential Weibull-Poisson Distribution for both contaminated and non-contaminated Data

Abstract

The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the downhill simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes and initial parameter values, and under different levels of contamination. The downhill simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution in both the natural and the contaminated data cases.
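The downhill simplex (Nelder-Mead) idea can be illustrated on a deliberately simpler stand-in problem than the four-parameter compound distribution: maximum-likelihood estimation of a one-parameter exponential rate. The sample, the true rate, and the use of `scipy.optimize.minimize` are assumptions for this sketch; the study's four-parameter fit works the same way with a four-dimensional simplex.

```python
import numpy as np
from scipy.optimize import minimize

# Downhill simplex (Nelder-Mead) on a simple stand-in problem:
# ML estimation of an exponential rate by minimising the negative
# log-likelihood. The sample below is simulated, true rate 2.5.

rng = np.random.default_rng(1)
sample = rng.exponential(scale=1.0 / 2.5, size=2000)

def neg_loglik(theta):
    lam = theta[0]
    if lam <= 0:
        return np.inf              # keep the simplex in the valid region
    return -(len(sample) * np.log(lam) - lam * sample.sum())

res = minimize(neg_loglik, x0=[1.0], method="Nelder-Mead")
print(res.x[0])                    # close to the true rate 2.5
```

The simplex method needs no derivatives, which is what makes it attractive for compound-distribution likelihoods whose gradients are awkward to write down.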