Comparison between Rasch Model Parameters for Complete and Missing Data under Different Methods of Processing Missing Data

The current study aims to compare estimates of the Rasch model's parameters for complete and missing data under various methods of processing missing data. To achieve this aim, the researcher followed these steps: the Philip Carter test of spatial ability, consisting of (20) items, was prepared and administered to a group of (250) sixth-grade scientific-stream students in the Baghdad Al-Rusafa Education Directorates (1st, 2nd and 3rd) for the academic year (2018-2019). The one-parameter (Rasch) model was then adopted to analyze the data, and the Bilog-MG3 software was used to check the assumptions of the model and the fit of the data to it, relying on the chi-squared value for each item at the (0.05) level. The researcher then estimated the parameters of the missing data after deleting responses at a loss rate of (10%) and treated them with three methods (mean imputation, regression imputation, and maximum likelihood). The results showed that the comparison between the parameters of the complete and missing data under the three treatment methods was in favor of the parameters of the complete data, and that the likelihood method was the most suitable for treating the missing data.

     The conclusions, recommendations and suggestions have been drawn based on the findings.
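The simplest of the three treatments described above, mean imputation, can be illustrated with a minimal sketch. The response matrix below is simulated with the abstract's dimensions (250 examinees, 20 dichotomous items, roughly 10% loss); it is not the study's actual data, and the study's regression and likelihood treatments are not shown.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated binary response matrix: 250 examinees x 20 items
# (dimensions taken from the abstract; the responses are random).
responses = rng.integers(0, 2, size=(250, 20)).astype(float)

# Introduce roughly 10% missingness completely at random,
# mirroring the loss rate used in the study design.
mask = rng.random(responses.shape) < 0.10
incomplete = responses.copy()
incomplete[mask] = np.nan

# Mean imputation: replace each missing entry with its item's observed mean.
item_means = np.nanmean(incomplete, axis=0)
imputed = np.where(np.isnan(incomplete), item_means, incomplete)
```

The imputed matrix can then be passed to an IRT calibration; note that mean imputation shrinks item variance, which is one reason the study finds likelihood-based treatment preferable.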

Publication Date
Wed Dec 31 2025
Journal Name
University Of Kirkuk Journal For Administrative And Economic Science
ANOVA for Fuzzy Data with Practical Application in the Medical Field

This research studies fuzzy sets, one of the most modern concepts applied in various practical and theoretical areas and in different fields of life. It addresses the fuzzy random variable, whose values are not real numbers but fuzzy numbers, since it expresses vague or uncertain phenomena whose measurements are not exact. Fuzzy data are presented for a binocular test, and an analysis-of-variance method for fuzzy random variables is applied; this method depends on a number of assumptions, which prevents its use when those assumptions are not satisfied.
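As a minimal sketch of the kind of observation involved: a triangular fuzzy number is one common way to represent an inexact measurement (the abstract does not specify the paper's exact representation, so this is an illustration, not the authors' method).

```python
def tri_membership(x, a, b, c):
    """Membership degree of x in the triangular fuzzy number (a, b, c):
    zero outside [a, c], rising linearly to 1 at the peak b, then falling."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# An uncertain reading "about 5, somewhere between 4 and 7":
print(tri_membership(5, 4, 5, 7))  # peak -> 1.0
print(tri_membership(6, 4, 5, 7))  # halfway down the right slope -> 0.5
```

A fuzzy ANOVA then operates on such membership functions (for example via their alpha-cuts) rather than on crisp observations.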

Publication Date
Fri Apr 01 2016
Journal Name
Gis Research Uk 24th Annual Conference
Comparing Open Source Map Data in Areas Lacking Authoritative Mapping

One wide-ranging category of open-source data is that provided by geospatial information websites. Despite the advantages of such open-source data, including ease of access and freedom from cost, its quality is a potential issue. This article tests the horizontal positional accuracy, and the possible integration, of four web-derived geospatial datasets: OpenStreetMap (OSM), Google Maps, Google Earth and Wikimapia. The evaluation was achieved by comparing the tested data with reference field-survey data for fifty road intersections in Baghdad, Iraq. The results indicate that the free geospatial data can be used to enhance authoritative maps, especially small-scale maps.
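Horizontal positional accuracy of this kind is commonly summarised as the root-mean-square error of the offsets between matched point pairs. A minimal sketch, assuming projected coordinates in metres (the function name and the toy coordinates below are illustrative, not the paper's data):

```python
import math

def horizontal_rmse(reference, tested):
    """RMSE of horizontal offsets between index-matched point pairs.

    reference, tested: lists of (easting, northing) tuples in metres,
    where the i-th entry of each is the same feature (e.g. the same
    road intersection) in the two datasets.
    """
    sq = [(xr - xt) ** 2 + (yr - yt) ** 2
          for (xr, yr), (xt, yt) in zip(reference, tested)]
    return math.sqrt(sum(sq) / len(sq))

# Toy example with three matched intersections: the first tested point
# is offset by (3 m, 4 m), i.e. 5 m, the other two are exact.
ref = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]
osm = [(3.0, 4.0), (100.0, 0.0), (0.0, 100.0)]
print(horizontal_rmse(ref, osm))  # sqrt(25/3) ≈ 2.887 m
```

With fifty surveyed intersections, one such RMSE per dataset gives a directly comparable accuracy figure for OSM, Google Maps, Google Earth and Wikimapia.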

Publication Date
Wed Oct 01 2008
Journal Name
2008 First International Conference On Distributed Framework And Applications
A strategy for Grid based t-way test data generation

Publication Date
Thu Mar 17 2016
Journal Name
International Journal Of Computer Applications
Analysis of Wind Speed Data and Annual Energy Potential at Three Locations in Iraq

Publication Date
Thu Mar 29 2018
Journal Name
Construction Research Congress 2018
Validation of Time-Safety Influence Curve Using Empirical Safety and Injury Data—Poisson Regression

Publication Date
Sat Nov 22 2014
Journal Name
Indian Journal Of Physics
Comparison between shell model and self-consistent mean field calculations for ground charge density distributions and elastic form factors of 12C and 16O nuclei

Publication Date
Fri Jan 01 2021
Journal Name
International Journal Of Nonlinear Analysis And Applications
Big Data Analysis by Using the One Covariate at a Time Multiple Testing (OCMT) Method: Early School Dropout in Iraq

Publication Date
Mon Apr 11 2011
Journal Name
Icgst
Employing Neural Network and Naive Bayesian Classifier in Mining Data for Car Evaluation

In data mining, classification is a form of data analysis that can be used to extract models describing important data classes. Two well-known algorithms used in data-mining classification are the Backpropagation Neural Network (BNN) and the Naïve Bayesian (NB) classifier. This paper investigates the performance of these two classification methods using the Car Evaluation dataset. A model was built for each algorithm and the results were compared. Our experimental results indicated that the BNN classifier yields higher accuracy than the NB classifier but is less efficient, since it is time-consuming and difficult to analyze due to its black-box implementation.
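The Naïve Bayes side of such a comparison can be sketched from scratch for categorical features like those of the Car Evaluation data. The rows below are illustrative car-evaluation-style records (two features, two classes), not the UCI dataset used in the paper:

```python
import math
from collections import Counter, defaultdict

def fit_nb(X, y, alpha=1.0):
    """Categorical Naive Bayes: count class priors and, within each class,
    how often each feature takes each value (alpha = Laplace smoothing)."""
    priors = Counter(y)
    counts = defaultdict(Counter)   # (feature_idx, class) -> value counts
    vocab = defaultdict(set)        # feature_idx -> distinct values seen
    for row, label in zip(X, y):
        for i, v in enumerate(row):
            counts[(i, label)][v] += 1
            vocab[i].add(v)
    return priors, counts, vocab, len(y), alpha

def predict_nb(model, row):
    """Return the class with the highest log-posterior for one row."""
    priors, counts, vocab, n, alpha = model
    best, best_lp = None, -math.inf
    for label, c in priors.items():
        lp = math.log(c / n)                       # log prior
        for i, v in enumerate(row):                # + per-feature log likelihood
            fc = counts[(i, label)]
            lp += math.log((fc[v] + alpha) / (c + alpha * len(vocab[i])))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

# Toy (buying, safety) -> acceptability records, illustrative only.
X = [("high", "low"), ("high", "high"), ("low", "high"),
     ("low", "low"), ("med", "high"), ("med", "low")]
y = ["unacc", "acc", "acc", "unacc", "acc", "unacc"]
model = fit_nb(X, y)
print(predict_nb(model, ("low", "high")))  # -> acc
```

NB trains in a single counting pass, which is the efficiency edge the abstract contrasts with the iterative, harder-to-interpret training of the backpropagation network.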

Publication Date
Wed Jul 01 2020
Journal Name
Indonesian Journal Of Electrical Engineering And Computer Science
Fast and robust approach for data security in communication channel using pascal matrix

This paper presents a fast and robust approach to English text encryption and decryption based on the Pascal matrix. The technique encrypts Arabic or English text, or both, and shows the result of applying the method to plain text (the original message): the intelligible plain text is transformed into unintelligible cipher text in order to secure information from unauthorized access and theft. The encryption scheme uses a pseudo-random encryption key generated by an algorithm, all of which is done using the Pascal matrix. Encryption and decryption were implemented in MATLAB, and Notepad++ was used to write the input text.
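The general idea can be sketched by block-encoding character codes with a lower-triangular Pascal matrix, whose determinant is 1 and whose inverse is therefore exactly integer-valued. This sketch is in Python rather than the paper's MATLAB, and is an illustration of the matrix mechanics only, not necessarily the authors' exact scheme (in particular, it omits the pseudo-random key the paper mentions):

```python
import math

def pascal_lower(n):
    """Lower-triangular Pascal matrix: L[i][j] = C(i, j)."""
    return [[math.comb(i, j) for j in range(n)] for i in range(n)]

def pascal_lower_inv(n):
    """Closed-form inverse: det(L) = 1 and L^-1[i][j] = (-1)^(i+j) C(i, j)."""
    return [[(-1) ** (i + j) * math.comb(i, j) for j in range(n)]
            for i in range(n)]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def encrypt(text, n=4):
    """Zero-pad the character codes to blocks of n and multiply each by L."""
    codes = [ord(c) for c in text]
    codes += [0] * (-len(codes) % n)
    L = pascal_lower(n)
    out = []
    for k in range(0, len(codes), n):
        out += matvec(L, codes[k:k + n])
    return out

def decrypt(cipher, n=4):
    """Multiply each block by L^-1 and drop the zero padding."""
    Linv = pascal_lower_inv(n)
    codes = []
    for k in range(0, len(cipher), n):
        codes += matvec(Linv, cipher[k:k + n])
    return "".join(chr(c) for c in codes if c != 0)

print(encrypt("Hello"))
print(decrypt(encrypt("Hello")))  # -> Hello
```

Because the inverse is exact and integer-valued, decryption introduces no rounding error; note, though, that a fixed public matrix alone provides no secrecy, which is why a scheme like the paper's pairs it with a pseudo-random key.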
