A study on predicting crime rates through machine learning and data mining using text
Crime is a threat to any nation's security administration and jurisdiction. Crime analysis therefore becomes increasingly important, because it assigns the time and place of incidents based on collected spatial and temporal data. Older techniques, such as paperwork, investigative judges, and simple statistical analysis, are not efficient enough to predict accurately when and where a crime will take place, whereas deploying machine learning and data mining methods in crime analysis has dramatically increased analysis and prediction accuracy. In this study, various types of crime analysis and prediction using several machine learning and data mining techniques are surveyed and compared on the accuracy reported in previous work, with the aim of producing a concise review of how these algorithms are used in crime prediction. The review is intended to introduce such techniques to crime researchers and to support future research by presenting crime definitions, the challenges and classifications of prediction systems, and a comparative study. The literature shows that supervised learning approaches are used in more crime-prediction studies than other approaches, and that Logistic Regression is the most powerful method for predicting crime.
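Since the abstract singles out Logistic Regression as the strongest performer, a minimal sketch of how such a classifier might be trained is shown below. The file name, feature columns, and label are hypothetical placeholders, not the data used in any of the surveyed studies.

import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Hypothetical tabular crime dataset with spatial/temporal features.
df = pd.read_csv("crime_records.csv")
X = df[["hour", "day_of_week", "latitude", "longitude"]]
y = df["crime_occurred"]  # binary label: did a crime occur in this cell/time slot

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))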
Publication Date
Tue Dec 01 2020
Journal Name
Baghdad Science Journal
A Competitive Study Using UV and Ozone with H2O2 in Treatment of Oily Wastewater

In this study, ultraviolet (UV) and ozone techniques with a hydrogen peroxide oxidant were used to treat wastewater produced by the South Baghdad Power Station, using a lab-scale system. The UV/H2O2 experiments showed that the optimum exposure time was 80 min, at which the highest removal percentages of oil, COD, and TOC were 84.69 %, 56.33 %, and 50 %, respectively. The effect of pH on contaminant removal was studied over the range 2-12; the best oil, COD, and TOC removal percentages with H2O2/UV (69.38 %, 70 %, and 52 %) were obtained at pH = 12. The H2O2/ozone experiments exhibited better performance compared to …

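For reference, the removal percentages quoted above follow the usual efficiency formula; a tiny illustrative calculation, with made-up influent/effluent concentrations rather than the paper's raw measurements, is:

def removal_percent(influent: float, effluent: float) -> float:
    # Removal efficiency = (C_in - C_out) / C_in * 100
    return (influent - effluent) / influent * 100

# e.g. a COD reading falling from 300 to 131 mg/L gives about 56.3 % removal,
# matching the form of the COD figure quoted above (concentrations here are illustrative).
print(removal_percent(300.0, 131.0))  # ~56.3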
Publication Date
Thu Jun 20 2019
Journal Name
Baghdad Science Journal
An Optimised Method for Fetching and Transforming Survey Data based on SQL and R Programming Language

The development of information systems in recent years has contributed various methods of gathering information to evaluate IS performance. The most common approach used to collect information is the survey system. This method, however, suffers one major drawback: decision makers spend considerable time transforming data from survey sheets into analytical programs. This paper therefore proposes a method called the 'survey algorithm based on R programming language', or SABR, for transforming data from survey sheets inside R environments by treating the arrangement of data as a relational format. R and the relational data format provide an excellent opportunity to manage and analyse the accumulated data. Moreover, a survey system …

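SABR itself is implemented in R with a relational arrangement of the survey data; as a rough, language-agnostic illustration of the same wide-to-relational transformation, here is a Python sketch with hypothetical file and column names (not the paper's code):

import sqlite3
import pandas as pd

# Hypothetical survey export: one row per respondent, one column per question.
wide = pd.read_excel("survey_sheet.xlsx")

# Melt to a relational (long) layout: one row per respondent/question pair.
long = wide.melt(id_vars=["respondent_id"], var_name="question", value_name="answer")

with sqlite3.connect("survey.db") as conn:
    long.to_sql("responses", conn, if_exists="replace", index=False)
    # Downstream analysis can now query the relational table directly with SQL.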
Publication Date
Fri Jan 01 2021
Journal Name
Int. J. Agricult.
FORECASTING THE EXCHANGE RATES OF THE US DOLLAR AGAINST THE IRAQI DINAR USING THE BOX-JENKINS METHODOLOGY IN TIME SERIES WITH PRACTICAL APPLICATION

The goal of the study is to discover the best model for forecasting the exchange rate of the US dollar against the Iraqi dinar by analyzing the time series with the Box-Jenkins approach, one of the most significant methods in the statistical sciences for such analysis. The dollar exchange rate is one of the most important indicators of the relative health of a country's economy and is among the measures most closely watched, analyzed, and manipulated by the government. Several factors affect the exchange rate, the most important of which are the money supply, interest rates, local inflation, and the global balance of payments. The data for the research, representing the exchange r…

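As a minimal sketch of a Box-Jenkins fit, the snippet below estimates an ARIMA model and produces a 12-step forecast; the ARIMA(1,1,1) order and the CSV of monthly USD/IQD rates are assumptions for illustration, not the model identified in the study.

import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical monthly USD/IQD exchange-rate series.
rates = pd.read_csv("usd_iqd_monthly.csv", index_col="date", parse_dates=True)["rate"]

res = ARIMA(rates, order=(1, 1, 1)).fit()  # estimation step of Box-Jenkins
print(res.summary())                       # diagnostics: AIC, residual checks
print(res.forecast(steps=12))              # 12-month-ahead forecast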
Publication Date
Sun Jan 01 2017
Journal Name
Iraqi Journal Of Science
Strong Triple Data Encryption Standard Algorithm using Nth Degree Truncated Polynomial Ring Unit

Cryptography is the process of transforming messages to prevent unauthorized access to data. In secret-key algorithms, the key is both one of the main problems and an essential component: for a higher level of secure communication the key plays an important role, and both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weak because of its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong; a stronger encryption key enhances the security of the Triple Data Encryption Standard. This paper proposes a combination of two efficient encryption algorithms to …

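A minimal sketch of Triple DES encryption with the pycryptodome library is shown below; the key here is simply random bytes standing in for the paper's stronger, NTRU-based key material, which is not reproduced.

from Crypto.Cipher import DES3
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

def make_3des_key() -> bytes:
    # Placeholder for the paper's improved key generation: draw 24 random bytes
    # and fix the DES parity bits, rejecting keys that degenerate to 1- or 2-key DES.
    while True:
        try:
            return DES3.adjust_key_parity(get_random_bytes(24))
        except ValueError:
            continue

key = make_3des_key()
iv = get_random_bytes(8)  # DES block size is 8 bytes

cipher = DES3.new(key, DES3.MODE_CBC, iv)
ciphertext = cipher.encrypt(pad(b"secret message", DES3.block_size))

plain = unpad(DES3.new(key, DES3.MODE_CBC, iv).decrypt(ciphertext), DES3.block_size)
assert plain == b"secret message"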
Publication Date
Sat Dec 30 2023
Journal Name
Journal Of Economics And Administrative Sciences
The Cluster Analysis by Using Nonparametric Cubic B-Spline Modeling for Longitudinal Data

Longitudinal data is becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.

In this research, the focus was on grouping and analyzing such data: cluster analysis plays an important role in identifying and grouping co-expressed profiles over time, and these are fitted with the nonparametric smoothing cubic B-spline model, which provides continuous first and second derivatives and therefore a smoother curve with fewer abrupt changes in slope. The model is also more flexible and can capture more complex patterns and fluctuations in the data. A toy sketch of this smoothing-then-clustering idea is given after the abstract.

The balanced longitudinal data profiles were compiled into subgroup…

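In the sketch below, each subject's trajectory is smoothed with a cubic B-spline and the smoothed curves are grouped with k-means. The simulated trajectories and the smoothing parameter are illustrative assumptions, not the paper's data or tuning.

import numpy as np
from scipy.interpolate import splrep, splev
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 20)  # common observation times (balanced design)
subjects = [np.sin(2 * np.pi * t) + rng.normal(0, 0.2, t.size) for _ in range(10)] + \
           [np.cos(2 * np.pi * t) + rng.normal(0, 0.2, t.size) for _ in range(10)]

# Fit a cubic (k=3) smoothing B-spline to each trajectory and evaluate on a grid.
grid = np.linspace(0, 1, 50)
smoothed = np.array([splev(grid, splrep(t, y, k=3, s=0.5)) for y in subjects])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(smoothed)
print(labels)  # subjects grouped by the shape of their smoothed profile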
Publication Date
Fri Mar 01 2019
Journal Name
Spatial Statistics
Efficient Bayesian modeling of large lattice data using spectral properties of Laplacian matrix

Spatial data observed on a group of areal units is common in scientific applications. The usual hierarchical approach for modeling this kind of dataset is to introduce a spatial random effect with an autoregressive prior. However, the usual Markov chain Monte Carlo scheme for this hierarchical framework requires the spatial effects to be sampled from their full conditional posteriors one by one, resulting in poor mixing. More importantly, it makes the model computationally inefficient for datasets with a large number of units. In this article, we propose a Bayesian approach that uses the spectral structure of the adjacency matrix to construct a low-rank expansion for modeling spatial dependence. We propose a pair of computationally efficient estimation …

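The low-rank construction can be pictured with a toy example: the smoothest eigenvectors of the graph Laplacian of a small lattice act as spatial basis functions. This sketch only illustrates such a basis; the paper's estimation procedures and rank choice are not reproduced here.

import numpy as np

n = 10                         # 10 x 10 lattice -> 100 areal units
idx = lambda i, j: i * n + j
W = np.zeros((n * n, n * n))   # rook-neighbour adjacency matrix
for i in range(n):
    for j in range(n):
        for di, dj in ((1, 0), (0, 1)):
            if i + di < n and j + dj < n:
                W[idx(i, j), idx(i + di, j + dj)] = 1
                W[idx(i + di, j + dj), idx(i, j)] = 1

L = np.diag(W.sum(axis=1)) - W          # graph Laplacian D - W
eigvals, eigvecs = np.linalg.eigh(L)    # eigenvalues in ascending order
basis = eigvecs[:, 1:16]                # 15 smoothest non-constant eigenvectors
print(basis.shape)                      # (100, 15): a low-rank spatial basis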
Publication Date
Sun May 11 2025
Journal Name
Iraqi Statisticians Journal
Estimating General Linear Regression Model of Big Data by Using Multiple Test Technique

Publication Date
Sun Mar 01 2015
Journal Name
Journal Of Engineering
Multi-Sites Multi-Variables Forecasting Model for Hydrological Data using Genetic Algorithm Modeling

A two-time-step stochastic multi-variable, multi-site hydrological data forecasting model was developed and verified using a case study. The philosophy of this model is to use the cross-variable correlations, cross-site correlations, and two-step time-lag correlations simultaneously to estimate the model parameters, which are then modified using the mutation process of a genetic algorithm optimization model. The objective function to be minimized is the Akaike information criterion (AIC) value. The case study covers four variables and three sites: the variables are the monthly air temperature, humidity, precipitation, and evaporation, and the sites are Sulaimania, Chwarta, and Penjwin, located in northern Iraq. The model performance was …

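The mutate-and-keep-if-better step can be sketched on a much simpler problem than the paper's multi-site, multi-variable model: below, a parameter vector of a plain linear predictor is repeatedly mutated and accepted only when it lowers the AIC on simulated data. Everything here (data, step size, iteration count) is an illustrative assumption.

import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 4))                  # e.g. temperature, humidity, precipitation, evaporation
true_beta = np.array([0.8, -0.5, 0.3, 0.1])
y = X @ true_beta + rng.normal(0, 0.5, 120)

def aic(beta):
    rss = np.sum((y - X @ beta) ** 2)
    n, k = len(y), len(beta)
    return n * np.log(rss / n) + 2 * k         # Akaike information criterion

beta = np.zeros(4)
for _ in range(2000):                          # mutation: Gaussian perturbation, keep improvements
    candidate = beta + rng.normal(0, 0.05, 4)
    if aic(candidate) < aic(beta):
        beta = candidate
print(beta, aic(beta))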
Publication Date
Sat Aug 01 2015
Journal Name
Journal Of Engineering
Analytical Approach for Load Capacity of Large Diameter Bored Piles Using Field Data

An analytical approach based on field data was used to determine the strength capacity of large-diameter bored piles, and the deformations and settlements were evaluated for both vertical and lateral loadings. The analytical predictions were compared with field data obtained from a prototype test pile used at the Tharthar-Tigris Canal Bridge and were found to agree within an acceptable deviation of 12 %.

Following ASTM standard D1143M-07e1 (2010), a test schedule of five loading cycles was proposed for vertical loads, together with a series of cyclic loads to simulate horizontal loading. The load test results and analytical data of 1.95 …

Publication Date
Fri Feb 04 2022
Journal Name
Neuroquantology
Detecting Damaged Buildings on Post-Hurricane Satellite Imagery based on Transfer Learning

In this article, a Convolutional Neural Network (CNN) is used to detect damage and no-damage images from satellite imagery using different classifiers. These classifiers are well-known models used with the CNN to detect and classify images on a specific dataset; the dataset belongs to the Houston hurricane, which caused extensive damage in the nearby areas. In addition, transfer learning is used to store the knowledge (weights) and reuse it in the next task. Each applied classifier detects images from the dataset after it is split into training, testing, and validation sets, and the Keras library is used to apply the CNN algorithm with each selected classifier. Furthermore, the performa…

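A minimal Keras sketch of the transfer-learning setup described above follows; the choice of VGG16 as the frozen base, the input size, and the dataset layout are assumptions for illustration rather than the exact configuration evaluated in the article.

import tensorflow as tf
from tensorflow.keras import layers, models

# Pretrained base whose ImageNet weights are stored and reused (transfer learning).
base = tf.keras.applications.VGG16(weights="imagenet", include_top=False,
                                   input_shape=(128, 128, 3))
base.trainable = False

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # damage vs. no-damage
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# train_ds / val_ds would come from splitting the hurricane imagery into
# training, validation, and testing sets, e.g. with image_dataset_from_directory.
# model.fit(train_ds, validation_data=val_ds, epochs=10)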