A Crime Data Analysis of Prediction Based on Classification Approaches

Crime is an unlawful activity of any kind and is punishable by law. Crimes affect a society's quality of life and economic development. With a large rise in crime globally, there is a need to analyze crime data in order to bring down the crime rate. Such analysis encourages the police and the public to take the required measures and restrict crime more effectively. The purpose of this research is to develop predictive models that can aid in crime pattern analysis and thus support the Boston police department's crime prevention efforts. The geographical location factor has been adopted in our model because it is influential in several situations, whether traveling to a specific area or living in it, and it helps people distinguish between secure and insecure environments. Geo-location, combined with new approaches and techniques, can be extremely useful in crime investigation. The aim is a comparative study of three supervised learning algorithms, each trained and tested on the same data set. Decision Tree, Naïve Bayes, and Logistic Regression classifiers have been applied to the Boston city crime dataset to predict the type of crime that happens in an area. The outputs of these methods are compared to find the model that best fits this type of data with the best performance. From the results obtained, the Decision Tree demonstrated the best performance compared to Naïve Bayes and Logistic Regression.
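
As an illustration of the comparison described above, the following is a minimal sketch assuming a Python/scikit-learn workflow and a hypothetical CSV export of the Boston crime data; the file name, feature columns, and target column are assumptions for illustration, not the authors' actual pipeline.

# Minimal sketch (assumed columns and file name): train and compare the three
# classifiers on a hypothetical export of the Boston crime dataset.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import OrdinalEncoder
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

df = pd.read_csv("boston_crime.csv")                      # assumed file name
features = ["DISTRICT", "HOUR", "DAY_OF_WEEK", "MONTH"]   # assumed columns
target = "OFFENSE_CODE_GROUP"                             # assumed target
df = df.dropna(subset=features + [target])

# Encode categorical features as integers so all three classifiers accept them.
X = OrdinalEncoder().fit_transform(df[features].astype(str))
y = df[target]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=42)

models = {
    "Decision Tree": DecisionTreeClassifier(random_state=42),
    "Naive Bayes": GaussianNB(),
    "Logistic Regression": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(name, "accuracy:", round(accuracy_score(y_test, model.predict(X_test)), 3))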

Publication Date
Tue Feb 01 2011
Journal Name
IOP Conference Series: Materials Science And Engineering
Contour extraction of echocardiographic images based on pre-processing

In this work we present a technique to extract heart contours from noisy echocardiograph images. Our technique is based on improving the image before applying contour detection, in order to reduce heavy noise and obtain better image quality. To do this, we combine several pre-processing techniques (filtering, morphological operations, and contrast adjustment) to avoid unclear edges and enhance the low contrast of echocardiograph images. After implementing these techniques, heart boundaries and valve movement can be detected legibly with traditional edge detection methods.
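
A minimal sketch of such a pre-processing chain is shown below, assuming OpenCV in Python; the specific filter, kernel sizes, and Canny thresholds are illustrative assumptions rather than the parameters used in the paper.

# Minimal sketch (assumed parameters): denoise, enhance contrast, clean up with
# morphology, then run a traditional edge detector on the improved image.
import cv2

img = cv2.imread("echo_frame.png", cv2.IMREAD_GRAYSCALE)      # assumed file name

# 1. Filtering: a median filter suppresses speckle-like noise.
denoised = cv2.medianBlur(img, 5)

# 2. Contrast adjustment: CLAHE boosts the low contrast of echo images.
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(denoised)

# 3. Morphological opening and closing remove small artifacts and fill gaps.
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
cleaned = cv2.morphologyEx(enhanced, cv2.MORPH_OPEN, kernel)
cleaned = cv2.morphologyEx(cleaned, cv2.MORPH_CLOSE, kernel)

# 4. Traditional edge detection on the pre-processed image.
edges = cv2.Canny(cleaned, 50, 150)
cv2.imwrite("echo_edges.png", edges)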

Publication Date
Fri May 17 2013
Journal Name
International Journal Of Computer Applications
Fast Lossless Compression of Medical Images based on Polynomial

In this paper, a fast lossless image compression method is introduced for compressing medical images. It is based on splitting the image into blocks according to their nature, using polynomial approximation to decompose the image signal, and then applying run-length coding to the residue part of the image, which represents the error caused by the polynomial approximation. Huffman coding is then applied as a last stage to encode the polynomial coefficients and the run-length codes. The test results indicate that the suggested method can lead to promising performance.
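
The pipeline can be illustrated with a minimal Python/NumPy sketch; the first-order polynomial model, the fixed block size, and the omission of the Huffman stage are simplifying assumptions, not the paper's exact algorithm.

# Minimal sketch (simplified): fit a first-order polynomial to each block, keep
# the exact integer residue so the scheme stays lossless, and run-length encode
# the residue. Huffman coding of coefficients and runs is omitted for brevity.
import numpy as np

def poly_residue(block):
    """Fit z = a + b*x + c*y to one block; return coefficients and exact residue."""
    h, w = block.shape
    yy, xx = np.mgrid[0:h, 0:w]
    A = np.column_stack([np.ones(h * w), xx.ravel(), yy.ravel()])
    coeffs, *_ = np.linalg.lstsq(A, block.ravel().astype(float), rcond=None)
    approx = np.rint(A @ coeffs).astype(np.int32).reshape(h, w)
    return coeffs, block.astype(np.int32) - approx   # residue keeps it lossless

def run_length_encode(seq):
    """Encode a 1-D sequence as (value, run length) pairs."""
    runs, prev, count = [], seq[0], 1
    for v in seq[1:]:
        if v == prev:
            count += 1
        else:
            runs.append((prev, count))
            prev, count = v, 1
    runs.append((prev, count))
    return runs

image = np.random.randint(0, 256, (64, 64), dtype=np.uint8)   # stand-in image
block = 8                                                      # assumed block size
coeffs_list, runs = [], []
for r in range(0, image.shape[0], block):
    for c in range(0, image.shape[1], block):
        coeffs, residue = poly_residue(image[r:r + block, c:c + block])
        coeffs_list.append(coeffs)
        runs.extend(run_length_encode(residue.ravel().tolist()))
print(len(runs), "runs and", len(coeffs_list), "coefficient triples to encode")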

Publication Date
Sat Jul 01 2023
Journal Name
International Journal Of Computing And Digital Systems
Human Identification Based on SIFT Features of Hand Image

Publication Date
Sun Dec 02 2012
Journal Name
Baghdad Science Journal
Synthesis and Characterization of Derivatives Based on 4,4′-Dimercaptobiphenyl

Starting from 4,4′-dimercaptobiphenyl, a variety of phenolic Schiff base (methylolic, etheric, and epoxy) derivatives have been synthesized. All proposed structures were supported by FTIR, 1H-NMR, 13C-NMR, and elemental analysis; all analyses were performed at the consultation center of Jordan University.

Publication Date
Thu Apr 25 2019
Journal Name
Engineering And Technology Journal
Improvement of Harris Algorithm Based on Gaussian Scale Space

Features are descriptions of image content and can be corners, blobs, or edges. Corners are among the most important features for describing an image, so there are many algorithms to detect them, such as Harris, FAST, and SUSAN. Harris is an efficient and accurate corner detection method; it is rotation invariant but not scale invariant. This paper presents an efficient Harris corner detector that is invariant to scale. The improvement is achieved by applying a Gaussian function with different scales to build a Gaussian scale space. The experimental results illustrate that using the Gaussian scale space is very useful in dealing with the Harris detector's weakness.
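
A minimal sketch of the multi-scale idea is given below, assuming OpenCV's standard Harris implementation; building a Gaussian scale space and keeping the strongest response per pixel is an illustrative reading of the approach, and all parameter values are assumptions.

# Minimal sketch (assumed parameters): run the Harris detector on progressively
# Gaussian-blurred copies of the image and keep the maximum response per pixel,
# which makes the detection far less sensitive to scale.
import cv2
import numpy as np

gray = np.float32(cv2.imread("input.png", cv2.IMREAD_GRAYSCALE))  # assumed file

sigmas = [1.0, 2.0, 4.0, 8.0]              # assumed Gaussian scales
best_response = np.zeros_like(gray)

for sigma in sigmas:
    # One level of the Gaussian scale space.
    blurred = cv2.GaussianBlur(gray, (0, 0), sigmaX=sigma, sigmaY=sigma)
    # Standard Harris response at this scale.
    response = cv2.cornerHarris(blurred, blockSize=2, ksize=3, k=0.04)
    best_response = np.maximum(best_response, response)

# Threshold the combined response to obtain scale-tolerant corners.
corners = best_response > 0.01 * best_response.max()
print("detected corner pixels:", int(corners.sum()))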

Publication Date
Thu Aug 29 2024
Journal Name
International Journal Of Sustainable Development And Planning
Exploring the Transformative Effects of GPS and Satellite Imagery on Urban Landscape Perceptions in Baghdad: A Mixed-Methods Analysis

Publication Date
Sun Mar 01 2009
Journal Name
Journal Of Economics And Administrative Sciences
Use the path analysis method to diagnose factors influencing the intentions of the employees of the University of Dhi Qar toward internet usage behavior

The purpose of this study is to diagnose the factors that affect the behavioral intention of University of Thi-Qar staff to use the internet. A sample of (127) internet users among the university staff was taken and analyzed using path analysis. The study concluded that there is a set of significant relationships. It was found that the exogenous variables (gender, income, perceived fun, perceived usefulness, image, and ease of use) have a significant effect on the endogenous variable (behavioral intention). The results of the analysis indicated that image comes first among users, ease of use second, and then perceived fun and perceived usefulness, with respect to the dependent variables (daily internet usage and diversity of internet usage). Implications of these results are discussed. The st…
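
As a rough illustration of path analysis on survey data of this kind, the following minimal sketch uses Python with pandas and statsmodels; the variable names, the file, and the two-equation path structure are assumptions for illustration, not the study's actual model.

# Minimal sketch (assumed variables): estimate standardized path coefficients
# for exogenous variables -> behavioral intention -> daily internet usage
# with two OLS regressions.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("survey.csv")            # assumed file, one row per respondent
exogenous = ["gender", "income", "perceived_fun",
             "perceived_usefulness", "image", "ease_of_use"]   # assumed columns

# Standardize so the coefficients are comparable path weights.
cols = exogenous + ["behavioral_intention", "daily_usage"]
z = df[cols].apply(lambda s: (s - s.mean()) / s.std())

# Path 1: exogenous variables -> behavioral intention.
m1 = sm.OLS(z["behavioral_intention"], sm.add_constant(z[exogenous])).fit()

# Path 2: behavioral intention -> daily internet usage.
m2 = sm.OLS(z["daily_usage"], sm.add_constant(z[["behavioral_intention"]])).fit()

print(m1.params.round(3))
print(m2.params.round(3))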

Publication Date
Fri Jun 28 2024
Journal Name
Arab World English Journal
Cowboy as a Symbol of Individualism: A Pragmatic Analysis of Red Dead Redemption 2

As cultures are mainly divided into collectivistic and individualistic, their members tend to emphasize, through communication, either their position as part of their group or their independence from it. This emphasis is manifested in the use of the pragmatic concepts of positive politeness and negative politeness. The present study looks into the reflection of these two cultures in Rockstar's renowned video game, Red Dead Redemption 2 (2018). It aims at identifying the two cultures as present in the game and showing their significance to its narrative. It fills a gap in the study of language used within video games as well as its cultural reflections. The study addresses the following question: What are the positive and negative…

Publication Date
Mon Mar 07 2022
Journal Name
Journal Of Educational And Psychological Researches
The Impact of Cognitive Behavior Program Based on Meichenbaum Theory in Reducing the Negative Emotional Sensitivity among the Intermediate Stage Students

The aim of this research is to construct a cognitive behavior program based on Meichenbaum's theory for reducing negative emotional sensitivity among intermediate school students. To achieve the aims of the research, two hypotheses were formulated and an experimental design with equal groups was chosen. The research population and its sample were determined. The test of negative emotional sensitivity constructed by the researcher was adopted; it contains (20) items whose validity and reliability were established by presenting it to a group of arbitrators and experts in education and psychology. An educational program was constructed based on Meichenbaum's theory. The test was applied to a sample of (60) second i…

Publication Date
Tue Feb 01 2022
Journal Name
Int. J. Nonlinear Anal. Appl.
Computer-based plagiarism detection techniques: A comparative study

Plagiarism is becoming more of a problem in academia. It is made worse by the ease with which a wide range of resources can be found on the internet, as well as the ease with which they can be copied and pasted. It is academic theft, since the perpetrator has "taken" and presented the work of others as his or her own. Manual detection of plagiarism by a human being is difficult, imprecise, and time-consuming, because it is hard for anyone to compare a piece of work against all existing material. Plagiarism is a big problem in higher education, and it can happen on any topic. Plagiarism detection has been studied in many scientific articles, and methods for recognition have been created utilizing plagiarism analysis, authorship identification, and…
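
As one common baseline among the techniques such comparative studies examine, the following minimal sketch computes TF-IDF cosine similarity between a suspect text and a small reference corpus with scikit-learn; the toy documents, the threshold, and the choice of this particular technique are illustrative assumptions, not the paper's recommended method.

# Minimal sketch (toy corpus, assumed threshold): flag a suspect document when
# its TF-IDF cosine similarity to any source document exceeds a cut-off.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

sources = [
    "Machine learning builds models that improve with experience.",
    "Plagiarism detection compares documents to find reused text.",
]
suspect = "Plagiarism detection compares documents in order to find reused text."

matrix = TfidfVectorizer(stop_words="english").fit_transform(sources + [suspect])

# Similarity of the suspect (last row) against every source document.
scores = cosine_similarity(matrix[len(sources)], matrix[:len(sources)]).ravel()
threshold = 0.5                                   # assumed cut-off
for src, score in zip(sources, scores):
    flag = "SUSPICIOUS" if score >= threshold else "ok"
    print(f"{score:.2f}  {flag}  {src}")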
