Crime is unlawful activity of any kind and is punished by law. Crime affects a society's quality of life and economic development. With the large rise in crime globally, there is a need to analyze crime data to bring down the crime rate; such analysis helps the police and the public take the required measures and restrict crime more effectively. The purpose of this research is to develop predictive models that can aid crime pattern analysis and thus support the Boston police department's crime prevention efforts. Geographical location has been adopted as a factor in our model because it is influential in several situations, whether traveling to a specific area or living in it, and it helps people distinguish between a secure and an insecure environment. Geo-location, combined with new approaches and techniques, can be extremely useful in crime investigation. The aim is a comparative study of three supervised learning algorithms, in which datasets are used to train and test each model. Three machine learning classifiers, Decision Tree, Naïve Bayes, and Logistic Regression, have been applied to the Boston city crime dataset to predict the type of crime that occurs in an area. The outputs of these methods are compared to find the model that best fits this type of data with the best performance. From the results obtained, the Decision Tree achieved the highest performance compared to Naïve Bayes and Logistic Regression.
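The comparison described above can be sketched as follows, assuming a scikit-learn workflow; a synthetic dataset stands in for the Boston crime data (the real features, e.g. geo-location and time, would replace `make_classification`), so the accuracy figures are illustrative only.

```python
# Hedged sketch: train the three classifiers named in the abstract on a
# synthetic stand-in dataset and compare held-out accuracy.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Synthetic multi-class data standing in for crime records.
X, y = make_classification(n_samples=500, n_features=6, n_informative=4,
                           n_classes=3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

models = {
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Naive Bayes": GaussianNB(),
    "Logistic Regression": LogisticRegression(max_iter=1000),
}
scores = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    scores[name] = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: {scores[name]:.3f}")
```

On the real dataset, the same loop would report which classifier best fits the data, which is how the paper arrives at its comparison.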
In this work we present a technique to extract heart contours from noisy echocardiograph images. Our technique is based on improving the image before applying contour detection, in order to reduce heavy noise and obtain better image quality. To do so, we combine several pre-processing techniques (filtering, morphological operations, and contrast adjustment) to avoid unclear edges and to enhance the low contrast of echocardiograph images. After applying these techniques, traditional edge detection methods yield legible detection of the heart boundaries and valve movement.
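One plausible version of the pre-processing chain named above (filtering, a morphological operation, then contrast adjustment) can be sketched with `scipy.ndimage`; the specific filters, kernel sizes, and ordering here are assumptions for illustration, not the authors' exact settings.

```python
# Hedged sketch of a denoise -> morphology -> contrast-stretch pipeline.
import numpy as np
from scipy import ndimage

def preprocess(image: np.ndarray) -> np.ndarray:
    # 1. Median filter to suppress speckle-like noise.
    denoised = ndimage.median_filter(image, size=3)
    # 2. Grey-level morphological opening to remove small bright artefacts.
    opened = ndimage.grey_opening(denoised, size=(3, 3))
    # 3. Linear contrast stretch to the full [0, 255] range.
    lo, hi = float(opened.min()), float(opened.max())
    stretched = (opened - lo) / max(hi - lo, 1e-9) * 255.0
    return stretched.astype(np.uint8)

# A random image stands in for an echocardiograph frame.
noisy = (np.random.rand(64, 64) * 255).astype(np.uint8)
clean = preprocess(noisy)
```

An edge detector (e.g. a Sobel or Canny operator) would then be run on `clean` rather than on the raw frame, which is the point of the paper's improvement.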
In this paper, a fast lossless image compression method is introduced for compressing medical images. It is based on splitting the image into blocks according to their nature, using polynomial approximation to decompose the image signal, and then applying run-length coding to the residue part of the image, which represents the error introduced by the polynomial approximation. Finally, Huffman coding is applied as a last stage to encode the polynomial coefficients and the run-length codes. The test results indicate that the suggested method can achieve promising performance.
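The residue-coding stage described above can be illustrated in one dimension: fit a low-degree polynomial, keep the integer residue, and run-length encode it. This is a simplified sketch under assumed parameters (degree-1 fit, integer rounding), not the paper's exact scheme, and it omits the final Huffman stage.

```python
# Hedged sketch: polynomial approximation + run-length coding of the residue.
import numpy as np

signal = np.array([10, 12, 14, 16, 18, 20, 20, 20, 20, 24], dtype=float)
x = np.arange(len(signal))

# Fit a degree-1 polynomial and keep the rounded residue (prediction error).
coeffs = np.polyfit(x, signal, deg=1)
residue = np.round(signal - np.polyval(coeffs, x)).astype(int)

def run_length_encode(values):
    """Encode a sequence as (value, run_length) pairs."""
    runs = []
    for v in values:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1
        else:
            runs.append([v, 1])
    return [tuple(r) for r in runs]

encoded = run_length_encode(residue.tolist())
```

Smooth regions produce long runs of small residues, which is why run-length coding (and a Huffman pass over the coefficients and runs) compresses well here.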
Starting from 4,4'-dimercaptobiphenyl, a variety of phenolic Schiff-base derivatives (methylolic, etheric, epoxy) have been synthesized. All proposed structures were supported by FTIR, 1H-NMR, 13C-NMR, and elemental analysis; all analyses were performed at the Center of Consultation at Jordan University.
Features are descriptions of image content and can be corners, blobs, or edges. Corners are among the most important features for describing an image, so many algorithms have been proposed to detect them, such as Harris, FAST, and SUSAN. Harris is an efficient and accurate corner detection method; it is rotation invariant but not scale invariant. This paper presents an efficient Harris corner detector that is also invariant to scale, an improvement achieved by applying the Gaussian function at different scales. The experimental results illustrate that using the Gaussian function at multiple scales is an effective way to address this weakness of Harris.
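The multi-scale idea can be sketched as follows: compute the standard Harris response after Gaussian smoothing at several sigmas, so corners that only emerge at coarser scales are still detected. The parameter values (`k`, the sigma list) are illustrative assumptions, not the paper's choices.

```python
# Hedged sketch: Harris corner response evaluated at multiple Gaussian scales.
import numpy as np
from scipy import ndimage

def harris_response(img, sigma, k=0.04):
    smoothed = ndimage.gaussian_filter(img, sigma)
    Ix = ndimage.sobel(smoothed, axis=1)   # horizontal gradient
    Iy = ndimage.sobel(smoothed, axis=0)   # vertical gradient
    # Structure-tensor entries, window-averaged with the same Gaussian.
    Sxx = ndimage.gaussian_filter(Ix * Ix, sigma)
    Syy = ndimage.gaussian_filter(Iy * Iy, sigma)
    Sxy = ndimage.gaussian_filter(Ix * Iy, sigma)
    det = Sxx * Syy - Sxy ** 2
    trace = Sxx + Syy
    return det - k * trace ** 2            # classic Harris measure

img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0                      # a bright square: four corners
responses = [harris_response(img, s) for s in (1.0, 2.0)]
```

Taking, per pixel, the maximum response over the sigma list is one simple way to combine the scales into a single scale-tolerant detector.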
The purpose of this study is to diagnose factors that affect the behavioral intention of Thi-Qar internet users. A sample of (127) internet users from the university staff was taken and analyzed using path analysis. The study concluded that there is a set of significant correlations: the exogenous variables (gender, income, perceived fun, perceived usefulness, image, and ease of use) have a significant effect on the endogenous variable (behavioral intention). The results of the analysis indicated that image comes first, ease of use second, then perceived fun and perceived usefulness, in their effect on the dependent variables (daily internet usage and diversity of internet usage). Implications of these results are discussed.
As cultures are mainly divided into collectivistic and individualistic, members tend to emphasize, through communication, either their position as part of their group or their independence from it. This emphasis is manifested in the use of the pragmatic concepts of positive politeness and negative politeness. The present study looks into the reflection of these two cultures in Rockstar's renowned video game, Red Dead Redemption 2 (2018). It aims to identify the two cultures as present in the game and to show their significance to its narrative. It fills a gap in the study of the language used within video games as well as its cultural reflections. The study addresses the following question: What are the positive and negative
The aim of this research is to construct a cognitive-behavioral program based on the theory of Meichenbaum for reducing emotional sensitivity among intermediate school students. To achieve the aims of the research, two hypotheses were formulated, an experimental design with equal groups was chosen, and the research population and its sample were determined. A test of negative emotional sensitivity, constructed by the researcher, was adopted. The test contains (20) items and proved its validity and reliability by being presented to a group of arbitrators and experts in education and psychology. An educational program was constructed based on the theory of Meichenbaum. The test was applied to a sample of (60) second i
Plagiarism is becoming more of a problem in academia. It is made worse by the ease with which a wide range of resources can be found on the internet, as well as the ease with which they can be copied and pasted. It is academic theft, since the perpetrator has "taken" and presented the work of others as his or her own. Manual detection of plagiarism by a human being is difficult, imprecise, and time-consuming, because it is difficult for anyone to compare their work against all existing data. Plagiarism is a big problem in higher education, and it can happen on any topic. Plagiarism detection has been studied in many scientific articles, and methods for recognition have been created utilizing plagiarism analysis, authorship identification, and