Crime is unlawful activity of any kind and is punished by law. Crime affects a society's quality of life and economic development. With the large rise in crime globally, there is a need to analyze crime data to bring down the crime rate. Such analysis helps the police and the public take the required measures and restrict crime more effectively. The purpose of this research is to develop predictive models that can aid in crime pattern analysis and thus support the Boston Police Department's crime prevention efforts. Geographical location has been adopted as a factor in our model because it is influential in several situations, whether traveling to a specific area or living in it, and it helps people distinguish between a secure and an insecure environment. Geo-location, combined with new approaches and techniques, can be extremely useful in crime investigation. The aim is a comparative study of three supervised learning algorithms, each trained and tested on the same data set. Three machine learning classifiers, Decision Tree, Naïve Bayes, and Logistic Regression, have been applied to the Boston city crime dataset to predict the type of crime that occurs in an area. The outputs of these methods are compared to find the model that best fits this type of data with the best performance. From the results obtained, the Decision Tree achieved the highest result compared to Naïve Bayes and Logistic Regression.
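The abstract names the three classifiers but not the implementation; below is a minimal sketch of such a comparison using scikit-learn, in which the file name, the geo/time feature columns, and the label column are hypothetical stand-ins for the Boston crime attributes.

```python
# Minimal sketch of the three-classifier comparison described above.
# The file name and feature/label columns are hypothetical; the paper
# does not specify its exact preprocessing.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

df = pd.read_csv("boston_crime.csv")           # hypothetical file
X = df[["latitude", "longitude", "hour"]]      # assumed geo/time features
y = df["offense_type"]                         # assumed crime-type label

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

models = {
    "Decision Tree": DecisionTreeClassifier(random_state=42),
    "Naive Bayes": GaussianNB(),
    "Logistic Regression": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: accuracy = {acc:.3f}")
```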
Machine Learning (ML) algorithms are increasingly being utilized in the medical field to manage and diagnose diseases, leading to improved patient treatment and disease management. Several recent studies have found that Covid-19 patients have a higher incidence of blood clots, and understanding the pathological pathways that lead to blood clot formation (thrombogenesis) is critical. Current methods of reporting thrombogenesis-related fluid dynamic metrics for patient-specific anatomies are based on computational fluid dynamics (CFD) analysis, which can take weeks to months for a single patient. In this paper, we propose an ML-based method for rapid thrombogenesis prediction in the carotid artery of Covid-19 patients. Our proposed system aims ...
Information about soil consolidation is essential in geotechnical design. Because of the time and expense involved in performing consolidation tests, equations are needed to estimate the compression index from soil index properties. Although many empirical equations relating these soil properties have been proposed, such equations may not be appropriate for local conditions. The aim of this study is to investigate the consolidation and physical properties of cohesive soil. An Artificial Neural Network (ANN) has been adopted in this investigation to predict the compression index and compression ratio from basic index properties. One hundred and ninety-five consolidation results for soils tested at different construction sites ...
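The abstract does not state the network architecture or the input variables; the following is a minimal sketch, assuming a small feed-forward network and four commonly used index properties (liquid limit, plasticity index, natural water content, initial void ratio) as inputs, with hypothetical file names.

```python
# Minimal sketch of an ANN predicting compression index (Cc) from basic
# index properties, as described above. The architecture and the four
# input features are assumptions; the paper does not list them here.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# X columns (assumed): liquid limit, plasticity index,
# natural water content, initial void ratio
X = np.loadtxt("soil_index_properties.csv", delimiter=",")  # hypothetical
y = np.loadtxt("compression_index.csv", delimiter=",")      # hypothetical

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0),
)
model.fit(X, y)
print("R^2 on training data:", model.score(X, y))
```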
The paper proposes the Teaching-Learning-Based Optimization (TLBO) algorithm to solve the 3-D packing problem in containers. The objective, which can be expressed in a mathematical model, is to optimize the space usage in a container. Besides the interaction between the teacher and the students, the algorithm also models the learning process among students in the classroom, and it needs no algorithm-specific control parameters. Thus, TLBO uses a teacher phase and a student phase as its main updating processes to find the best solution. More precisely, to validate the algorithm's effectiveness, it was implemented on three sample cases: small data which had 5 size-types of items with 12 units, and medium data which had 10 size-types of items w...
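TLBO's two update phases are simple enough to sketch on a generic continuous objective; the version below is a minimal illustration under stated assumptions (population size, bounds, and a sphere objective are placeholders), not the paper's 3-D packing encoding.

```python
# Minimal sketch of the TLBO teacher and learner (student) phases on a
# generic continuous minimization problem. All parameters here are
# illustrative assumptions, not the paper's settings.
import numpy as np

rng = np.random.default_rng(0)

def objective(x):                 # placeholder sphere objective (minimize)
    return np.sum(x ** 2)

pop, dim, iters = 20, 5, 100
lo, hi = -10.0, 10.0
X = rng.uniform(lo, hi, (pop, dim))
f = np.apply_along_axis(objective, 1, X)

for _ in range(iters):
    # Teacher phase: move learners toward the best solution (the teacher)
    teacher = X[np.argmin(f)]
    mean = X.mean(axis=0)
    Tf = rng.integers(1, 3)       # teaching factor, 1 or 2
    X_new = np.clip(X + rng.random((pop, dim)) * (teacher - Tf * mean), lo, hi)
    f_new = np.apply_along_axis(objective, 1, X_new)
    better = f_new < f
    X[better], f[better] = X_new[better], f_new[better]

    # Learner phase: each learner interacts with a random peer
    for i in range(pop):
        j = rng.integers(pop)
        if j == i:
            continue
        step = (X[i] - X[j]) if f[i] < f[j] else (X[j] - X[i])
        cand = np.clip(X[i] + rng.random(dim) * step, lo, hi)
        fc = objective(cand)
        if fc < f[i]:
            X[i], f[i] = cand, fc

print("best value found:", f.min())
```

Note that greedy acceptance in both phases (keep a move only if it improves the objective) is what lets TLBO run without algorithm-specific control parameters.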
The deficit in the public budget has become a chronic economic phenomenon in most of the world, in advanced and developing countries alike. Despite the differences among economic schools over accepting or rejecting a deficit in the public budget, the prevailing opinion is that the state must reduce the public spending that led to continuous deficits in the public budget. These deficits have increased government borrowing and taxes on income and wealth, which in turn weakened the motivation for private investment and contributed to growing inflationary stagnation, so governments have to cover the shortage of local funding sources, which has become difficult to be eq...
Time series analysis is an important statistical method adopted in the study of phenomena, practices, and events in all areas over specific time periods; predicting future values contributes a rough estimate of the status of the phenomenon under study. The study therefore aimed to adopt ARIMA models to forecast the volume of cargo handled in four ports (Umm Qasr Port, Khor Al Zubair Port, Abu Flus Port, and Maqal Port). Monthly data on the volume of cargo handled for the years 2006-2018 were collected (156 observations). The study found that the most efficient model is ARIMA(1,1,1). The volume of go...
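As a minimal sketch, the reported ARIMA(1,1,1) model could be fitted to one port's monthly series with statsmodels; the file name and column below are hypothetical, since the paper's data are not reproduced here.

```python
# Minimal sketch of fitting the ARIMA(1,1,1) model reported above to a
# monthly cargo series; file name and column are hypothetical.
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

cargo = pd.read_csv("umm_qasr_cargo.csv",           # hypothetical file
                    index_col="month", parse_dates=True)["tonnage"]
cargo = cargo.asfreq("MS")                           # monthly series, 2006-2018

model = ARIMA(cargo, order=(1, 1, 1)).fit()
print(model.summary())
print(model.forecast(steps=12))                      # one-year-ahead forecast
```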
The research paper offers a thorough semiotic analysis of tobacco-free initiative advertisements from 2021. The study delves into the process of decoding the diverse signs, symbols, and visual components integrated into these anti-smoking campaigns. The core aim of this investigation is to explore the semiotic tactics that underlie these advertisements, with particular emphasis on visual communication as a pivotal tool in shaping the public's attitudes and behaviors towards tobacco use. The research introduces a significant theoretical framework, the "Taxonomy of Image-Text Relations and Functions" theory, as proposed by Emily E. Marsh and Marilyn Dom...
The Dirichlet process is a fundamental object in nonparametric Bayesian modelling, applied to a wide range of problems in machine learning, statistics, and bioinformatics, among other fields. This flexible stochastic process models rich data structures with an unknown or evolving number of clusters, making it a valuable tool for encoding the true complexity of real-world data in computer models. Our results show that the Dirichlet process improves, both in distribution density and in signal-to-noise ratio, with larger sample size; achieves a slow decay rate to its base distribution; has improved convergence and stability; and performs best with a Gaussian base distribution, which is much better than the Gamma distribution. The performance depen...
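One standard way to make the Dirichlet process concrete is the truncated stick-breaking construction; the sketch below draws a single DP realization with the Gaussian base distribution favoured in the abstract, where the concentration parameter and truncation level are illustrative assumptions.

```python
# Minimal sketch of drawing from a Dirichlet process via truncated
# stick-breaking, with a Gaussian base distribution as in the abstract.
# The concentration alpha and truncation level are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def dp_stick_breaking(alpha=1.0, truncation=100):
    """Return atoms and weights of a truncated DP(alpha, N(0,1)) draw."""
    betas = rng.beta(1.0, alpha, size=truncation)
    # weight_k = beta_k * prod_{j<k} (1 - beta_j)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
    weights = betas * remaining
    atoms = rng.normal(0.0, 1.0, size=truncation)    # Gaussian base G0
    return atoms, weights

atoms, weights = dp_stick_breaking(alpha=2.0)
# Sample 1000 observations from the discrete random measure
samples = rng.choice(atoms, size=1000, p=weights / weights.sum())
print("distinct atoms used:", len(np.unique(samples)))
```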