Crime is an unlawful activity of any kind that is punishable by law. Crimes affect a society's quality of life and economic development. With the large rise in crime globally, there is a need to analyze crime data in order to bring down the crime rate; such analysis helps the police and the public take the required measures and restrict crime more effectively. The purpose of this research is to develop predictive models that can aid in crime pattern analysis and thus support the Boston Police Department's crime prevention efforts. Geographical location has been adopted as a factor in our model because it is influential in several situations, whether travelling to a specific area or living in it, and it helps people distinguish between a secure and an insecure environment. Geo-location, combined with new approaches and techniques, can be extremely useful in crime investigation. The aim is a comparative study of three supervised learning algorithms, in which datasets are used to train and test the models to obtain the desired results. Decision Tree, Naïve Bayes, and Logistic Regression classifiers have been applied to the Boston city crime dataset to predict the type of crime that occurs in an area. The outputs of these methods are compared to find the model that best fits this type of data with the best performance. From the results obtained, the Decision Tree achieved the highest performance compared to Naïve Bayes and Logistic Regression.
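A minimal illustrative sketch of such a comparison, not the paper's exact pipeline: the three classifiers are trained on geo-location and time features to predict the crime type. The file name and column names ("Lat", "Long", "HOUR", "OFFENSE_CODE_GROUP") are assumptions for this example.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# hypothetical CSV of Boston crime records; column names are assumptions
df = pd.read_csv("boston_crime.csv").dropna(subset=["Lat", "Long", "HOUR", "OFFENSE_CODE_GROUP"])
X = df[["Lat", "Long", "HOUR"]]      # geo-location plus time as predictors
y = df["OFFENSE_CODE_GROUP"]         # crime type to predict

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

models = {
    "Decision Tree": DecisionTreeClassifier(random_state=42),
    "Naive Bayes": GaussianNB(),
    "Logistic Regression": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(name, accuracy_score(y_test, model.predict(X_test)))
```

Comparing the three models on the same held-out split is what allows the Decision Tree's advantage reported above to be measured.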
This abstract focuses on the significance of wireless body area networks (WBANs) as a cutting-edge and self-governing technology, which has garnered substantial attention from researchers. The central challenge faced by WBANs revolves around upholding quality of service (QoS) within rapidly evolving sectors like healthcare. The intricate task of managing diverse traffic types with limited resources further compounds this challenge. Particularly in medical WBANs, the prioritization of vital data is crucial to ensure prompt delivery of critical information. Given the stringent requirements of these systems, any data loss or delays are untenable, necessitating the implementation of intelligent algorithms. These algorithms play a pivota
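One simple way to realize the traffic prioritization mentioned above is a priority queue at the WBAN node, so that critical medical packets are always served before routine ones. This is purely an illustrative sketch; the traffic classes and priority values below are assumptions, not the paper's scheme.

```python
import heapq
import itertools

PRIORITY = {"emergency": 0, "vital_signs": 1, "routine": 2}  # lower value = served first
counter = itertools.count()  # tie-breaker preserves arrival order within a class

queue = []

def enqueue(packet, traffic_class):
    heapq.heappush(queue, (PRIORITY[traffic_class], next(counter), packet))

def dequeue():
    _, _, packet = heapq.heappop(queue)
    return packet

enqueue("SpO2 sample", "routine")
enqueue("ECG alarm", "emergency")
enqueue("heart-rate sample", "vital_signs")
print([dequeue() for _ in range(3)])  # ECG alarm first, then vital signs, then routine
```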
During COVID-19, wearing a mask was mandated globally in various workplaces, departments, and offices. New deep learning convolutional neural network (CNN) based classifiers were proposed to increase the validation accuracy of face mask detection. This work introduces a face mask model that is able to recognize whether a person is wearing a mask or not. The proposed model has two stages to detect and recognize the face mask: in the first stage, the Haar cascade detector is used to detect the face, while in the second stage, a CNN built from scratch is used as the classification model. The experiment was applied to the masked faces (MAFA) dataset, with RGB images of 160x160 pixels. The model achieve
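A sketch of the two-stage pipeline described above: Haar cascade face detection followed by a small CNN classifier on 160x160 RGB crops. The CNN layer configuration is an assumption, since the abstract only states that the model is built from scratch.

```python
import cv2
import numpy as np
from tensorflow.keras import layers, models

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def build_mask_classifier():
    # layer sizes are illustrative assumptions
    model = models.Sequential([
        layers.Input(shape=(160, 160, 3)),
        layers.Conv2D(32, 3, activation="relu"), layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"), layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # mask vs. no mask
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

def classify_faces(image_bgr, classifier):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    results = []
    # stage 1: detect faces; stage 2: classify each detected face crop
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.1, 5):
        face = cv2.resize(image_bgr[y:y + h, x:x + w], (160, 160))
        face = cv2.cvtColor(face, cv2.COLOR_BGR2RGB) / 255.0
        prob = classifier.predict(face[np.newaxis], verbose=0)[0, 0]
        results.append(((x, y, w, h), "mask" if prob > 0.5 else "no mask"))
    return results
```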
In this research, a study is presented on the effect of several environmental factors on the performance of an already constructed quality inspection system, which was designed using a transfer learning approach based on convolutional neural networks. The system comprises two sets of layers: a set of layers transferred from an already trained model (DenseNet121) and a custom set of classification layers. It was designed to discriminate between damaged and undamaged helical gears according to the configuration of the gear, regardless of its dimensions, and the model showed good performance in discriminating between the two products under ideal conditions of high-resolution images.
Therefore, this study aimed at testing the system's performance under poor s
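A minimal sketch of the transfer-learning setup described above, assuming a frozen DenseNet121 feature extractor with a custom classification head for the damaged/undamaged gear classes. The head layers and input size are assumptions not given in the abstract.

```python
from tensorflow.keras import layers, models
from tensorflow.keras.applications import DenseNet121

# transferred layers: pretrained DenseNet121 without its top classifier
base = DenseNet121(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # keep the transferred layers fixed

# custom classification layers appended on top of the frozen base
model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(1, activation="sigmoid"),  # damaged vs. undamaged gear
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```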
The objective of this study is to apply an artificial neural network (ANN) to heat transfer analysis of shell-and-tube heat exchangers, which are widely used in power plants and refineries. Practical data were obtained from an industrial heat exchanger operating in the power generation department of the Dura refinery. The commonly used back propagation (BP) algorithm was used to train and test the networks, with the data divided into three subsets (training, validation and testing data) to better represent the actual case. Inputs of the neural network include inlet water temperature, inlet air temperature and mass flow rate of air. Two outputs (exit water temperature to the cooling tower and exit air temperature to the second stage of the air compressor) were predicted by the ANN.
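An illustrative sketch of a backpropagation-trained network with the stated three inputs and two outputs. The layer sizes, split ratios, and file names are assumptions; the abstract specifies only the inputs, outputs, and the three-way data split.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from tensorflow.keras import layers, models

# X columns: inlet water temperature, inlet air temperature, air mass flow rate
# y columns: exit water temperature, exit air temperature
X = np.loadtxt("heat_exchanger_inputs.csv", delimiter=",")   # hypothetical file
y = np.loadtxt("heat_exchanger_outputs.csv", delimiter=",")  # hypothetical file

# split into training, validation, and testing subsets
X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.30, random_state=1)
X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.50, random_state=1)

model = models.Sequential([
    layers.Input(shape=(3,)),
    layers.Dense(10, activation="tanh"),
    layers.Dense(2),  # both exit temperatures
])
model.compile(optimizer="adam", loss="mse")  # gradient-based backpropagation training
model.fit(X_train, y_train, validation_data=(X_val, y_val), epochs=200, verbose=0)
print("test MSE:", model.evaluate(X_test, y_test, verbose=0))
```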
Deep learning (DL) plays a significant role in several tasks, especially classification and prediction. Classification tasks can be achieved efficiently via convolutional neural networks (CNNs) with a huge dataset, while recurrent neural networks (RNNs) can perform prediction tasks due to their ability to remember time-series data. In this paper, three models have been proposed to establish an evaluation track for classification and prediction tasks over four datasets (two for each task). These models are a CNN and two RNN variants, Long Short-Term Memory (LSTM) and the Gated Recurrent Unit (GRU). Each model is applied in turn to the two mentioned tasks to draw a road map of deep learning mod
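Illustrative Keras definitions of the three model families named above. Input shapes and layer sizes are assumptions, as the abstract does not give the architectures.

```python
from tensorflow.keras import layers, models

def cnn_classifier(input_shape=(28, 28, 1), n_classes=10):
    # CNN for the classification task
    return models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"), layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(n_classes, activation="softmax"),
    ])

def lstm_forecaster(timesteps=30, n_features=1):
    # LSTM for the time-series prediction task
    return models.Sequential([
        layers.Input(shape=(timesteps, n_features)),
        layers.LSTM(64),
        layers.Dense(1),  # next-step prediction
    ])

def gru_forecaster(timesteps=30, n_features=1):
    # GRU variant for the same prediction task
    return models.Sequential([
        layers.Input(shape=(timesteps, n_features)),
        layers.GRU(64),
        layers.Dense(1),
    ])
```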
Variable selection is an essential and necessary task in the statistical modeling field. Several studies have tried to develop and standardize the process of variable selection, but it is difficult to do so. The first question a researcher needs to ask is: what are the most significant variables that should be used to describe a given dataset's response? In this paper, a new method for variable selection using Gibbs sampler techniques has been developed. First, the model is defined, and the posterior distributions for all the parameters are derived. The new variable selection method is tested using four simulation datasets. The new approach is compared with some existing techniques: Ordinary Least Squares (OLS) and the Least Absolute Shrinkage and Selection Operator (LASSO)
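A minimal sketch of Gibbs-sampler variable selection for a normal linear model with a point-mass spike-and-slab prior. The priors, hyperparameters, and this particular formulation are illustrative assumptions and not necessarily the scheme developed in the paper.

```python
import numpy as np

def gibbs_variable_selection(X, y, n_iter=2000, tau2=10.0, pi=0.5, a0=1.0, b0=1.0, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta = np.zeros(p)
    gamma = np.ones(p, dtype=bool)   # inclusion indicators
    sigma2 = np.var(y)
    gamma_draws = np.zeros((n_iter, p))

    for it in range(n_iter):
        for j in range(p):
            xj = X[:, j]
            r = y - X @ beta + xj * beta[j]            # residual excluding beta_j
            v = 1.0 / (xj @ xj / sigma2 + 1.0 / tau2)  # conditional posterior variance
            m = v * (xj @ r) / sigma2                  # conditional posterior mean
            # log odds of inclusion after integrating beta_j out of the likelihood
            log_odds = (np.log(pi) - np.log(1 - pi)
                        + 0.5 * (np.log(v) - np.log(tau2)) + 0.5 * m * m / v)
            gamma[j] = rng.random() < 1.0 / (1.0 + np.exp(-log_odds))
            beta[j] = rng.normal(m, np.sqrt(v)) if gamma[j] else 0.0
        resid = y - X @ beta
        # sample the error variance from its inverse-gamma full conditional
        sigma2 = 1.0 / rng.gamma(a0 + n / 2.0, 1.0 / (b0 + resid @ resid / 2.0))
        gamma_draws[it] = gamma

    return gamma_draws.mean(axis=0)  # posterior inclusion probability per variable
```

Variables with high posterior inclusion probability would be the ones selected, which is the quantity typically compared against OLS and LASSO selections.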
In the present work, pattern recognition is carried out using the contrast and relative variance of clouds. The K-means clustering process is then applied to classify the cloud type; texture analysis is also adopted to extract textural features and use them in the cloud classification process. The test image used in the classification process is the Meteosat-7 image for the D3 region. The K-means method is adopted as an unsupervised classification. This method depends on the initially chosen cluster seeds; since the initial seeds are chosen randomly, the user supplies a set of means, or cluster centers, in the n-dimensional space. K-means clustering has been applied to two bands (the IR2 band and the water vapour band). The textural analysis is used
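A minimal sketch of the unsupervised step described above: K-means applied to per-pixel features from the two bands. The file names, number of clusters, and feature layout are assumptions for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

ir2 = np.load("meteosat7_ir2.npy")            # hypothetical 2-D brightness array (IR2 band)
wv = np.load("meteosat7_water_vapour.npy")    # hypothetical 2-D brightness array (water vapour band)

# stack the two bands as a (n_pixels, 2) feature matrix
features = np.column_stack([ir2.ravel(), wv.ravel()]).astype(float)

# random initial cluster centers, re-run n_init times to reduce seed sensitivity
kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(features)
cloud_classes = kmeans.labels_.reshape(ir2.shape)  # per-pixel cloud class map
```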
This study aims at discussing how gender differences might affect communication among people. For this purpose, several TV interviews are selected and examined at the discourse level. Developing a model of analysis, it is found that certain linguistic features have been used by male speakers, whereas different aspects, such as deictic expressions and lexical items of emotion and delicacy, have been utilized by female speakers.
This study discusses the Critical Discourse Analysis of the 2012 American Presidential Election Debate. The researcher adopts a model proposed by Van Dijk (2006d). Six ideological categories selected within the overall strategies of the ideological square are used. The categories belong to three levels of discourse structure (meaning, argumentation, and rhetoric), and they have proved to be effective criteria for detecting the most disguised systems of racism and manipulation.
Based on the analysis, it can be concluded that the elite discourses of candidates contribute to the reproduction of domination, Orientalism, and Islamophobia. This can be appl