Crime is a threat to any nation's security administration and jurisdiction. Crime analysis has therefore become increasingly important, because it assigns a time and place to an offence based on collected spatial and temporal data. However, older techniques such as paperwork, investigative judges, and simple statistical analysis are not efficient enough to predict accurately when and where a crime will take place. When machine learning and data mining methods were deployed in crime analysis, prediction accuracy increased dramatically. In this study, various types of crime analysis and prediction using several machine learning and data mining techniques are surveyed and compared on the basis of the accuracy reported in previous work, with the aim of producing a concise review of the use of these algorithms in crime prediction. This review is expected to help present such techniques to crime researchers and to support future research on crime analysis by providing crime definitions, prediction-system challenges, and classifications together with a comparative study. The literature shows that supervised learning approaches have been used in more crime-prediction studies than other approaches, and that Logistic Regression is the most powerful method for predicting crime.
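As a minimal sketch of the supervised-learning setup the survey describes, the snippet below fits a logistic regression classifier on spatial and temporal features (hour, day, location) to predict a crime label. The synthetic features, labels, and thresholds are illustrative assumptions, not a real crime dataset or any specific method from the reviewed papers.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.uniform(0, 24, n),    # hour of day
    rng.integers(0, 7, n),    # day of week
    rng.uniform(-1, 1, n),    # normalized latitude
    rng.uniform(-1, 1, n),    # normalized longitude
])
# Synthetic label: 1 = "high-risk incident", driven by late hours near the center
y = ((X[:, 0] > 20) & (np.hypot(X[:, 2], X[:, 3]) < 0.5)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, model.predict(X_te)))
```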
Deconstruction is a theory that appeared after structuralism and seeks, through a number of key principles, to reach the purposive and principal meaning of a text by means of different perspectives. In other words, deconstruction is a critical literary theory and a contemporary philosophical approach that work together to reach the exact concept of the text, and this is achieved through reading and analyzing it. To that end, deconstruction specifies a set of principles through which the exact meaning of the text can be reached.
Introduction:
Deconstruction theory is a theory that emerged after structuralism, and it seeks to …
This research aims to identify the relationship between an individual's nutritional status and malnutrition using anthropometric measurements such as weight and height, and their impact on blood values. The study sample included 200 males and 200 females from the cities of Baghdad and Baquba who attended the central laboratories.
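For illustration, the sketch below computes the standard anthropometric index (BMI) from weight and height and applies the conventional WHO adult cut-offs. The abstract does not state which index or cut-offs the study used, so both are assumptions made here only to show how weight and height map to a nutritional-status category.

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def nutritional_status(value: float) -> str:
    """Conventional WHO adult BMI categories (assumed, not from the study)."""
    if value < 18.5:
        return "underweight"
    if value < 25.0:
        return "normal"
    if value < 30.0:
        return "overweight"
    return "obese"

print(nutritional_status(bmi(70.0, 1.75)))  # -> "normal"
```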
Multiple elimination (de-multiple) is one of the seismic processing steps used to remove the effects of multiples and delineate the correct primary reflectors. Applying normal move-out to flatten the primaries is the basis of this elimination: the data are transformed to the frequency-wavenumber domain, where the flattened primaries align with the zero-wavenumber axis while all other events (multiples and random noise) are distributed elsewhere. A dip filter that passes the aligned data and rejects everything else then separates primaries from multiples, after which the data are transformed back from the frequency-wavenumber domain to the time-distance domain. For that reason, the suggested name for this technique is the normal move-out frequency-wavenumber (NMO-FK) domain technique.
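A minimal sketch of the F-K dip-filtering step described above, assuming the gather has already been NMO-corrected so that primaries map onto near-zero wavenumbers. The pass-band width, trace spacing, and array shapes are illustrative assumptions rather than the paper's actual processing parameters.

```python
import numpy as np

def fk_demultiple(gather, dx, k_pass=0.002):
    """gather: 2D array (time samples x traces), already NMO-corrected.
    Keeps energy near zero wavenumber (flattened primaries) and rejects
    dipping events (residual multiples, random noise)."""
    nt, nx = gather.shape
    # Forward 2D FFT: time-distance -> frequency-wavenumber (F-K) domain
    fk = np.fft.fft2(gather)
    k = np.fft.fftfreq(nx, d=dx)                       # wavenumber axis (cycles/m)
    # Dip-filter mask: only near-zero wavenumbers (flat events) survive
    mask = (np.abs(k) <= k_pass).astype(float)[np.newaxis, :]
    # Inverse FFT back to the time-distance domain
    return np.real(np.fft.ifft2(fk * mask))
```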
This study was accomplished by testing three different models to determine rock types, pore-throat radius, and flow units for the Mishrif Formation in the West Qurna oilfield in southern Iraq, based on Mishrif full-diameter cores from 20 wells. The three models used were the Lucia rock-type classification; the Winland plot, which determines the pore-throat radius at 35% mercury saturation (r35) from the mercury injection test; and the flow zone indicator (FZI) concept for identifying flow units. Together they enabled us to recognize the differences between the Mishrif units in these three categories. The study of pore characteristics is very significant in reservoir evaluation, since it controls the storage mechanism and reservoir fluid properties.
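The sketch below shows the standard Winland r35 and FZI calculations commonly applied to core porosity-permeability data of the kind described here; the input values are illustrative and are not taken from the Mishrif dataset.

```python
import numpy as np

def winland_r35(k_md, phi_percent):
    """Winland equation: pore-throat radius (microns) at 35% Hg saturation,
    with permeability in mD and porosity in percent."""
    return 10 ** (0.732 + 0.588 * np.log10(k_md) - 0.864 * np.log10(phi_percent))

def flow_zone_indicator(k_md, phi_frac):
    """FZI (microns) from the reservoir quality index and normalized porosity."""
    rqi = 0.0314 * np.sqrt(k_md / phi_frac)   # reservoir quality index
    phi_z = phi_frac / (1.0 - phi_frac)       # normalized porosity
    return rqi / phi_z

# Example core plug: k = 50 mD, porosity = 18%
print(winland_r35(50.0, 18.0), flow_zone_indicator(50.0, 0.18))
```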
Economic analysis plays a pivotal role in managerial decision-making. It is predicated on a deep understanding of the economic forces and market factors influencing corporate strategies and decisions. This paper examines the role of economic data analysis in managing small and medium-sized enterprises (SMEs), in making strategic decisions, and in enhancing performance, and underscores the significance of this approach and its impact on corporate outcomes. The research analyzes annual reports from three companies: Al-Mahfaza for Mobile and Internet Financial Payment and Settlement Services Company Limited, Al-Arab for Electronic Payment Company, and Iraq Electronic Gateway for Financial Services Company. The paper concludes …
Abstract
The aim of the current research is to prepare an integrated learning program based on the NYS Next Generation mathematics standards and to investigate its impact on the development of the teaching performance of middle-school mathematics teachers and the future-thinking skills of their students. To achieve the objectives of the research, the researcher prepared a list of Next Generation mathematics standards derived from an existing list of standards. He also prepared a list of the teaching competencies required of middle-school mathematics teachers in light of that list, and clarified the foundations of the training program, its objectives, and the mathematical …
Artificial intelligence (AI) offers significant benefits to biomedical research and academic writing. Nevertheless, the use of AI-powered writing-aid tools has prompted worries about excessive dependence on these tools and their possible influence on writing proficiency. The current study aimed to explore academic staff's perspectives on the impact of AI on academic writing. This qualitative study incorporated in-person interviews with academic faculty members, conducted in a semi-structured manner using a predetermined interview guide of open-ended questions. The interviews were held in person with the participants from May to November 2023, and the data were analyzed using thematic analysis. Ten academics aged …
In this paper, the researcher suggests using a genetic algorithm to estimate the parameters of the Wiener degradation process, which builds on the Wiener process to estimate the reliability of high-efficiency products, since it is difficult to estimate their reliability using traditional techniques that depend only on product failure times. Monte Carlo simulation was applied to demonstrate the efficiency of the proposed method in estimating the parameters, and it was compared with maximum likelihood estimation. The results show that the genetic algorithm method is the best according to the AMSE comparison criterion, and then the reliab…
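A rough sketch of the idea, assuming the usual Wiener degradation model X(t) = mu*t + sigma*B(t), whose increments are Normal(mu*dt, sigma^2*dt): a simple genetic algorithm searches for (mu, sigma) by minimizing the negative log-likelihood of simulated increments. The population size, crossover, mutation scale, and simulated data are illustrative assumptions, not the paper's actual settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated degradation increments: dX ~ Normal(mu*dt, sigma^2*dt)
true_mu, true_sigma, dt, n = 0.5, 0.2, 1.0, 200
dx = rng.normal(true_mu * dt, true_sigma * np.sqrt(dt), size=n)

def neg_log_likelihood(params):
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    var = sigma ** 2 * dt
    return 0.5 * np.sum(np.log(2 * np.pi * var) + (dx - mu * dt) ** 2 / var)

# Genetic algorithm: truncation selection, blend crossover, Gaussian mutation
pop = rng.uniform([0.0, 0.01], [2.0, 1.0], size=(50, 2))   # candidate (mu, sigma) pairs
for generation in range(100):
    fitness = np.array([neg_log_likelihood(p) for p in pop])
    parents = pop[np.argsort(fitness)[:25]]                 # keep the best half
    mates = parents[rng.integers(0, 25, size=25)]
    children = 0.5 * (parents + mates)                      # blend crossover
    children += rng.normal(0, 0.05, size=children.shape)    # Gaussian mutation
    children[:, 1] = np.abs(children[:, 1]) + 1e-6          # keep sigma positive
    pop = np.vstack([parents, children])

best = pop[np.argmin([neg_log_likelihood(p) for p in pop])]
print("estimated mu, sigma:", best)
```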
Support vector machine (SVM) is a popular supervised learning algorithm based on margin maximization. It has a high training cost and does not scale well to large numbers of data points. We propose a multiresolution algorithm, MRH-SVM, that trains an SVM on a hierarchical data-aggregation structure, which also serves as a common data input to other learning algorithms. The proposed algorithm learns SVM models using high-level data aggregates and only visits aggregates at more detailed levels where support vectors reside. In addition to performance improvements, the algorithm has advantages such as the ability to handle data streams and datasets with imbalanced classes. Experimental results show significant performance improvements in comparison …
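A rough sketch of the coarse-to-fine idea behind training an SVM on data aggregates: fit on cluster centroids first, then refine using only the points from clusters that lie near the margin. This is a simplified two-level stand-in built on k-means centroids, not the paper's MRH-SVM hierarchy; cluster counts and thresholds are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=5000, n_features=10, random_state=0)

# Level 1: aggregate each class into centroids and train a coarse SVM on them.
centroids, labels = [], []
for cls in np.unique(y):
    km = KMeans(n_clusters=20, n_init=10, random_state=0).fit(X[y == cls])
    centroids.append(km.cluster_centers_)
    labels.append(np.full(20, cls))
Xc, yc = np.vstack(centroids), np.concatenate(labels)
coarse = SVC(kernel="rbf").fit(Xc, yc)

# Level 2: keep only points whose nearest centroid lies close to the decision
# boundary (small |decision_function|), then retrain on that detailed subset.
margin = np.abs(coarse.decision_function(Xc))
near_idx = np.where(margin < np.quantile(margin, 0.5))[0]
assign = np.argmin(np.linalg.norm(X[:, None, :] - Xc[None, :, :], axis=2), axis=1)
keep = np.isin(assign, near_idx)
fine = SVC(kernel="rbf").fit(X[keep], y[keep])
print("refined SVM trained on", keep.sum(), "of", len(X), "points")
```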