Crime is a threat to any nation's security administration and jurisdiction. Crime analysis has therefore become increasingly important, because it identifies likely times and places of crime from collected spatial and temporal data. However, older techniques, such as paperwork, investigative judges, and basic statistical analysis, are not efficient enough to predict accurately when and where crimes will occur. When machine learning and data mining methods were deployed in crime analysis, prediction accuracy increased dramatically. In this study, various types of criminal analysis and prediction using several machine learning and data mining techniques are surveyed and compared on the basis of the accuracy reported in previous work, with the aim of producing a concise review of the use of these algorithms in crime prediction. This review is expected to help present such techniques to crime researchers and to support future research on developing them, by presenting definitions of crime, the challenges and classifications of prediction systems, and a comparative study. The literature shows that supervised learning approaches have been used in more crime-prediction studies than other approaches, and that Logistic Regression is the most powerful method for predicting crime.
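As an illustration of the kind of supervised approach reported in the surveyed studies, the following is a minimal sketch of a logistic-regression crime classifier. The feature names, the synthetic data, and the 70/30 split are assumptions made for demonstration, not details taken from any reviewed dataset.

```python
# Minimal sketch: logistic regression as a crime-type classifier on synthetic
# spatio-temporal features. Feature names and data are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 1000
# Hypothetical features: latitude, longitude, hour of day, day of week.
X = np.column_stack([
    rng.uniform(33.0, 33.5, n),
    rng.uniform(44.2, 44.6, n),
    rng.integers(0, 24, n),
    rng.integers(0, 7, n),
])
# Toy label: incidents after 18:00 are tagged 1, with 10% label noise added.
y = (X[:, 2] > 18).astype(int)
flip = rng.random(n) < 0.10
y[flip] = 1 - y[flip]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

Real studies would replace the synthetic features with recorded incident data and report accuracy alongside other measures such as precision and recall.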
The emergence of COVID-19 has had an unprecedented impact on different aspects of human activity, including medical education. Students and educators across academic institutions have confronted various challenges in following the guidelines for protection against the disease on the one hand and completing learning curricula on the other. In this short view, we present our experience of implementing e-learning for undergraduate nursing students during the present COVID-19 pandemic, emphasizing the learning content, the barriers, and the feedback of students and educators. We hope that this view will stimulate the preparedness of nursing faculties in Iraq to deal with this new modality of learning and to improve it should t
In the present work, precise hole drilling was investigated using a continuous-wave (CW) CO2 laser with 150 W maximum output power and a wavelength of 10.6 μm, with the assistance of a computerized numerical control (CNC) machine and assist gases. The drilling process was performed on thin sheets (0.1–0.3 mm) of two metals: stainless steel (SST) 321H and steel 33 (St). Varying the beam and process parameters, such as laser power, exposure time, and gas pressure, was essential for obtaining optimum results. The experimental results were supported by computational results obtained with the COMSOL 3.5a software.
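To give a sense of the scale behind such results, the sketch below shows a back-of-envelope energy-balance estimate of the melt-limited drill depth for a CW CO2 laser on thin stainless steel. The exposure time, absorptivity, spot size, and material properties are assumed textbook values, not the parameters or the COMSOL model used in the study.

```python
# Rough energy-balance estimate of laser drilling depth in stainless steel.
# All beam parameters and property values below are illustrative assumptions.
import math

P = 150.0          # laser power, W (CW)
t = 1e-3           # exposure time, s (assumed)
absorptivity = 0.3 # fraction of beam energy absorbed (assumed)
spot_d = 0.2e-3    # focused spot diameter, m (assumed)

rho = 7900.0       # density of stainless steel, kg/m^3 (approx.)
cp = 500.0         # specific heat, J/(kg*K) (approx.)
dT = 1400.0        # heating from room temperature to melting, K (approx.)
Lm = 2.7e5         # latent heat of fusion, J/kg (approx.)

area = math.pi * (spot_d / 2) ** 2
absorbed_energy = absorptivity * P * t                 # J
energy_per_volume = rho * (cp * dT + Lm)               # J/m^3 to heat and melt
depth = absorbed_energy / (area * energy_per_volume)   # m

print(f"estimated melt-limited drill depth: {depth * 1e3:.2f} mm")
```

Conduction losses, vaporization, and melt ejection by the assist gas change such an estimate considerably, which is why a finite-element treatment such as the COMSOL model is needed for quantitative predictions.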
The implementation of technology in the provision of public services and communication with citizens, commonly referred to as e-government, has brought a multitude of benefits, including enhanced efficiency, accessibility, and transparency. Nevertheless, this approach also presents particular security concerns, such as cyber threats, data breaches, and access control. One technology that can help mitigate security vulnerabilities within e-government is the permissioned blockchain. This work examines the performance of the Hyperledger Fabric private blockchain under high transaction loads by analyzing two scenarios involving six organizations as case studies. Several parameters, such as transaction send ra
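The kind of performance analysis described here (throughput and latency under a given transaction send rate) is typically summarised from raw transaction timestamps. The sketch below is a generic post-processing example; the record format and sample values are hypothetical and it does not use the Hyperledger Fabric or benchmarking APIs employed in the study.

```python
# Sketch: summarising blockchain benchmark results from per-transaction
# (submit, commit, status) records. Sample records are hypothetical.
from dataclasses import dataclass

@dataclass
class TxRecord:
    submit_time: float   # seconds since benchmark start
    commit_time: float   # seconds since benchmark start
    ok: bool             # True if the transaction committed successfully

def summarise(records):
    committed = [r for r in records if r.ok]
    duration = max(r.commit_time for r in committed) - min(r.submit_time for r in committed)
    throughput = len(committed) / duration if duration > 0 else float("inf")
    latencies = [r.commit_time - r.submit_time for r in committed]
    return {
        "committed": len(committed),
        "failed": len(records) - len(committed),
        "throughput_tps": round(throughput, 2),
        "avg_latency_s": round(sum(latencies) / len(latencies), 3),
        "max_latency_s": round(max(latencies), 3),
    }

# Hypothetical run: 5 transactions submitted at a fixed send rate.
sample = [TxRecord(i * 0.1, i * 0.1 + 0.4 + 0.05 * i, True) for i in range(5)]
print(summarise(sample))
```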
A non-stationary series is always a problem for statistical analysis; as some theoretical work has explained, the properties of statistical regression analysis are lost when non-stationary series are used, yielding a spurious slope for the relation under consideration. A non-stationary series can be made stationary by adding a time variable to the multivariate analysis to remove the general trend, by adding seasonal dummy variables to remove the seasonal effect, by converting the data to exponential or logarithmic form, or by applying differencing d times, in which case the series is said to be integrated of order d. The research comprises a theoretical side in several parts; in the first part, the research methodology ha
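For readers unfamiliar with the steps described (removing trend and seasonality, transforming the data, and differencing to reach an integrated order d), the sketch below applies an augmented Dickey-Fuller test and first differencing to a synthetic monthly series. The series and the order of differencing are illustrative assumptions, not data from the research.

```python
# Sketch: making a non-stationary (trending, seasonal) series stationary by
# differencing, checked with the augmented Dickey-Fuller (ADF) test.
# The series is synthetic; real applications use observed data.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(1)
t = np.arange(120)                                   # 10 years of monthly data
trend = 0.5 * t                                      # general trend
season = 10 * np.sin(2 * np.pi * t / 12)             # seasonal effect
series = trend + season + rng.normal(0, 2, t.size)   # non-stationary series

def adf_pvalue(x):
    # adfuller returns (statistic, p-value, ...); a high p-value suggests a unit root.
    return adfuller(x, autolag="AIC")[1]

print("p-value, original series: ", round(adf_pvalue(series), 4))
diff1 = np.diff(series)                              # difference once (order d = 1)
print("p-value, first difference:", round(adf_pvalue(diff1), 4))
```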
Permeability data are of major importance in all reservoir simulation studies. Their importance increases in mature oil and gas fields because some specific improved-recovery requirements are sensitive to permeability. However, the industry holds a huge stock of air permeability measurements against only a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation for converting the air permeability data conventionally measured during laboratory core analysis into liquid permeability. This correlation offers a feasible estimate in cases of data loss and poorly consolidated formations, or in cas
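As background on the type of conversion involved, the sketch below applies the classical Klinkenberg gas-slippage relation, k_gas = k_liq (1 + b / p_mean), fitted from air measurements at several mean pressures. The data points and fitted coefficients are illustrative only and are not the correlation proposed in the study.

```python
# Sketch: classical Klinkenberg correction, k_gas = k_liq * (1 + b / p_mean).
# Fitting k_air against 1/p_mean gives the liquid (slip-free) permeability as
# the intercept. The data points below are illustrative, not from the study.
import numpy as np

p_mean = np.array([1.5, 2.5, 4.0, 6.0])         # mean core pressure, atm
k_air = np.array([132.0, 120.0, 112.0, 107.0])  # measured air permeability, mD

slope, intercept = np.polyfit(1.0 / p_mean, k_air, 1)  # slope = k_liq * b
k_liquid = intercept
b = slope / k_liquid

print(f"estimated liquid permeability: {k_liquid:.1f} mD")
print(f"Klinkenberg slip factor b:     {b:.2f} atm")
```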
To maintain a sustained competitive position in the contemporary knowledge-economy environment, organizations, as open social systems, must have the ability to learn and must know how to adapt to rapid changes properly so that organizational objectives are achieved efficiently and effectively. A multilevel approach is adopted, proposing that organizational learning suffers from a lack of attention to the strategic competitive performance of the organization. This remains implicit in almost all models of organizational learning, and there is little focus on how learning organizations achieve sustainable competitive advantage. A dynamic model that captures t
With the fast growth of neural machine translation (NMT), there is still a lack of insight into the performance of these models on semantically and culturally rich texts, especially between linguistically distant languages such as Arabic and English. In this paper, we investigate the performance of two state-of-the-art AI translation systems (ChatGPT, DeepSeek) when translating Arabic texts into English in three genres: journalistic, literary, and technical. The study uses a mixed-method evaluation methodology based on a balanced corpus of 60 Arabic source texts from the three genres. Objective measures, including BLEU and TER, and subjective evaluations by human translators were employed to determine the semantic, contextual an
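To show how the objective measures mentioned (BLEU and TER) are typically computed at corpus level, the following minimal sketch uses the sacrebleu package on two toy sentence pairs. The hypothesis and reference strings are invented, and the study's own scoring setup may differ.

```python
# Sketch: corpus-level BLEU and TER with sacrebleu on toy data.
# The hypotheses and references are invented examples, not the study's corpus.
from sacrebleu.metrics import BLEU, TER

hypotheses = [
    "The committee approved the new budget yesterday.",
    "The novel depicts life in the old city.",
]
references = [  # one reference translation per hypothesis
    ["The committee approved the new budget yesterday."],
    ["The novel portrays life in the old city."],
]
# sacrebleu expects references grouped by reference set, so transpose.
refs_per_set = list(map(list, zip(*references)))

bleu = BLEU().corpus_score(hypotheses, refs_per_set)
ter = TER().corpus_score(hypotheses, refs_per_set)
print(bleu)
print(ter)
```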
There are many and varied studies that have dealt with dramatic construction, especially books and studies that address drama, its construction, and the method of writing it; no textbook or general cultural work fails to tackle the dramatic text, its construction, and how dramatic action develops within it. A question therefore comes to mind about the feasibility of dealing with dramatic construction at this time, when contemporary studies of dramatology, its relations, and contemporary critical directions are accumulating. This question may have two realistic aspects, yet the novelty and originality shown by this research lie in addressing a linguistic text refined in its style and connotations, such
The purpose of this study is to investigate research on artificial intelligence algorithms in football, specifically in relation to player performance prediction and injury prevention. To accomplish this goal, scholarly resources including Google Scholar, ResearchGate, Springer, and Scopus were used to conduct a systematic examination of research published during the last ten years (2015–2025). Through a systematic procedure that included data collection, study selection based on predetermined criteria, categorisation of AI applications in football, and assessment of major research problems, trends, and prospects, almost fifty papers were identified and analysed. Summarising AI applications in football for performance and injury p
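One input feature widely reported in the injury-prediction literature covered by such reviews is the acute:chronic workload ratio (ACWR). The sketch below computes it from a hypothetical training-load log; it is only an illustration of feature engineering, not a method taken from any specific reviewed paper, and the risk threshold mentioned is an often-cited heuristic rather than a fixed rule.

```python
# Sketch: acute:chronic workload ratio (ACWR), a common feature in football
# injury-risk models. The daily training loads below are hypothetical.
import numpy as np

rng = np.random.default_rng(2)
daily_load = rng.uniform(200, 700, 56)  # 8 weeks of daily session load (arbitrary units)

acute = daily_load[-7:].mean()      # mean load over the last 7 days
chronic = daily_load[-28:].mean()   # mean load over the last 28 days
acwr = acute / chronic

print(f"acute load (7-day mean):    {acute:.0f}")
print(f"chronic load (28-day mean): {chronic:.0f}")
print(f"ACWR: {acwr:.2f}  (values well above ~1.5 are often flagged as elevated risk)")
```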