With the escalation of cybercriminal activities, the demand for forensic investigations into these crimes has grown significantly. However, the concept of systematic pre-preparation for potential forensic examinations during the software design phase, known as forensic readiness, has only recently gained attention. Against the backdrop of surging urban crime rates, this study aims to conduct a rigorous and precise analysis and forecast of crime rates in Los Angeles, employing advanced Artificial Intelligence (AI) technologies. This research amalgamates diverse datasets encompassing crime history, various socio-economic indicators, and geographical locations to attain a comprehensive understanding of how crimes manifest within the city. Leveraging sophisticated AI algorithms, the study focuses on scrutinizing subtle periodic patterns and uncovering relationships among the collected datasets. Through this comprehensive analysis, the research endeavors to pinpoint crime hotspots, detect fluctuations in frequency, and identify underlying causes of criminal activities. Furthermore, the research evaluates the efficacy of the AI model in generating productive insights and providing the most accurate predictions of future criminal trends. These predictive insights are poised to revolutionize the strategies of law enforcement agencies, enabling them to adopt proactive and targeted approaches. Emphasizing ethical considerations, this research ensures the continued feasibility of AI use while safeguarding individuals' constitutional rights, including privacy. The outcomes of this research are anticipated to furnish actionable intelligence for law enforcement, policymakers, and urban planners, aiding in the identification of effective crime prevention strategies. By harnessing the potential of AI, this research contributes to the promotion of proactive strategies and data-driven models in crime analysis and prediction, offering a promising avenue for enhancing public security in Los Angeles and other metropolitan areas.
Combating the COVID-19 epidemic has emerged as one of the most challenging healthcare problems the world has ever seen. COVID-19 cases must be diagnosed accurately and quickly so that patients receive proper medical treatment and the pandemic is contained. Chest radiography imaging approaches have proven more successful in detecting coronavirus than the reverse transcription polymerase chain reaction (RT-PCR) approach. Transfer learning is better suited to categorizing patterns in medical images because the number of available medical images is limited. This paper illustrates a convolutional neural network (CNN) and recurrent neural network (RNN) hybrid architecture for the diagnosis of COVID-19 from chest X-rays. The deep transfer methods used were VGG19, DenseNet121
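The abstract does not reproduce the hybrid architecture itself. Below is a minimal sketch of one common way to combine a pretrained VGG19 feature extractor with an LSTM for binary chest X-ray classification; the input size, layer sizes, and training settings are illustrative assumptions rather than the paper's reported configuration.

```python
# Minimal sketch of a CNN-RNN hybrid for chest X-ray classification.
# VGG19 (pretrained, frozen) extracts spatial features; an LSTM reads the
# feature map position by position; a dense layer produces the COVID score.
# Input size, units, and layer choices are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn_rnn(input_shape=(224, 224, 3)):
    base = tf.keras.applications.VGG19(
        include_top=False, weights="imagenet", input_shape=input_shape)
    base.trainable = False  # transfer learning: keep ImageNet weights fixed

    inputs = layers.Input(shape=input_shape)
    x = tf.keras.applications.vgg19.preprocess_input(inputs)
    x = base(x)                          # (7, 7, 512) feature map
    x = layers.Reshape((7 * 7, 512))(x)  # treat spatial positions as a sequence
    x = layers.LSTM(128)(x)              # RNN summarises the sequence
    x = layers.Dropout(0.3)(x)
    outputs = layers.Dense(1, activation="sigmoid")(x)  # COVID vs. non-COVID
    return models.Model(inputs, outputs)

model = build_cnn_rnn()
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```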
This manuscript presents several applications for solving special kinds of ordinary and partial differential equations using iteration methods such as the Adomian decomposition method (ADM), the variational iteration method (VIM), and the Taylor series method. These methods can also be applied to solve non-perturbed problems and third-order parabolic PDEs with variable coefficients. Moreover, we compare the results obtained using ADM, VIM, and the Taylor series method. These methods use a combination of the two initial conditions.
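For reference, a minimal sketch of the Adomian decomposition recursion is shown below for the simple linear initial value problem y' = y, y(0) = 1, whose exact solution is e^t; the test equation and the number of terms are illustrative assumptions, not taken from the manuscript.

```python
# Minimal sketch of the Adomian decomposition method (ADM) for the linear
# IVP y' = y, y(0) = 1 (exact solution exp(t)); the equation is an
# illustrative assumption, not one of the manuscript's examples.
# ADM recursion here: y_0 = y(0), y_{n+1}(t) = integral_0^t y_n(s) ds,
# and the approximate solution is the sum of the computed components.
import sympy as sp

t, s = sp.symbols("t s")

def adm_solution(n_terms=8):
    components = [sp.Integer(1)]                        # y_0 = y(0) = 1
    for _ in range(n_terms - 1):
        prev = components[-1]
        nxt = sp.integrate(prev.subs(t, s), (s, 0, t))  # y_{n+1} = ∫_0^t y_n ds
        components.append(sp.simplify(nxt))
    return sp.simplify(sum(components))

approx = adm_solution()
print(approx)                                           # truncated series of exp(t)
print(float(approx.subs(t, 1)), float(sp.exp(1)))       # compare at t = 1
```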
Interface evaluation has been the subject of extensive study and research in human-computer interaction (HCI). It is a crucial tool for promoting the idea that user engagement with computers should resemble casual conversations and interactions between individuals, according to specialists in the field. Researchers in the HCI field initially focused on making various computer interfaces more usable, thus improving the user experience. This study's objectives were to evaluate and enhance the user interface of the University of Baghdad's implementation of an online academic management system, using effectiveness, time-based efficiency, and satisfaction rates derived from a task-based questionnaire process. We made a variety of interfaces f
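As a point of reference for the metrics named above, the sketch below computes effectiveness and time-based efficiency with their commonly cited formulas; the sample task results are invented for illustration and do not come from the study.

```python
# Minimal sketch of the two quantitative usability metrics named in the
# abstract, using the commonly cited formulas; the sample data are invented.
#   effectiveness         = completed tasks / attempted tasks * 100%
#   time-based efficiency = (1 / (N * R)) * sum_j sum_i (n_ij / t_ij)
# where n_ij is 1 if user j completed task i (else 0), t_ij is the time
# user j spent on task i, N is the number of tasks, R the number of users.

def effectiveness(results):
    """results: list of booleans, one per task attempt."""
    return 100.0 * sum(results) / len(results)

def time_based_efficiency(n, t):
    """n[j][i]: 1 if user j completed task i, else 0; t[j][i]: seconds spent."""
    users, tasks = len(n), len(n[0])
    total = sum(n[j][i] / t[j][i] for j in range(users) for i in range(tasks))
    return total / (users * tasks)          # goals per second

# Two users, three tasks (illustrative numbers only).
n = [[1, 1, 0], [1, 0, 1]]
t = [[30.0, 45.0, 60.0], [25.0, 50.0, 40.0]]
print(effectiveness([x for row in n for x in row]))   # percent of tasks completed
print(time_based_efficiency(n, t))                    # goals per second
```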
Original Research Paper, Mathematics. 1. Introduction: In light of the rapid progress and development of research applications across many fields, the need to rely on scientific tools for data processing has come to play a prominent role in decision-making in industrial and service institutions, since such methods are genuinely needed to turn decision-making into a scientific process that helps departments succeed in their planning and executive tasks. Therefore, we found it necessary to study the transportation model in general, and in particular to use statistical methods to reach the optimal solution with the lowest possible costs. As is known, the Transportatio
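To make the transportation model concrete, the sketch below solves a small balanced transportation problem as a linear program; the cost matrix, supplies, and demands are invented for illustration and are not the study's data.

```python
# Minimal sketch of a balanced transportation problem solved as a linear
# program (minimise total shipping cost subject to supply and demand).
# The cost matrix, supplies, and demands below are invented for illustration.
import numpy as np
from scipy.optimize import linprog

cost = np.array([[4.0, 6.0, 8.0],      # unit costs: 2 sources x 3 destinations
                 [5.0, 3.0, 7.0]])
supply = [60.0, 40.0]                   # available at each source
demand = [30.0, 40.0, 30.0]             # required at each destination

m, n = cost.shape
A_eq, b_eq = [], []
for i in range(m):                      # supply constraints: row sums = supply
    row = np.zeros(m * n); row[i * n:(i + 1) * n] = 1.0
    A_eq.append(row); b_eq.append(supply[i])
for j in range(n):                      # demand constraints: column sums = demand
    col = np.zeros(m * n); col[j::n] = 1.0
    A_eq.append(col); b_eq.append(demand[j])

res = linprog(cost.ravel(), A_eq=np.array(A_eq), b_eq=b_eq, bounds=(0, None))
print(res.x.reshape(m, n))              # optimal shipment plan
print(res.fun)                          # minimum total cost
```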
String matching is seen as one of the essential problems in computer science. A variety of computer applications provide the string matching service for their end users. The remarkable boost in the amount of data that is created and kept by modern computational devices motivates researchers to obtain even more powerful methods for coping with this problem. In this research, the Quick Search string matching algorithm is adopted and implemented under a multi-core environment using OpenMP directives, which can be employed to reduce the overall execution time of the program. English text, protein, and DNA data types are utilized to examine the effect of parallelization and implementation of the Quick Search string matching algorithm on multi-co
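For reference, a minimal sequential sketch of the Quick Search bad-character rule is shown below; the paper's OpenMP parallelization is not reproduced here, and the natural decomposition (splitting the text into overlapping chunks, one per thread) is noted only as an assumption in the comments.

```python
# Minimal sequential sketch of the Quick Search algorithm (Sunday, 1990).
# On each attempt the shift is decided by the character just past the current
# window, text[i + m]; a multi-core version would presumably split the text
# into overlapping chunks and run this search independently on each chunk.
def quick_search(text, pattern):
    n, m = len(text), len(pattern)
    # Bad-character table: shift for the character immediately after the window;
    # the rightmost occurrence of each pattern character wins.
    shift = {c: m - i for i, c in enumerate(pattern)}
    matches, i = [], 0
    while i <= n - m:
        if text[i:i + m] == pattern:
            matches.append(i)
        if i + m >= n:
            break
        i += shift.get(text[i + m], m + 1)   # default shift: m + 1
    return matches

print(quick_search("GCATCGCAGAGAGTATACAGTACG", "GCAGAGAG"))   # [5]
```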
Ground Penetrating Radar (GPR) is a nondestructive geophysical technique that uses electromagnetic waves to evaluate subsurface information. A GPR unit emits a short pulse of electromagnetic energy and determines the presence or absence of a target by examining the energy reflected from that pulse. GPR is a geophysical approach that uses a band of the radio spectrum. In this research, GPR is used to survey different buried objects, such as iron, plastic (PVC), and aluminum, at a specified depth of about 0.5 m using a 250 MHz antenna. The response of each object can be recognized from its shape; this recognition has been performed using image processi
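The abstract does not give a velocity model; as background for how a reflector at about 0.5 m appears in a 250 MHz record, the sketch below applies the standard GPR depth relation d = v·t/2 with v = c/√εr, where the relative permittivity value (dry sand ≈ 5) is an illustrative assumption.

```python
# Minimal sketch of the standard GPR depth relation used to place a reflector:
# wave velocity v = c / sqrt(eps_r), depth d = v * t / 2 (two-way travel time).
# The relative permittivity below (dry sand ~ 5) is an illustrative assumption;
# the 0.5 m burial depth and 250 MHz antenna come from the abstract.
import math

C = 0.3                      # speed of light, m/ns

def velocity(eps_r):
    """Radar wave velocity (m/ns) in a medium with relative permittivity eps_r."""
    return C / math.sqrt(eps_r)

def depth_from_twt(twt_ns, eps_r):
    """Reflector depth (m) from two-way travel time (ns)."""
    return velocity(eps_r) * twt_ns / 2.0

eps_r = 5.0                                  # assumed dry sand
v = velocity(eps_r)                          # ~0.134 m/ns
print(2 * 0.5 / v)                           # expected two-way time (~7.5 ns) for a 0.5 m target
print(depth_from_twt(7.45, eps_r))           # back out the depth (~0.5 m)
```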
The calculation of the oil density is more complex due to the wide range of pressures and temperatures, which are always determined by specific conditions of pressure and temperature. Therefore, calculations that depend on the oil components are more accurate and easier for finding such requirements. The analyses of twenty live oil samples are utilized. The three-parameter Peng-Robinson equation of state is tuned to obtain a match between measured and calculated oil viscosity. The Lohrenz-Bray-Clark (LBC) viscosity calculation technique is adopted to calculate the viscosity of oil from the given composition, pressure, and temperature for the 20 samples. The tuned equation of state is used to generate oil viscosity values for a range of temperatu
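The abstract does not reproduce the LBC correlation itself; the sketch below shows its final step, taking the dilute-gas mixture viscosity, the reduced density from the tuned equation of state, and the mixture viscosity-reducing parameter as inputs. The polynomial coefficients are the published LBC constants, while the sample inputs are invented for illustration.

```python
# Minimal sketch of the final step of the Lohrenz-Bray-Clark (LBC) correlation:
# [(mu - mu*) * xi + 1e-4]^(1/4) = a0 + a1*rho_r + a2*rho_r**2 + a3*rho_r**3 + a4*rho_r**4
# Inputs: dilute-gas mixture viscosity mu* (cP), reduced density rho_r from the
# tuned equation of state, and the mixture viscosity-reducing parameter xi.
# Coefficients are the published LBC constants; the sample inputs are invented.
A = (0.1023, 0.023364, 0.058533, -0.040758, 0.0093324)

def lbc_viscosity(mu_star, rho_r, xi):
    """Oil viscosity (cP) from the LBC fourth-degree polynomial in reduced density."""
    poly = sum(a * rho_r**k for k, a in enumerate(A))
    return mu_star + (poly**4 - 1e-4) / xi

# Illustrative (invented) inputs for a live-oil mixture:
print(lbc_viscosity(mu_star=0.012, rho_r=2.4, xi=0.035))
```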