With the escalation of cybercriminal activities, the demand for forensic investigations into these crimes has grown significantly. However, the concept of systematic pre-preparation for potential forensic examinations during the software design phase, known as forensic readiness, has only recently gained attention. Against the backdrop of surging urban crime rates, this study aims to conduct a rigorous and precise analysis and forecast of crime rates in Los Angeles, employing advanced Artificial Intelligence (AI) technologies. This research amalgamates diverse datasets encompassing crime history, various socio-economic indicators, and geographical locations to attain a comprehensive understanding of how crimes manifest within the city. Leveraging sophisticated AI algorithms, the study focuses on scrutinizing subtle periodic patterns and uncovering relationships among the collected datasets. Through this comprehensive analysis, the research endeavors to pinpoint crime hotspots, detect fluctuations in frequency, and identify underlying causes of criminal activities. Furthermore, the research evaluates the efficacy of the AI model in generating productive insights and providing the most accurate predictions of future criminal trends. These predictive insights are poised to revolutionize the strategies of law enforcement agencies, enabling them to adopt proactive and targeted approaches. Emphasizing ethical considerations, this research ensures the continued feasibility of AI use while safeguarding individuals' constitutional rights, including privacy. The outcomes of this research are anticipated to furnish actionable intelligence for law enforcement, policymakers, and urban planners, aiding in the identification of effective crime prevention strategies.
By harnessing the potential of AI, this research contributes to the promotion of proactive strategies and data-driven models in crime analysis and prediction, offering a promising avenue for enhancing public security in Los Angeles and other metropolitan areas.
Measuring the efficiency of postgraduate and undergraduate programs is one of the essential elements of the educational process. In this study, the colleges of Baghdad University and data for the academic year 2011-2012 were chosen to measure the relative efficiencies of postgraduate and undergraduate programs in terms of their inputs and outputs. A suitable method for analysing these data is Data Envelopment Analysis (DEA). The effect of academic staff numbers on the numbers of enrolled and graduated students in the postgraduate and undergraduate programs is the main focus of the study.
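DEA scores each decision-making unit (a college, here) against all the others by solving one linear program per unit. The following is a minimal sketch of the input-oriented CCR multiplier model; the toy data (staff as the sole input, graduates as the sole output) and the `ccr_efficiency` helper are illustrative assumptions, not the study's actual dataset or formulation.

```python
# Hedged sketch: input-oriented CCR DEA (multiplier form) solved as an LP.
# The data below are invented; the study's real inputs/outputs are not shown.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """CCR efficiency of DMU o. X: (n_dmu, n_in) inputs, Y: (n_dmu, n_out) outputs."""
    n, m = X.shape
    _, s = Y.shape
    # decision variables: output weights u (s of them), then input weights v (m)
    c = np.concatenate([-Y[o], np.zeros(m)])        # maximize u . y_o
    A_eq = [np.concatenate([np.zeros(s), X[o]])]    # normalization: v . x_o = 1
    b_eq = [1.0]
    A_ub = np.hstack([Y, -X])                       # u . y_j - v . x_j <= 0, all j
    b_ub = np.zeros(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun

# Toy example: 3 colleges, input = academic staff, output = graduates
X = np.array([[10.0], [20.0], [30.0]])
Y = np.array([[50.0], [90.0], [150.0]])
scores = [ccr_efficiency(X, Y, o) for o in range(3)]
```

With a single input and output, the CCR score reduces to each unit's output/input ratio divided by the best ratio, so units 0 and 2 come out efficient (score 1.0) and unit 1 scores 0.9.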
A new microphotometer was constructed in our laboratory for the determination of molybdenum(VI) through its catalytic effect on the hydrogen peroxide-potassium iodide reaction in an acidic medium (0.01 mM H2SO4). A linearity of 97.3% was obtained for the range 5-100 ppm, the repeatability of the results was better than 0.8%, and 0.5 ppm was obtained as the L.U. The method was applied to the determination of molybdenum(VI) in a medicinal sample (Centrum), and the results of the developed method compared well with those of the conventional method.
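A linearity figure like the 97.3% above is typically derived from a least-squares calibration line of signal versus concentration. The sketch below computes R-squared for such a line; reading the abstract's percentage as an R-squared-type statistic is an assumption, and the concentration/signal values are invented for illustration.

```python
# Hedged sketch: linearity of a calibration curve as the coefficient of
# determination R^2 of a straight-line fit. Data values are illustrative only.
import numpy as np

def calibration_r2(conc, signal):
    """Fit signal = slope*conc + intercept and return R^2 of the fit."""
    conc = np.asarray(conc, dtype=float)
    signal = np.asarray(signal, dtype=float)
    slope, intercept = np.polyfit(conc, signal, 1)
    fitted = slope * conc + intercept
    ss_res = np.sum((signal - fitted) ** 2)         # residual sum of squares
    ss_tot = np.sum((signal - signal.mean()) ** 2)  # total sum of squares
    return 1.0 - ss_res / ss_tot

# Perfectly linear toy data over the 5-100 ppm range -> R^2 = 1.0
r2 = calibration_r2([5, 20, 50, 100], [1.0, 4.0, 10.0, 20.0])
```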
In this paper, various aspects of smart grids are described. These aspects include the components of smart grids, the detailed functions of the smart energy meters within the smart grids and their effects on increasing awareness, the advantages and disadvantages of smart grids, and the requirements for utilizing smart grids. To shed light on the difference between smart grids and traditional utility grids, some aspects of traditional utility grids are covered in this paper as well.
Progress in computer networks and the emergence of new technologies in this field have led to new protocols and frameworks that provide new computer-network-based services. E-government services, a modernized version of conventional government, were created through the steady evolution of technology in addition to the growing need of societies for numerous services. Government services are deeply related to citizens' daily lives; therefore, it is important to evolve with technological developments: it is necessary to move from the traditional methods of managing government work to cutting-edge technical approaches that improve the effectiveness of government systems for providing services to citizens. Blockchain technology is among
Heavy oil is classified as an unconventional oil resource because of the difficulty of recovering it in its natural state and the difficulties of transporting and marketing it. Upgrading heavy oil has a positive technical and economic impact, especially when it becomes competitive with conventional oils from a marketing perspective. Development of the Qaiyarah heavy oil field was neglected over the last five decades, mainly because of the low quality of the crude oil in the field, reflected in its high viscosity and density, which was and remains a major challenge to putting it on the main stream of production in Iraq. The low quality of the crude properties led to lower oil prices in the global markets
Prediction of accurate values of residual entropy (SR) is a necessary step in the calculation of entropy. In this paper, different equations of state were tested against the available 2791 experimental data points for 20 pure superheated-vapor compounds (14 pure nonpolar compounds + 6 pure polar compounds). The Average Absolute Deviation (AAD) for SR over the 2791 experimental data points of all 20 pure compounds (nonpolar and polar), when using the equations of Lee-Kesler, Peng-Robinson, the Virial equation truncated to the second and to the third term, and Soave-Redlich-Kwong, was 4.0591, 4.5849, 4.9686, 5.0350, and 4.3084 J/mol.K, respectively. These results show that the Lee-Kesler equation was the most accurate.
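The AAD statistic used above to rank the equations of state is simply the mean of the absolute prediction errors. A minimal sketch follows; the three data points are invented for illustration, not part of the study's 2791-point dataset.

```python
# Hedged sketch: Average Absolute Deviation (AAD) between predicted and
# experimental residual-entropy values. Example values are illustrative only.
def aad(predicted, experimental):
    """AAD = (1/N) * sum |S_pred - S_exp|, in the units of the inputs (J/mol.K)."""
    assert len(predicted) == len(experimental)
    return sum(abs(p - e) for p, e in zip(predicted, experimental)) / len(predicted)

# Toy data: three residual-entropy points in J/mol.K
s_exp = [12.0, 8.5, 15.2]
s_pred = [11.0, 9.0, 14.7]
value = aad(s_pred, s_exp)  # (1.0 + 0.5 + 0.5) / 3
```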
Permeability estimation is a vital step in reservoir engineering because of its effect on reservoir characterization, planning of perforations, and the economic efficiency of reservoirs. Core and well-logging data are the main sources for measuring and calculating permeability, respectively. There are multiple methods to predict permeability, such as classic, empirical, and geostatistical methods. In this research, two statistical approaches were applied and compared for permeability prediction, Multiple Linear Regression and Random Forest, for the (M) reservoir interval in the (BH) oil field in the northern part of Iraq. The dataset was separated into two subsets, training and testing, in order to cross-validate the accuracy
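The first of the two approaches, multiple linear regression, can be sketched with ordinary least squares; everything below is an assumption for illustration (synthetic log responses, noiseless target), not the study's well data, and a Random Forest counterpart would typically come from a library such as scikit-learn rather than be hand-rolled.

```python
# Hedged sketch: multiple linear regression (OLS) for permeability from
# well-log predictors. Synthetic, noiseless data stand in for real logs.
import numpy as np

def fit_mlr(X, y):
    """Return OLS coefficients [intercept, b1, b2, ...] for y ~ X."""
    A = np.column_stack([np.ones(len(X)), X])       # prepend intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict_mlr(coef, X):
    A = np.column_stack([np.ones(len(X)), X])
    return A @ coef

# Synthetic "training" set: permeability as a linear function of two
# scaled log responses (e.g. porosity and gamma ray -- names are assumptions)
rng = np.random.default_rng(0)
X_train = rng.uniform(0.0, 1.0, size=(50, 2))
y_train = 3.0 + 2.0 * X_train[:, 0] - 1.0 * X_train[:, 1]
coef = fit_mlr(X_train, y_train)
```

Because the synthetic target is exactly linear, the recovered coefficients match the generating values [3.0, 2.0, -1.0]; on real log data one would evaluate the fit on the held-out testing subset instead.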
The purpose of this study was to find the connection between the water parameters examined in the laboratory and the water index obtained from analysis of a satellite image of the study area. This was accomplished by analysing Landsat-8 satellite imagery together with a geographic information system (GIS). The primary goal of this study is to develop a model for the chemical and physical characteristics of the Al-Abbasia River in Al-Najaf Al-Ashraf Governorate. The water parameters employed in this investigation are: pH, EC, TDS, TSS, Na, Mg, K, SO4, Cl, and NO3. To collect the samples, ten sampling locations were identified, and the satellite image was obtained on the
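The abstract does not name the specific water index used, so the sketch below assumes a common choice for Landsat-8 water mapping: McFeeters' NDWI, computed from the green (band 3) and near-infrared (band 5) reflectances. The band choice and pixel values are assumptions for illustration.

```python
# Hedged sketch: NDWI = (Green - NIR) / (Green + NIR), a frequently used
# Landsat-8 water index (assumed here; the study's actual index is not stated).
import numpy as np

def ndwi(green, nir):
    """Per-pixel NDWI from green and near-infrared reflectance arrays."""
    green = np.asarray(green, dtype=float)
    nir = np.asarray(nir, dtype=float)
    return (green - nir) / (green + nir)

# Toy reflectances: water reflects more green than NIR, vegetation the reverse
vals = ndwi([0.3, 0.1], [0.1, 0.3])  # first pixel water-like, second land-like
```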
In this research, the performance of two kinds of membranes was examined for recovering nutrients (protein and lactose) from the whey produced by the soft-cheese industry at the General Company for Food Products in Abo-ghraab. Whey was treated in two stages. The first stage involved pressing the whey through a micron filter made of polyvinylidene difluoride (PVDF), standard plate type, 800 kilodalton; this membrane separates the whey into a permeate containing the main nutrients and removes the fat and microorganisms. The second stage isolates the protein using an ultrafilter made of polyethersulfone (PES), plate type, with cutoffs of 10 and 60 kilodalton, and recovers the lactose in the form of permeate.
The results showed that the percen
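Membrane results of this kind are commonly reported through the observed rejection coefficient, the fraction of a solute held back by the membrane. The sketch below is a generic illustration; the abstract is truncated before its actual percentages, so the concentrations here are invented.

```python
# Hedged sketch: observed rejection R = 1 - C_permeate / C_feed for a solute
# (e.g. protein at the ultrafiltration stage). Values are illustrative only.
def rejection(c_permeate, c_feed):
    """Fraction of solute retained by the membrane, 0.0 (none) to 1.0 (all)."""
    return 1.0 - c_permeate / c_feed

# Toy concentrations in g/L: 0.5 g/L passes into the permeate from a 5 g/L feed
r = rejection(0.5, 5.0)  # 0.9, i.e. 90 % of the solute retained
```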
Background: Excision repair cross-complementing group 2 gene (ERCC2) polymorphisms have been linked to the risk of colorectal cancer (CRC) emergence. However, data from several studies are contradictory. To validate genetic biomarkers of CRC, the impact of the following ERCC2 polymorphisms (rs1799793 and rs238406) on CRC susceptibility was examined in a sample of the Iraqi population. Methods: A total of 126 subjects were enrolled in this case-control study: 78 CRC patients and 48 apparently healthy individuals matched for age, gender, smoking status, and BMI. Polymerase chain reaction (PCR) was used for genotyping, followed by sequencing; then the association between the genetic polymorphisms and CRC risk was investigated.
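In a case-control design like this, the association between a genotype and disease risk is usually quantified with an odds ratio from the 2x2 table of carriers versus non-carriers in cases and controls. The counts below are invented for illustration and are not the study's results.

```python
# Hedged sketch: odds ratio for a case-control genotype association.
# All counts are hypothetical, not data from the ERCC2/CRC study.
def odds_ratio(case_carriers, case_noncarriers, ctrl_carriers, ctrl_noncarriers):
    """OR = (a*d) / (b*c) for the 2x2 table [[a, b], [c, d]]."""
    return (case_carriers * ctrl_noncarriers) / (case_noncarriers * ctrl_carriers)

# Hypothetical split of 78 cases and 48 controls by carrier status
or_value = odds_ratio(30, 48, 10, 38)  # (30*38) / (48*10) = 2.375
```

An OR above 1 would suggest the variant is more common among patients; a confidence interval or chi-square test would normally accompany it.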