Optimizing system performance in dynamic, heterogeneous environments and managing computational tasks efficiently are crucial. This paper therefore examines task scheduling and resource allocation algorithms in depth. The work evaluates five algorithms: Genetic Algorithms (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), the Firefly Algorithm (FA) and Simulated Annealing (SA), across workloads obtained by varying the task-to-node ratio. The paper identifies finish time and deadline as the two key performance metrics for gauging the efficacy of an algorithm, and a comprehensive investigation of the behavior of these algorithms across different workloads was carried out. The experimental results reveal distinct patterns in algorithmic behavior by workload. In the 15-task, 5-node scenario, GA and PSO outperform all others, completing 100 percent of tasks before their deadlines, while ACO failed to meet the deadline for Task 5. With 10 tasks and 5 nodes, GA and PSO again completed 100 percent of tasks before their deadlines, while ACO missed the deadlines of certain tasks. Building on these findings, the study proposes a more extensive system that adapts its choice of algorithm to workload characteristics. The proposed system offers an integrated approach to the ill-structured problem of task scheduling and resource allocation: an intelligent scheduling scheme that runs asynchronously when a large number of tasks is submitted and that dynamically aborts tasks whenever system load and utilization rise excessively. The design is intended as a complete solution to task scheduling and resource allocation issues; it details a method for selecting algorithms based on workload features, aiming at flexibility, and the quantitative results from the experiments empirically demonstrate how each algorithm performed under various settings.
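The abstract above does not include the authors' implementation; purely as an illustration of how a genetic algorithm can be applied to this kind of scheduling problem, the sketch below evolves task-to-node assignments that minimize the makespan (finish time) while penalizing missed deadlines. The task counts, runtimes, deadlines, fitness weights and all function names are hypothetical, not taken from the paper.

```python
import random

# Hypothetical problem data: task runtimes and deadlines, and the node count.
NUM_TASKS, NUM_NODES = 15, 5
runtimes  = [random.randint(2, 10) for _ in range(NUM_TASKS)]
deadlines = [random.randint(10, 40) for _ in range(NUM_TASKS)]

def finish_times(assignment):
    """Per-task finish time, assuming each node runs its tasks in submission order."""
    clock = [0] * NUM_NODES
    finish = [0] * NUM_TASKS
    for task, node in enumerate(assignment):
        clock[node] += runtimes[task]
        finish[task] = clock[node]
    return finish

def fitness(assignment):
    """Lower is better: makespan plus a penalty for every missed deadline."""
    finish = finish_times(assignment)
    missed = sum(f > d for f, d in zip(finish, deadlines))
    return max(finish) + 100 * missed

def evolve(pop_size=50, generations=200, mutation_rate=0.1):
    pop = [[random.randrange(NUM_NODES) for _ in range(NUM_TASKS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]            # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, NUM_TASKS)    # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mutation_rate:     # point mutation
                child[random.randrange(NUM_TASKS)] = random.randrange(NUM_NODES)
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
print("best assignment:", best, "fitness:", fitness(best))
```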
The normalized difference vegetation index (NDVI) is an effective graphical indicator for analyzing remote sensing measurements acquired from a space platform in order to investigate the trend of live green vegetation in the observed target. In this research, change detection of vegetation in Babylon city was performed by tracing the NDVI across temporal Landsat satellite images acquired on two dates: 19 March 2015 and 5 March 2020. ArcGIS ver. 10.7 was adopted to analyze the collected data. The final results indicate a spatial variation in the NDVI, where the vegetated area increases from 1666.91 km² in 2015 to 1697.01 km² in 2020 between the t…
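The abstract does not spell out how the index was computed in ArcGIS; for reference, NDVI is defined as (NIR − Red)/(NIR + Red), which for Landsat 8 OLI corresponds to bands 5 and 4. The NumPy sketch below is only illustrative; the array shapes and reflectance values are placeholders, not data from the study.

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / np.where((nir + red) == 0, 1, nir + red)  # guard against division by zero

# Hypothetical reflectance arrays for the two dates (Landsat 8: band 5 = NIR, band 4 = Red).
nir_2015, red_2015 = np.random.rand(100, 100), np.random.rand(100, 100)
nir_2020, red_2020 = np.random.rand(100, 100), np.random.rand(100, 100)

change = ndvi(nir_2020, red_2020) - ndvi(nir_2015, red_2015)
print("mean NDVI change:", change.mean())
```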
The need for robotic systems has become an urgent necessity in various fields, especially in video surveillance and live broadcasting systems. The main goal of this work is to design and implement a rover robotic monitoring system based on a Raspberry Pi 4 Model B that controls the overall system and displays live video using a webcam (USB camera), and to use the You Only Look Once algorithm, version five (YOLOv5), to detect, recognize and display objects in real time. This deep learning algorithm is highly accurate and fast, and it is implemented with Python, OpenCV and PyTorch code and the Common Objects in Context (COCO) 2020 dataset. The robot can move in all directions and in different places, especially in…
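The abstract names the tools (Python, OpenCV, PyTorch, COCO) but not the code; the snippet below is a minimal sketch of a typical YOLOv5 webcam loop using the pretrained COCO model loaded through torch.hub. It is not the authors' implementation, and the camera index and model variant (yolov5s) are assumptions.

```python
import cv2
import torch

# Load a pretrained YOLOv5 model (trained on COCO) via torch.hub.
model = torch.hub.load('ultralytics/yolov5', 'yolov5s', pretrained=True)

cap = cv2.VideoCapture(0)                                  # USB webcam (index assumed)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)           # model expects RGB input
    results = model(rgb)                                   # run detection on the frame
    annotated = results.render()[0]                        # draw boxes and class labels
    cv2.imshow('YOLOv5', cv2.cvtColor(annotated, cv2.COLOR_RGB2BGR))
    if cv2.waitKey(1) & 0xFF == ord('q'):                  # press 'q' to quit
        break
cap.release()
cv2.destroyAllWindows()
```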
The partial level density (PLD) of pre-equilibrium reactions described by Ericson's formula has been studied using different formulae for the single-particle level density g. The parameter g was taken from the equidistant spacing model (ESM) and the non-equidistant spacing model (non-ESM), and further formulae for g were derived from the relation between g and the level density parameter a. The formulae used to derive g are the Roher formula, the Egidy formula, the Yukawa formula, and the Thomas-Fermi formula. The partial level density results that depend on g from the Thomas-Fermi formula show good agreement with the experimental data.
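The abstract does not reproduce the formula itself; for reference, the commonly quoted form of Ericson's partial level density for p particles and h holes at excitation energy E, assuming equidistant single-particle levels of density g, is

```latex
\omega(p, h, E) = \frac{g\,(gE)^{\,n-1}}{p!\,h!\,(n-1)!}, \qquad n = p + h .
```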
The atomic force microscope is an efficient tool for studying the topography of precipitates. A study was carried out using continuous flow injection with the Ayah 6SX1-T-2D Solar cell CFI Analyser. It was found that cyproheptadine-HCl forms precipitates of different quality with the precipitating agents potassium hexacyanoferrate(III) and sodium nitroprusside. The precipitates are collected as they form, in the usual sequence of precipitate formation, via the continuous flow, and are then dried under normal atmospheric pressure. The precipitates are subjected to atomic force microscope scanning to study the variations and differences between them, relating these to the kind of response each precipitate gives.
A simple indirect spectrophotometric method for the determination of mebendazole in pure form and in pharmaceutical formulations is presented in this study. UV-Visible spectrophotometry under optimal conditions was developed for the determination of mebendazole in the pure drug and in different preparation samples. The method is based on the oxidation of the drug by N-bromosuccinimide in hydrochloric acid medium; the remaining amount of oxidizing agent is determined by its reaction with tartrazine, and the absorbance is measured at 428 nm. Calibration curves were linear in the range of 5 to 30 µg.mL-1 with a molar absorptivity of 8437.2 L.mol-1.cm-1. The limits of detection and quantification were found to be 0.7770 µg.mL-1 and 2.3400 µg.mL-1, respectively.
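The abstract quotes a molar absorptivity and detection limits without the underlying relations; the expressions below are the standard Beer-Lambert law and the usual ICH-style detection and quantification limits, given here as the common convention rather than as the exact procedure used in the paper.

```latex
A = \varepsilon\, l\, c, \qquad
\mathrm{LOD} = \frac{3.3\,\sigma}{S}, \qquad
\mathrm{LOQ} = \frac{10\,\sigma}{S}
```

Here A is the absorbance, ε the molar absorptivity (L.mol-1.cm-1), l the path length (cm), c the molar concentration, σ the standard deviation of the blank response, and S the slope of the calibration curve.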
The field of optical character recognition (OCR) covers the process of converting an image of text into a machine-readable text format. The classification of Arabic manuscripts in general is part of this field. In recent years, the processing of Arabic image databases by deep learning architectures has seen remarkable development; however, this remains insufficient given the enormous wealth of Arabic manuscripts. In this research, a deep learning architecture is used to address the classification of handwritten Arabic letters. The method is based on a convolutional neural network (CNN) architecture acting as both feature extractor and classifier. Considering the nature of the dataset images (binary images), the contours of the alphabet…
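The abstract does not give the network details; below is a minimal PyTorch sketch of a CNN that acts as both feature extractor and classifier, in the spirit described. The input size (32x32 binary images) and the number of classes (28 base Arabic letters) are assumptions, not values from the paper.

```python
import torch
import torch.nn as nn

class ArabicLetterCNN(nn.Module):
    """Small CNN: convolutional feature extractor followed by a linear classifier."""
    def __init__(self, num_classes=28):            # 28 base Arabic letters (assumed)
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 32x32 -> 16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16x16 -> 8x8
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Smoke test on a batch of hypothetical 32x32 single-channel letter images.
model = ArabicLetterCNN()
logits = model(torch.rand(4, 1, 32, 32))
print(logits.shape)   # torch.Size([4, 28])
```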
Recently, gallbladder stones have been reported to contain bile salts and to be approximately 70% cholesterol. This led us to investigate how a Streptococcus salivarius transformant carrying the plasmid pMG36bsh can be used to fragment the cholesterol of gallstones in vitro. Total mRNA of S. salivarius was extracted using the easy-spinTM total RNA extraction kit, and RT-PCR (cDNA) was used to observe the change after introducing the pMG36bsh vector and to prepare S. salivarius carrying two copies of bsh genes (cgh, bsh) to fragment gallstones in bacterial culture. Our data show that increased bacterial bsh expression helps to reduce the gallstone concentration in culture when bile salt is present as a stimulating agent; the reduction associated with the bsh genes was 77% compared with the wild type, which had a reducing concentration ra…