Cloud computing is a large-scale platform that serves high volumes of data to numerous devices and technologies. Cloud tenants demand fast, uninterrupted access to their data, so cloud providers struggle to keep every piece of data secure and always accessible. Hence, a replication strategy capable of selecting essential data is required in cloud replication environments. This paper proposes a Crucial File Selection Strategy (CFSS) to address poor response time in a cloud replication environment. The CloudSim simulator is used to conduct the necessary experiments, and the results are presented as evidence of the improvement in replication performance. The analytical graphs obtained are discussed thoroughly, and the proposed CFSS algorithm outperforms an existing algorithm with a 10.47% improvement in average response time for multiple jobs per round.
Simulated annealing (SA) is an effective means of addressing difficult optimization problems and is now a common research area with several productive applications, such as production planning. Because aggregate production planning (APP) is one of the most significant problems in production planning, this paper presents a multi-objective linear programming model for APP and optimizes it with SA. During optimization of the APP problem, it emerged that the capability of SA was inadequate and its performance substandard, particularly for a sizable constrained problem with many decision variables and numerous constraints. Since this algorithm works sequentially, the current state wi
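The sequential accept/reject loop at the heart of SA can be sketched in a few lines. This is a generic template on a toy one-dimensional objective, not the paper's APP model; the cost function, neighbourhood move, and geometric cooling schedule below are illustrative assumptions:

```python
# Minimal simulated-annealing sketch (illustrative only; the APP model's
# objectives and constraints are not reproduced here).
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=1.0, alpha=0.95, steps=2000):
    """Classic sequential SA loop: always accept improvements, accept worse
    moves with probability exp(-delta/T), and cool T geometrically."""
    x, fx, t = x0, cost(x0), t0
    best, fbest = x, fx
    for _ in range(steps):
        y = neighbor(x)
        fy = cost(y)
        if fy <= fx or random.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy              # move accepted
            if fx < fbest:
                best, fbest = x, fx    # track the best state seen so far
        t *= alpha                     # cooling schedule
    return best, fbest

# Toy objective: minimize (x - 3)^2 starting from x = 10.
random.seed(0)
sol, val = simulated_annealing(
    cost=lambda x: (x - 3.0) ** 2,
    neighbor=lambda x: x + random.uniform(-0.5, 0.5),
    x0=10.0,
)
```

Because each iteration depends on the state produced by the previous one, the loop is inherently sequential, which is the limitation the abstract alludes to.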
A new algorithm is proposed to compress speech signals using the wavelet transform and linear predictive coding (LPC). Compression is based on selecting a small number of approximation coefficients produced by wavelet decomposition (Haar and db4) at a suitably chosen level while discarding the detail coefficients; the approximation coefficients are then windowed by a rectangular window and fed to the linear predictor. The Levinson-Durbin algorithm is used to compute the LP coefficients, reflection coefficients, and prediction error. The compressed files contain the LP coefficients and the previous sample, and are very small compared to the original signals. The compression ratio is calculated from the size of th
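The coding stage described above can be sketched as follows. For brevity this illustrative version uses a single Haar level and plain (rectangularly windowed) autocorrelation, and omits the db4 filter bank, the decomposition-level choice, and the file packing; those simplifications are assumptions of the sketch, not the paper's method:

```python
# Sketch of the coding stage: one-level Haar approximation followed by
# Levinson-Durbin LPC on the retained coefficients.

def haar_approx(x):
    """One level of Haar decomposition: keep the scaled pairwise averages
    (approximation coefficients) and drop the detail coefficients."""
    return [(x[i] + x[i + 1]) / 2 ** 0.5 for i in range(0, len(x) - 1, 2)]

def autocorr(x, lag):
    return sum(x[i] * x[i + lag] for i in range(len(x) - lag))

def levinson_durbin(x, order):
    """Return (lpc_coeffs, reflection_coeffs, prediction_error)."""
    r = [autocorr(x, k) for k in range(order + 1)]
    a = [0.0] * (order + 1)   # a[j] holds the j-th LP coefficient
    k = [0.0] * order         # reflection (PARCOR) coefficients
    err = r[0]
    for m in range(1, order + 1):
        acc = r[m] - sum(a[j] * r[m - j] for j in range(1, m))
        k[m - 1] = acc / err
        a_new = a[:]
        a_new[m] = k[m - 1]
        for j in range(1, m):
            a_new[j] = a[j] - k[m - 1] * a[m - j]
        a = a_new
        err *= (1.0 - k[m - 1] ** 2)   # prediction error shrinks each order
    return a[1:], k, err

# Demo on a decaying exponential (stand-in for a voiced speech segment).
signal = [0.9 ** n for n in range(256)]
approx = haar_approx(signal)                      # 128 coefficients kept
lpc, refl, err = levinson_durbin(approx, order=2)
```

Only `lpc` (plus, per the abstract, the previous sample) would be stored, which is far smaller than the original signal.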
This work describes the development of new spectrophotometric techniques for the determination of 3-aminophenol. The first technique uses benzidine in alkaline solution to convert 3-aminophenol into a colored complex. The produced complex is red, with maximum absorbance at 462 nm. Over the concentration range 5–14 μg mL−1, Beer's law is obeyed with a correlation coefficient (R2) of 0.99781, a limit of detection (LOD) of 0.0423 μg mL−1, and a limit of quantification (LOQ) of 0.1411 μg mL−1. The recovery was 87.2–95.43%, the relative standard deviation (%RSD) was 2.40–3.31%, and the molar absorptivity was 3.545 × 103 L mol−1 cm−1. Secondly, cloud point extraction (CPE) was used to determ
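Figures of merit like these are conventionally read off a Beer's-law calibration line, with LOD = 3.3s/slope and LOQ = 10s/slope (s being the residual standard deviation of the fit). A minimal sketch of that computation, using made-up standard concentrations and absorbances rather than the paper's data:

```python
# Hedged calibration-line sketch; the conc/absb values are hypothetical
# illustration data, not measurements from this work.

def fit_line(x, y):
    """Ordinary least squares: return slope, intercept, and the residual
    standard deviation of the calibration line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    intercept = my - slope * mx
    resid_sd = (sum((yi - (slope * xi + intercept)) ** 2
                    for xi, yi in zip(x, y)) / (n - 2)) ** 0.5
    return slope, intercept, resid_sd

conc = [5, 7, 9, 11, 13]                  # μg/mL (hypothetical standards)
absb = [0.18, 0.25, 0.33, 0.40, 0.47]    # absorbance at the analytical peak
slope, intercept, s = fit_line(conc, absb)
lod = 3.3 * s / slope                     # limit of detection, μg/mL
loq = 10 * s / slope                      # limit of quantification, μg/mL
```

By construction LOQ is always 10/3.3 ≈ 3× the LOD, consistent with the LOD/LOQ pair reported above.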
Surface electromyography (sEMG) and accelerometer (Acc) signals play crucial roles in controlling prosthetic and upper-limb orthotic devices, as well as in assessing electrical muscle activity for various biomedical engineering and rehabilitation applications. In this study, an advanced discrimination system is proposed for identifying seven distinct shoulder-girdle motions, aimed at improving prosthesis control. Feature extraction from Time-Dependent Power Spectrum Descriptors (TDPSD) is employed to enhance motion recognition. Subsequently, the Spectral Regression (SR) method is used to reduce the dimensionality of the extracted features. A comparative analysis is conducted between the Linear Discriminant Analysis (LDA) class
Cloud storage provides scalable, low-cost resources featuring economies of scale based on a cross-user architecture, and data storage is the most important cloud service. As the amount of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. To protect the privacy of data owners, data are stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication. Traditional deduplication schemes cannot operate on encrypted data, and existing solutions for encrypted-data deduplication suffer from security weaknesses. This paper proposes combining compressive sensing and video deduplication to maximize
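A common way to reconcile encryption with cross-user deduplication is message-locked (convergent) encryption, where the key is derived from the content itself so identical plaintexts yield identical ciphertexts that the server can deduplicate. The sketch below illustrates that general principle only, not the paper's combined compressive-sensing scheme; the XOR keystream stands in for a real block cipher:

```python
# Minimal convergent-encryption deduplication sketch (illustrative only).
import hashlib

def convergent_encrypt(chunk: bytes) -> tuple[bytes, bytes]:
    """Derive the key from the chunk itself, then 'encrypt' by XOR with a
    hash-based keystream (a stand-in for a real cipher)."""
    key = hashlib.sha256(chunk).digest()
    stream = b""
    counter = 0
    while len(stream) < len(chunk):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    cipher = bytes(a ^ b for a, b in zip(chunk, stream))
    return key, cipher

store = {}  # ciphertext-hash -> ciphertext: one stored copy per unique chunk

def upload(chunk: bytes) -> str:
    key, cipher = convergent_encrypt(chunk)
    tag = hashlib.sha256(cipher).hexdigest()
    store.setdefault(tag, cipher)   # duplicate uploads add nothing
    return tag

t1 = upload(b"same video segment")
t2 = upload(b"same video segment")  # identical plaintext -> identical tag
```

Because the key depends only on the content, two users uploading the same segment produce the same ciphertext, so the server stores a single copy without ever seeing the plaintext.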
Financial and administrative oversight is of great importance in the work of institutions; it constitutes their supervisory stage and is one of the most important elements of the administrative process. Oversight bodies serve to ensure that services are delivered to citizens as quickly as possible and with the least effort and cost. Since its founding, the Iraqi state has relied on a single external oversight body, the Office of the Comptroller General of Accounts, established by Law No. (17) of 1927, which exercised oversight and financial and accounting auditing of expenditures and rev
In recent years, the migration of computational workloads to computational clouds has attracted intruders who target and exploit cloud networks both internally and externally. Investigating such hazardous network attacks in the cloud requires comprehensive network forensics methods (NFMs) to identify the source of the attack. However, cloud computing lacks NFMs that can identify network attacks affecting various cloud resources by disseminating through cloud networks. This study is motivated by the need to determine the applicability of current cloud network forensics methods (C-NFMs) to cloud networks in cloud computing. Applicability is evaluated based on a strengths, weaknesses, opportunities, and threats (SWOT) analysis to outlook the cloud network. T