Cloud computing is a large-scale platform for serving high-volume data to a multitude of devices and technologies. Cloud tenants demand fast, uninterrupted access to their data, so cloud providers must ensure that every item of data is secure and always accessible. An appropriate replication strategy capable of selecting essential data is therefore required in cloud replication environments. This paper proposes a Crucial File Selection Strategy (CFSS) to address poor response time in a cloud replication environment. The necessary experiments were conducted with the CloudSim simulator, and the results evidence the improvement in replication performance. The analytical graphs are discussed thoroughly; the proposed CFSS algorithm outperformed an existing algorithm with a 10.47% improvement in average response time for multiple jobs per round.
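The abstract does not reproduce the CFSS algorithm itself; purely as an illustration of the kind of crucial-file selection a replication strategy performs, the sketch below ranks files by an access-per-megabyte score and replicates the top candidates within a storage budget. The scoring criterion and all names are assumptions, not the published method.

```python
# Illustrative sketch of a crucial-file selection heuristic for replication.
# The scoring criterion (access frequency per megabyte) is an assumption for
# demonstration; it is NOT the published CFSS algorithm.
from dataclasses import dataclass

@dataclass
class FileRecord:
    name: str
    accesses: int   # how often tenants requested the file this round
    size_mb: float  # replica storage cost

def select_crucial_files(files, budget_mb):
    """Greedily pick the most-accessed files that fit the replica budget."""
    ranked = sorted(files, key=lambda f: f.accesses / f.size_mb, reverse=True)
    chosen, used = [], 0.0
    for f in ranked:
        if used + f.size_mb <= budget_mb:
            chosen.append(f)
            used += f.size_mb
    return chosen

if __name__ == "__main__":
    catalog = [FileRecord("a.dat", 120, 50), FileRecord("b.dat", 15, 10),
               FileRecord("c.dat", 90, 200)]
    for f in select_crucial_files(catalog, budget_mb=100):
        print("replicate:", f.name)
```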
The current research aims at identifying the role played by leadership in empowerment and organizational learning capabilities, their reflection on knowledge capital, and the extent to which these concepts can be applied effectively at Wasit University. The research problem is posed as a series of questions, the most important being whether the dimensions of leadership by empowerment and organizational learning capability have a statistically significant correlation with, and impact on, knowledge capital.
To understand the nature of the relationship and the impact between the variables, leadership by empowerment was adopted as the fir…
Purpose – The main purpose of this research is to highlight the role of strategic leadership skills in enabling top management to reach effective management in accordance with the VUCA Prime methodology in a VUCA environment, a miniature virtual environment whose name refers to Volatility, Uncertainty, Complexity, and Ambiguity.
Methodology – To achieve the research objective, this study adopted a quantitative research design. A questionnaire was used as the main instrument for data collection, and the sample comprised an opinion poll of 106 individuals serving as department heads. Structural equation modelling (SmartPLS 3) was used to analyse the data.
The way fuzzy reliability is estimated differs according to the nature of the failure-time information dealt with in this research. The failure-time information has no probability distribution to describe it and is, moreover, fuzzy in character. The research includes fuzzy reliability estimation for three periods: the first from 1986 to 2013, the second from 2013 to 2033, and the third from 2033 to 2066. Four failure times were chosen to define the trapezoidal fuzzy membership function representing those years, taking into consideration the estimates of most researchers, professional geologists, and the technician in charge of maintenance of the Mosul Dam project.
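As a small illustration of the trapezoidal membership function such an estimate rests on, the sketch below evaluates a trapezoidal fuzzy number at a given time; the breakpoints reuse the period boundaries quoted in the abstract, while everything else is a generic assumption rather than the study's elicited values.

```python
# Minimal sketch of a trapezoidal fuzzy membership function mu(t).
# The breakpoints (a, b, c, d) reuse the period boundaries from the abstract
# purely as an illustration; they are not the study's elicited failure times.
def trapezoid(t, a, b, c, d):
    """Membership grade of time t in the trapezoidal fuzzy number (a,b,c,d)."""
    if t <= a or t >= d:
        return 0.0
    if b <= t <= c:           # plateau: full membership
        return 1.0
    if a < t < b:             # rising edge
        return (t - a) / (b - a)
    return (d - t) / (d - c)  # falling edge

if __name__ == "__main__":
    a, b, c, d = 1986, 2013, 2033, 2066
    for year in (1990, 2020, 2050):
        print(year, round(trapezoid(year, a, b, c, d), 3))
```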
Abstract: The research aims to identify the effect of Rubin's experiential learning model in physics on creative motivation among preparatory-stage students, by testing the following hypothesis: • There is no statistically significant difference at the (0.05) level between the mean scores of the experimental group, which studied according to Rubin's experiential learning strategy, and the mean scores of the control group, which studied according to the conventional method, on the creative motivation scale. The researchers used the…
Estimating the regular regression model requires several assumptions to be satisfied, such as linearity. One problem occurs when the regression curve is partitioned into two (or more) parts that are then joined at threshold point(s); this situation is regarded as a violation of the linearity of regression. The multiphase regression model has therefore received increasing attention as an alternative approach that describes the changing behaviour of the phenomenon through threshold-point estimation. The maximum likelihood estimator (MLE) has been used for both the model and the threshold-point estimation. However, the MLE is not resistant to violations such as the existence of outliers or a heavy-tailed error distribution. The main goal of t…
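To make the threshold-estimation step concrete, here is a minimal sketch of a two-phase linear fit: under i.i.d. Gaussian errors, grid-searching the breakpoint that minimises the pooled residual sum of squares is the profile maximum likelihood estimate. The data are simulated, and this plain MLE is exactly the non-robust baseline the abstract argues against, not the paper's proposed estimator.

```python
# Sketch: two-phase (segmented) regression with an unknown threshold.
# Under i.i.d. Gaussian errors, minimising the pooled SSE over candidate
# breakpoints equals profile maximum likelihood. Data are simulated; this is
# the non-robust MLE baseline, not the paper's proposed estimator.
import numpy as np

def fit_line(x, y):
    """Least-squares coefficients and residual sum of squares."""
    A = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return beta, float(resid @ resid)

def two_phase_fit(x, y, min_pts=5):
    """Grid-search the threshold that minimises total SSE of both segments."""
    order = np.argsort(x)
    x, y = x[order], y[order]
    best = None
    for k in range(min_pts, len(x) - min_pts):
        _, sse1 = fit_line(x[:k], y[:k])
        _, sse2 = fit_line(x[k:], y[k:])
        if best is None or sse1 + sse2 < best[1]:
            best = (x[k], sse1 + sse2)
    return best  # (threshold estimate, minimised SSE)

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
y = np.where(x < 6, 1 + 2 * x, 13 - 0.5 * (x - 6)) + rng.normal(0, 0.5, 200)
print("estimated threshold:", two_phase_fit(x, y)[0])  # near the true value 6
```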
One of the costliest problems facing hydrocarbon production in unconsolidated sandstone reservoirs is the production of sand once hydrocarbon production starts. A sanding-onset prediction model is very important for deciding on sand control in the future, including whether or when sand control should be used. This research developed an easy-to-use computer program to determine where sanding begins in the drained area. The model is based on estimating the critical pressure drawdown at which sand starts to be produced. The outcomes are drawn as a function of free sand production against the critical flow rates for declining reservoir pressure. The results show that the pressure drawdown required to…
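The abstract does not state the underlying failure criterion, so the sketch below is only a generic illustration of a critical-drawdown calculation: it assumes a Kirsch-type expression for the hoop stress at the wellbore wall and predicts sanding once that stress exceeds an effective formation strength. Every formula and every input value here is an assumption, not the paper's model.

```python
# Illustrative sanding-onset check (NOT the paper's model). Assumption: sand
# is produced once the maximum tangential (hoop) stress at the wellbore wall,
# approximated by a Kirsch-type expression
#   sigma_theta = 3*s_hmax - s_hmin - p_wf   (total stresses, psi),
# exceeds an effective formation strength U.
def critical_drawdown(s_hmax, s_hmin, p_res, strength_u):
    """Return (p_wf_crit, drawdown_crit); sanding is predicted once the
    operating drawdown p_res - p_wf exceeds drawdown_crit."""
    p_wf_crit = 3 * s_hmax - s_hmin - strength_u  # hoop stress = strength
    return p_wf_crit, p_res - p_wf_crit

# All inputs below are hypothetical illustration values in psi.
p_wf_crit, dd_crit = critical_drawdown(
    s_hmax=6500, s_hmin=5200, p_res=4800, strength_u=10500)
print(f"critical p_wf = {p_wf_crit} psi, critical drawdown = {dd_crit} psi")
```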
In this paper we present the theoretical foundation of forward error analysis of numerical algorithms under:
• approximations in "built-in" functions;
• rounding errors in arithmetic floating-point operations;
• perturbations of data.
The error analysis is based on the linearization method. The fundamental tools of forward error analysis are systems of linear absolute and relative a priori and a posteriori error equations and the associated condition numbers, which constitute optimal bounds on the possible cumulative round-off errors. The condition numbers enable simple, general, quantitative definitions of numerical stability. The theoretical results have been applied to Gaussian elimination and have proved to be a very effective means of both a prior…
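As a small numerical companion to the role condition numbers play, the sketch below solves a linear system by LU-based Gaussian elimination and checks the textbook forward-error relation, relative error roughly bounded by cond(A) times machine epsilon. This is the standard relation, not the paper's specific a priori and a posteriori error equations.

```python
# Illustration of the standard forward-error bound for solving Ax = b:
#   ||x_hat - x|| / ||x||  <~  cond(A) * machine_eps
# (textbook relation, not the paper's specific error equations).
import numpy as np

rng = np.random.default_rng(1)
n = 8
A = np.vander(np.linspace(1, 2, n))   # moderately ill-conditioned matrix
x_true = rng.standard_normal(n)
b = A @ x_true

x_hat = np.linalg.solve(A, b)         # LU-based Gaussian elimination
fwd_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
bound = np.linalg.cond(A) * np.finfo(float).eps

print(f"forward error  = {fwd_err:.2e}")
print(f"cond(A) * eps  = {bound:.2e}")  # forward error should sit below this
```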
Data hiding is the process of encoding extra information in an image by making small modifications to its pixels. To be practical, the hidden data must be perceptually invisible yet robust to common signal-processing operations. This paper introduces a scheme for hiding a signature image that can be as much as 25% of the host image data and hence can be used both for digital watermarking and for image/data hiding. The proposed algorithm uses an orthogonal discrete wavelet transform with two zero moments and improved time localization, called the discrete slantlet transform, for both the host and signature images. A scaling factor α in the frequency domain controls the quality of the watermarked images. Experimental results of signature image…
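To show the embedding structure such a scheme uses, here is a rough sketch of additive transform-domain embedding with a scaling factor α. PyWavelets' Haar DWT stands in for the slantlet transform, which is not available in common libraries, so this illustrates the general pattern rather than the paper's algorithm.

```python
# Sketch of additive transform-domain embedding with scaling factor alpha.
# PyWavelets' Haar DWT stands in for the discrete slantlet transform used in
# the paper; only the general embedding structure is illustrated here.
import numpy as np
import pywt

def embed(host, signature, alpha=0.1):
    """Add alpha-scaled signature coefficients into the host's coefficients."""
    hA, (hH, hV, hD) = pywt.dwt2(host, "haar")
    sA, (sH, sV, sD) = pywt.dwt2(signature, "haar")
    marked = (hA + alpha * sA, (hH + alpha * sH,
                                hV + alpha * sV,
                                hD + alpha * sD))
    return pywt.idwt2(marked, "haar")

host = np.random.default_rng(0).uniform(0, 255, (64, 64))
signature = np.random.default_rng(1).uniform(0, 255, (64, 64))
watermarked = embed(host, signature, alpha=0.05)
print("max pixel change:", np.abs(watermarked - host).max())
```

A smaller α leaves the watermarked image closer to the host (better quality) at the cost of a weaker embedded signal, which is the quality/robustness trade-off the abstract attributes to the scaling factor.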
This research aims to choose the appropriate probability distribution for the reliability analysis of an item, using collected data on the operating and stoppage times of the case study.
An appropriate choice of probability distribution is one for which the data lie on, or close to, the fitted line of the probability plot and pass a goodness-of-fit test.
Minitab 17 software was used for this purpose after the collected data were arranged and entered into the program.
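Outside Minitab, the same screening step can be sketched with SciPy: fit several candidate lifetime distributions to the times and compare goodness of fit. The data below are simulated and the candidate set is an arbitrary choice, not the study's data or Minitab's exact procedure.

```python
# Generic sketch of the same screening outside Minitab: fit candidate
# distributions to time-to-failure data and compare goodness of fit with a
# Kolmogorov-Smirnov test. The data here are simulated, not the case study's.
import numpy as np
from scipy import stats

times = stats.weibull_min.rvs(1.8, scale=1000, size=80,
                              random_state=0)  # simulated operating times

candidates = {
    "weibull":     stats.weibull_min,
    "lognormal":   stats.lognorm,
    "exponential": stats.expon,
}
for name, dist in candidates.items():
    params = dist.fit(times)                   # maximum likelihood fit
    ks_stat, p_value = stats.kstest(times, dist.cdf, args=params)
    print(f"{name:11s} KS={ks_stat:.3f} p={p_value:.3f}")
# The distribution with the smallest KS statistic (largest p-value) is the
# closest analogue of points hugging the probability-plot fitted line.
```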