Curing of concrete is the maintenance of a satisfactory moisture content and temperature for a period of time immediately following placing, so that the desired properties can develop. Accelerated curing is advantageous where early strength gain in concrete is important. Exposing concrete specimens to accelerated curing conditions allows them to develop a significant portion of their ultimate strength within a short period (1-2 days), depending on the curing cycle used. Three accelerated curing test methods are adopted in this study: warm water, autogenous, and a proposed test method. The results of this study show good correlation between the accelerated strength (especially for the proposed curing test method) and the strength obtained under normal curing at ages of 7 and 28 days, for five cements of different chemical composition and water-to-cement ratios of 0.45, 0.55, 0.65 and 0.75. Linear and nonlinear regression analyses show high correlation for the different accelerated curing methods, with coefficients of determination (R²) greater than 0.9.
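The correlation claimed above can be illustrated with an ordinary least-squares fit. The strength values below are purely hypothetical stand-ins (the study's actual measurements are not reproduced here); the sketch shows how the slope, intercept, and R² of an accelerated-vs-28-day relationship would be computed.

```python
import statistics

# Hypothetical data: accelerated (warm-water) strength vs. 28-day
# normally cured strength, both in MPa. Values are illustrative only.
accelerated = [14.2, 16.8, 19.5, 22.1, 25.0, 27.3]
normal_28d = [24.5, 28.9, 33.2, 37.8, 42.4, 46.1]

mx = statistics.mean(accelerated)
my = statistics.mean(normal_28d)
sxx = sum((x - mx) ** 2 for x in accelerated)
sxy = sum((x - mx) * (y - my) for x, y in zip(accelerated, normal_28d))

slope = sxy / sxx
intercept = my - slope * mx

# Coefficient of determination R^2 of the fitted line.
ss_res = sum((y - (slope * x + intercept)) ** 2
             for x, y in zip(accelerated, normal_28d))
ss_tot = sum((y - my) ** 2 for y in normal_28d)
r2 = 1 - ss_res / ss_tot

print(f"28-day strength ~ {slope:.2f} * accelerated + {intercept:.2f}, R^2 = {r2:.3f}")
```

A nonlinear (e.g. power or logarithmic) model would be fitted the same way after transforming the variables.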
The Weibull distribution is one of the three classical extreme value families (the Type-III Generalized Extreme Value (GEV) distribution), and it plays a crucial role in modeling extreme events in fields such as hydrology, finance, and the environmental sciences. Bayesian methods play a strong, decisive role in estimating the parameters of the GEV distribution due to their ability to incorporate prior knowledge and to handle small sample sizes effectively. In this research, we compare several shrinkage Bayesian estimation methods based on the squared error and the linear exponential (LINEX) loss functions; they were adopted and compared by the Monte Carlo simulation method. The performance of these methods is assessed by their accuracy and computational efficiency in estimating …
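The shrinkage idea above can be sketched with a small Monte Carlo experiment. This is not the paper's estimator: it is a minimal illustration that assumes the Weibull shape is known, estimates the scale by its closed-form MLE, and shrinks it toward a prior guess, then compares mean squared errors over repeated small samples.

```python
import random
import statistics

random.seed(1)

TRUE_SCALE, SHAPE = 2.0, 1.5     # Weibull parameters (shape assumed known)
PRIOR_GUESS, WEIGHT = 2.2, 0.3   # shrinkage target and weight (illustrative)
N_SAMPLE, N_REPS = 10, 2000      # small samples, many Monte Carlo repetitions

def mle_scale(xs, k):
    # Closed-form MLE of the Weibull scale when the shape k is known.
    return statistics.fmean(x ** k for x in xs) ** (1.0 / k)

se_mle = se_shrunk = 0.0
for _ in range(N_REPS):
    xs = [random.weibullvariate(TRUE_SCALE, SHAPE) for _ in range(N_SAMPLE)]
    a_hat = mle_scale(xs, SHAPE)
    a_shrunk = WEIGHT * PRIOR_GUESS + (1 - WEIGHT) * a_hat  # shrink toward prior
    se_mle += (a_hat - TRUE_SCALE) ** 2
    se_shrunk += (a_shrunk - TRUE_SCALE) ** 2

mse_mle, mse_shrunk = se_mle / N_REPS, se_shrunk / N_REPS
print(f"MSE  MLE: {mse_mle:.4f}   shrinkage: {mse_shrunk:.4f}")
```

With a reasonable prior and a small sample, the shrinkage estimator trades a little bias for a large variance reduction, which is why such estimators do well in the small-sample setting the abstract describes.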
This paper shows how to estimate the parameters of the generalized exponential Rayleigh (GER) distribution by three estimation methods: the maximum likelihood estimation method, the moments estimation method (MEM), and the ranked set sampling estimation method (RSSEM). Simulation is used with all three methods to find the parameters of the generalized exponential Rayleigh distribution. Finally, the mean squared error criterion is used to compare the estimation methods and determine which of them performs best.
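The MSE-based comparison of estimators can be sketched as follows. The GER distribution itself is not implemented here; as a simpler, hedged stand-in, the sketch compares the maximum likelihood and method-of-moments estimators of the plain Rayleigh scale parameter (the Rayleigh being the base of the GER family), using the same simulate-estimate-score loop the abstract describes.

```python
import math
import random

random.seed(7)

SIGMA = 1.5          # true Rayleigh scale parameter (illustrative)
N, REPS = 20, 2000   # sample size and Monte Carlo repetitions

def rayleigh_sample(sigma, n):
    # Rayleigh(sigma) equals a Weibull with shape 2 and scale sigma*sqrt(2).
    return [random.weibullvariate(sigma * math.sqrt(2), 2) for _ in range(n)]

def mle(xs):
    # Maximum likelihood: sigma_hat^2 = sum(x^2) / (2n)
    return math.sqrt(sum(x * x for x in xs) / (2 * len(xs)))

def mom(xs):
    # Method of moments: E[X] = sigma * sqrt(pi/2)
    return (sum(xs) / len(xs)) * math.sqrt(2 / math.pi)

mse = {"MLE": 0.0, "MOM": 0.0}
for _ in range(REPS):
    xs = rayleigh_sample(SIGMA, N)
    mse["MLE"] += (mle(xs) - SIGMA) ** 2
    mse["MOM"] += (mom(xs) - SIGMA) ** 2
for name in mse:
    mse[name] /= REPS
print(mse)
```

The same loop extends to any estimator (including a ranked-set-sampling one): only the sampling step and the estimator functions change.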
The research aims to identify the impact of different methods of calculating the item sensitivity coefficient on the standard characteristics of a criterion-referenced test in the measurement and evaluation course. The research sample consisted of 35 male and female students, chosen purposively. The researcher prepared a learning-teaching program covering the content of the measurement and evaluation course for non-specialized departments, prepared an achievement test in equivalent forms, identified the degree of agreement between the methods used in analyzing the items of the criterion-referenced test, and compared the standard characteristics of the achievement test, both according to …
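The abstract does not say which sensitivity formulas were compared, but one common formulation for criterion-referenced items is the pre/post-instruction difference index (proportion answering correctly after instruction minus the proportion before). The data below are hypothetical, sized to match the abstract's sample of 35 students.

```python
def sensitivity_index(pre_correct, post_correct, n_students):
    """Instructional sensitivity of one item: the proportion answering
    correctly after instruction minus the proportion answering correctly
    before instruction. Ranges from -1 to 1; higher = more sensitive."""
    return (post_correct - pre_correct) / n_students

# Hypothetical item data for a class of 35 students.
items = [
    {"pre": 8,  "post": 30},   # sensitive item: large gain after instruction
    {"pre": 20, "post": 24},   # weakly sensitive item
    {"pre": 28, "post": 27},   # insensitive item: no gain
]
for i, item in enumerate(items, 1):
    s = sensitivity_index(item["pre"], item["post"], 35)
    print(f"item {i}: S = {s:.2f}")
```

Alternative formulations (e.g. indices based on mastery/non-mastery group contrasts) differ mainly in which two proportions are subtracted, which is exactly the kind of methodological difference the study compares.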
The effect of the tensor term in the Skyrme interaction has been estimated by calculating static and dynamic nuclear properties for nuclei in the sd and fp shell-model spaces. The nuclear shell gaps have been studied with different Skyrme parameterizations: Skxta and Skxtb with the tensor interaction; SkX, SkM, and SLy4 without it; and Skxcsb, which accounts for the effect of charge symmetry breaking. We have examined the stability of the N = 28 shell closure for 42Si and 48Ca. The results show that the magicity disappears at the shell closure of 42Si. Furthermore, excitation energy, quadrupole deformation, neutron separation energy, pairing energy, and density profile have also been calculated. Quadrupole deformation indicates a …
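For reference, the neutron separation energy mentioned above is obtained from binding-energy differences; the standard definitions (not specific to this paper) are:

```latex
S_n(Z, N)    = B(Z, N) - B(Z, N-1)
S_{2n}(Z, N) = B(Z, N) - B(Z, N-2)
```

A sharp drop in $S_n$ beyond a given neutron number is one of the usual signatures of a shell closure, which is why this quantity is used alongside the shell gaps to test the persistence of the N = 28 magic number.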
Cloud storage provides scalable, low-cost resources that exploit economies of scale through a cross-user architecture. As the amount of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. Data storage is the most important cloud service. To protect the privacy of data owners, data are stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication, which is crucial for data storage: traditional deduplication schemes cannot work on encrypted data, and existing solutions for encrypted-data deduplication suffer from security weaknesses. This paper proposes combining compressive sensing and video deduplication to maximize …
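The core difficulty named above — ordinary encryption makes identical files look different, defeating deduplication — is classically addressed by convergent encryption, where the key is derived from the plaintext itself so that identical chunks always encrypt to identical ciphertexts. The sketch below is a toy illustration of that idea (the XOR keystream is a simplification, not a production cipher, and this is not the paper's compressive-sensing scheme).

```python
import hashlib

def convergent_encrypt(plaintext: bytes) -> tuple[bytes, str]:
    """Toy convergent encryption: the key is the hash of the plaintext,
    so identical chunks yield identical ciphertexts and the cloud can
    deduplicate without ever seeing the data."""
    key = hashlib.sha256(plaintext).digest()
    # Derive a keystream by hashing key || counter blocks.
    stream = b""
    counter = 0
    while len(stream) < len(plaintext):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    ciphertext = bytes(p ^ s for p, s in zip(plaintext, stream))
    # The tag is what the server compares to detect duplicate uploads.
    tag = hashlib.sha256(ciphertext).hexdigest()
    return ciphertext, tag

c1, t1 = convergent_encrypt(b"same video chunk")
c2, t2 = convergent_encrypt(b"same video chunk")
c3, t3 = convergent_encrypt(b"different chunk!")
print(t1 == t2, t1 == t3)  # duplicates share a tag; distinct data do not
```

The known security weakness of this approach (an attacker who can guess the plaintext can confirm the guess) is one of the issues schemes like the paper's aim to mitigate.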
An image retrieval system is a computer system for browsing, searching, and retrieving images from a large database of digital images. The objective of Content-Based Image Retrieval (CBIR) methods is essentially to extract, from large image databases, a specified number of images similar in visual and semantic content to a so-called query image. The researchers developed a new retrieval mechanism based on two procedures. The first relies on extracting the statistical features of the original image using the histogram and statistical characteristics (mean, standard deviation). The second relies on the T- …
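The first procedure can be sketched directly: build a feature vector from a normalized intensity histogram plus the mean and standard deviation, then rank database images by distance to the query's vector. The tiny 2×2 "images" below are synthetic stand-ins, and Euclidean distance is an assumed similarity measure (the abstract does not state which one the authors used).

```python
import statistics

def features(image, bins=8):
    """Feature vector for a grayscale image (pixel values 0-255):
    normalized histogram plus mean and standard deviation."""
    pixels = [p for row in image for p in row]
    hist = [0] * bins
    for p in pixels:
        hist[min(p * bins // 256, bins - 1)] += 1
    n = len(pixels)
    return [h / n for h in hist] + [statistics.mean(pixels) / 255,
                                    statistics.pstdev(pixels) / 255]

def distance(f1, f2):
    # Euclidean distance between feature vectors; smaller = more similar.
    return sum((a - b) ** 2 for a, b in zip(f1, f2)) ** 0.5

# Tiny synthetic "images": the query should match the dark image.
query = [[10, 20], [30, 40]]
dark  = [[15, 25], [35, 45]]
light = [[200, 210], [220, 230]]

fq = features(query)
print(distance(fq, features(dark)) < distance(fq, features(light)))
```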
This research considers the linear mixed model, one of the most important and widely used models in many applications, which is widely used to analyze longitudinal data characterized by repeated measurements. The linear mixed model is estimated by two methods (parametric and nonparametric), which are used to estimate the conditional mean and the marginal mean of the model. A number of models are compared to obtain the best model for representing mean wind speed in Iraq. The application concerns 8 meteorological stations in Iraq that were selected randomly; monthly wind-speed data were taken over ten years and averaged over each month of the corresponding year, so we g…
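The marginal/conditional distinction above can be sketched with a random-intercept model: the marginal mean uses only the fixed part, while the conditional mean for a given station adds that station's (shrunken) random intercept. Everything below is simulated and illustrative; the true variance components are plugged in directly rather than estimated, which a real fit would do.

```python
import random
import statistics

random.seed(3)

# Simulated monthly wind speed (m/s) for three stations sharing an
# overall mean, each with its own random intercept (values illustrative).
GRAND_MEAN = 4.0
intercepts = {"A": 1.2, "B": -0.8, "C": 0.3}
data = {s: [GRAND_MEAN + b + random.gauss(0, 0.5) for _ in range(120)]
        for s, b in intercepts.items()}

# Marginal mean: averages over stations, i.e. the fixed part only.
marginal = statistics.mean(x for xs in data.values() for x in xs)

# Conditional mean for station A: marginal mean plus its random
# intercept, estimated by shrinking the station mean toward the overall
# mean (the classic BLUP weight for a random-intercept model).
n = 120
var_e = 0.25                                         # residual variance
var_b = statistics.pvariance(list(intercepts.values()))  # intercept variance
w = var_b / (var_b + var_e / n)
cond_A = marginal + w * (statistics.mean(data["A"]) - marginal)

print(f"marginal mean: {marginal:.2f}, conditional mean (station A): {cond_A:.2f}")
```

With 120 observations per station the shrinkage weight is close to 1, so the conditional mean nearly equals the station's own mean; with few observations it would be pulled strongly toward the marginal mean.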
Goodness-of-fit tests are intended to verify the null hypothesis that the observations of a sample under study conform to a particular probability distribution. Such cases arise frequently in practical applications in all fields, especially in genetics, medical, and life-sciences research. In 1965, Shapiro and Wilk proposed the intuitive goodness-of-fit test with scale parameters …
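The Shapiro-Wilk test introduced above is available off the shelf; a minimal sketch using SciPy's implementation (the data here are simulated, not from the study) shows the behavior of the W statistic on a normal versus a skewed sample:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

normal_data = rng.normal(loc=50, scale=10, size=100)
skewed_data = rng.exponential(scale=10, size=100)

# Shapiro-Wilk tests the null hypothesis that the sample comes from a
# normal distribution; W close to 1 supports normality.
w_norm, p_norm = stats.shapiro(normal_data)
w_skew, p_skew = stats.shapiro(skewed_data)
print(f"normal sample: W={w_norm:.3f}, p={p_norm:.3f}")
print(f"skewed sample: W={w_skew:.3f}, p={p_skew:.3g}")
```

The exponential sample yields a markedly lower W and a tiny p-value, so normality is rejected for it but not for the genuinely normal sample.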
Reinforcing asphalt concrete with polyester fibers is considered an effective remedy to alleviate the harmful impact of fatigue deterioration. This study investigates two fiber sizes, 6.35 mm by 3.00 mm and 12.70 mm by 3.00 mm, each at concentrations of 0.25 %, 0.50 % and 0.75 % by weight of mixture. The asphalt mixture consists of the optimum content of (40-50) asphalt cement and a 12.50 mm nominal maximum aggregate size, with limestone dust as filler. Following the traditional asphalt cement and aggregate tests, three essential tests were carried out on the mixtures: the Marshall test (105 cylindrical specimens), the indirect tensile strength test (21 cylindrical specimens) …
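For the indirect tensile strength test mentioned above, the strength is conventionally computed from the peak diametral load and the specimen geometry as ITS = 2P / (π t D). The sketch below assumes that standard formula and a hypothetical Marshall-size specimen; the numbers are not from the study.

```python
import math

def indirect_tensile_strength(peak_load_n, thickness_mm, diameter_mm):
    """ITS in MPa (N/mm^2) for a cylindrical specimen loaded across
    its diameter: ITS = 2 * P / (pi * t * D)."""
    return (2 * peak_load_n) / (math.pi * thickness_mm * diameter_mm)

# Hypothetical specimen: 101.6 mm diameter, 63.5 mm thick, 12 kN peak load.
its = indirect_tensile_strength(12_000, 63.5, 101.6)
print(f"ITS = {its:.3f} MPa")
```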