This article presents the results of an experimental investigation of using carbon fiber–reinforced polymer sheets to enhance the behavior of reinforced concrete deep beams with large web openings in shear spans. A set of 18 specimens was fabricated and tested up to failure to evaluate the structural performance in terms of cracking, deformation, and load-carrying capacity. All tested specimens were 1500 mm long, 500 mm deep, and 150 mm wide. The parameters studied were the opening size, the opening location, and the strengthening factor. Two deep beams served as control specimens without openings and without strengthening. Eight deep beams were fabricated with openings but without strengthening, while the other eight deep beams had openings in the shear spans with carbon fiber–reinforced polymer sheet strengthening around the opening zones. The opening size was 200 × 200 mm in eight deep beams and 230 × 230 mm in the other eight specimens. In eight specimens the opening was located at the center of the shear span, while in the other eight beams the opening was attached to the interior edge of the shear span. Carbon fiber–reinforced polymer sheets were installed around the openings to compensate for the cut-out area of concrete. The experimental results showed that creating openings in the shear spans affects the load-carrying capacity: the reduction in failure load for specimens with openings but without strengthening reached up to 66% compared to deep beams without openings. On the other hand, strengthening the beams with openings by carbon fiber–reinforced polymer sheets increased the failure load by 20%–47% compared with the identical deep beams without strengthening. A significant contribution of the carbon fiber–reinforced polymer sheets in restricting the deformability of the deep beams was observed.
The artificial neural network (ANN) methodology is an important and relatively new approach that builds models for analysis, data evaluation, forecasting, and control without depending on a pre-specified model or a classical statistical method to describe the behavior of the statistical phenomenon. The methodology works by simulating the data to reach a robust, optimum model that represents the phenomenon, and the resulting model can then be used at any time and in any state. The Box-Jenkins (ARMAX) approach was used for comparison. This paper depends on the received power to build a robust model for forecasting, analyzing, and controlling that power; the received power comes from
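As a hedged sketch of the Box-Jenkins (ARMAX) comparison model mentioned above, the snippet below fits an ARMAX(1, 1) model to a synthetic received-power series with one exogenous input using statsmodels; the series, the model orders, and the variable names are illustrative assumptions, not the paper's data or final model.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Synthetic stand-in for the received-power series and one exogenous input
# (the paper's measurements are not reproduced here).
rng = np.random.default_rng(42)
n = 300
exog = rng.normal(size=n)
power = np.zeros(n)
for t in range(1, n):
    power[t] = 0.7 * power[t - 1] + 0.5 * exog[t] + rng.normal(scale=0.2)

# ARMAX(1, 1): AR and MA orders of 1 plus the exogenous regressor.
model = SARIMAX(power, exog=exog, order=(1, 0, 1))
result = model.fit(disp=False)

# One-step-ahead forecast given the next exogenous value.
print("next-step forecast:", result.forecast(steps=1, exog=[[0.0]]))
```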
The process of risk assessment in build-operate-transfer (BOT) projects is very important for identifying and analyzing risks in order to make the appropriate decision on how to respond to them. In this paper, the analytic hierarchy process (AHP) technique was used to make the appropriate decision regarding the response to the most prominent risks generated in BOT projects. This involves a comparison between the criteria for each risk as well as the available alternatives, using matrix-based mathematical methods to reach an appropriate response decision for each risk. Ten common risks in BOT contracts, grouped into six main risk headings, are adopted for analysis in this paper. The procedures followed in this paper are the questionnaire method
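To illustrate the matrix-based AHP step described above, the minimal sketch below derives priority weights and a consistency ratio from a single pairwise comparison matrix; the 3 × 3 size and the matrix entries are illustrative assumptions, not the questionnaire results from the paper.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for one risk's response
# alternatives on Saaty's 1-9 scale (illustrative values only).
A = np.array([
    [1.0, 3.0, 5.0],
    [1.0 / 3.0, 1.0, 2.0],
    [1.0 / 5.0, 1.0 / 2.0, 1.0],
])

# Priority weights from the principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
n = A.shape[0]
lambda_max = eigvals[k].real
CI = (lambda_max - n) / (n - 1)
RI = 0.58  # Saaty's random index for n = 3
CR = CI / RI

print("priority weights:", np.round(w, 3), " consistency ratio:", round(CR, 3))
```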
Groupwise non-rigid image alignment is a difficult non-linear optimization problem involving many parameters and often large datasets. Previous methods have explored various metrics and optimization strategies. Good results have been previously achieved with simple metrics, requiring complex optimization, often with many unintuitive parameters that require careful tuning for each dataset. In this chapter, the problem is restructured to use a simpler, iterative optimization algorithm, with very few free parameters. The warps are refined using an iterative Levenberg-Marquardt minimization to the mean, based on updating the locations of a small number of points and incorporating a stiffness constraint. This optimization approach is eff
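A minimal sketch of the iterate-to-the-mean idea with a Levenberg-Marquardt point update and a stiffness penalty is given below. It is a toy 1-D analogue on synthetic data; the sizes, the stiffness weight, and the residual formulation are assumptions for illustration, not the chapter's actual groupwise formulation.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
K, N = 10, 5          # control points per example, number of examples (toy sizes)
LAMBDA = 1.0          # stiffness weight on neighbouring displacement differences

# Synthetic 1-D "shapes": a shared template plus random offsets.
template = np.linspace(0.0, 1.0, K)
shapes = template + rng.normal(scale=0.05, size=(N, K))

def residuals(d, shape, mean):
    # Data term: displaced points should match the current group mean.
    data = (shape + d) - mean
    # Stiffness term: neighbouring displacements should be similar.
    stiffness = LAMBDA * np.diff(d)
    return np.concatenate([data, stiffness])

displacements = np.zeros((N, K))
for _ in range(5):                  # alternate: refine each example, update the mean
    mean = (shapes + displacements).mean(axis=0)
    for i in range(N):
        sol = least_squares(residuals, displacements[i],
                            args=(shapes[i], mean), method="lm")
        displacements[i] = sol.x

aligned = shapes + displacements
print("point spread about the mean:", float(np.std(aligned - aligned.mean(axis=0))))
```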
Permeability data is of major importance and should be handled carefully in all reservoir simulation studies. The importance of permeability data increases in mature oil and gas fields due to its sensitivity to the requirements of some specific improved-recovery methods. However, the industry holds a huge stock of air permeability measurements against only a small number of liquid permeability values. This is due to the relatively high cost of special core analysis.
The current study suggests a correlation to convert air permeability data, which are conventionally measured during laboratory core analysis, into liquid permeability. This correlation provides a feasible estimate in cases of data from loose and poorly consolidated formations, or in cas
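As a hedged illustration of building such an air-to-liquid permeability correlation, the snippet below fits an assumed power-law form to hypothetical paired core measurements; both the functional form and the numbers are placeholders, since the study's actual correlation and data are not reproduced here.

```python
import numpy as np

# Hypothetical paired core measurements in millidarcy; illustrative
# placeholders, not the study's core data.
k_air = np.array([5.0, 12.0, 30.0, 75.0, 150.0, 320.0, 600.0])
k_liq = np.array([2.1, 6.0, 17.0, 48.0, 105.0, 240.0, 470.0])

# Assume a power-law form k_liq = a * k_air**b (one plausible choice;
# the study's correlation may differ) and fit it by least squares in
# log-log space.
b, log_a = np.polyfit(np.log(k_air), np.log(k_liq), 1)
a = np.exp(log_a)

def liquid_permeability(k_air_md):
    """Estimate liquid permeability (mD) from an air permeability reading."""
    return a * k_air_md ** b

print(f"k_liq ~ {a:.3f} * k_air^{b:.3f}; k_air = 100 mD -> {liquid_permeability(100.0):.1f} mD")
```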
Introduction: Although the soap industry has been known for hundreds of years, it has seen relatively little development. That development has mainly concerned the mechanical equipment and the additive materials necessary to produce soap with the best specifications of shape and physical and chemical properties. Objectives: This research studies the use of the vacuum reactive distillation (VRD) technique for soap production. Methods: Olein and palmitin in a ratio of 3 to 1 were mixed in a flask with NaOH solution in a stoichiometric amount under different vacuum pressures from -0.35 to -0.5 bar. Total conversion was reached by using the VRD technique. The soap produced by the VRD method was compared with soap prepared by the reaction-only method, which
Dust is a common cause of health risks and also a driver of climate change, one of the most threatening problems facing humans. In the recent decade, climate change in Iraq, typified by increased droughts and expanding deserts, has generated numerous environmental issues. This study forecasts dust in five central Iraqi districts using a supervised machine learning framework with five regression algorithms. It was assessed using a dataset from the Iraqi Meteorological Organization and Seismology (IMOS). Simulation results show that the gradient boosting regressor (GBR) has a mean square error of 8.345 and a total accuracy ratio of 91.65%. Moreover, the results show that the decision tree (DT), where the mean square error is 8.965, c
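For readers unfamiliar with the regression setup described above, the sketch below trains a gradient boosting regressor and reports its mean square error on synthetic stand-in features; the feature names, data, and hyperparameters are assumptions for illustration and will not reproduce the reported 8.345 MSE.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the IMOS meteorological features (e.g. temperature,
# humidity, wind speed, pressure) and a dust-concentration target.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=0.5, size=1000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

gbr = GradientBoostingRegressor(n_estimators=200, learning_rate=0.05, random_state=0)
gbr.fit(X_train, y_train)

mse = mean_squared_error(y_test, gbr.predict(X_test))
print(f"GBR test mean square error: {mse:.3f}")
```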
The nurse scheduling problem is a combinatorial optimization problem and is NP-hard, which makes it difficult to solve to optimality. In this paper, a proposed hybrid simulated annealing algorithm was created to solve the nurse scheduling problem, built by developing the simulated annealing algorithm and the genetic algorithm. The proposed hybrid simulated annealing algorithm (GS-h) is the best method among those used in this paper because it achieved the minimum average total cost and the maximum number of solved, best, and optimal problems. The optimal-solution ratios are 77% for the proposed algorithm (GS-h) and 28.75% for Si
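The paper's GS-h hybrid is not specified here, so the sketch below shows only the plain simulated annealing core on a toy nurse roster; the problem size, the cost penalties, and the cooling schedule are illustrative assumptions rather than the authors' formulation.

```python
import math
import random

random.seed(1)

NURSES, DAYS, SHIFTS = 6, 7, 3    # toy problem size (illustrative only)
REQUIRED = 2                      # nurses required on each shift

def random_schedule():
    # schedule[n][d] = shift index 0..2, or -1 for a day off
    return [[random.choice([-1, 0, 1, 2]) for _ in range(DAYS)] for _ in range(NURSES)]

def cost(schedule):
    # Penalize under/over-coverage and nurses working more than 5 days a week.
    c = 0
    for d in range(DAYS):
        for s in range(SHIFTS):
            staffed = sum(1 for n in range(NURSES) if schedule[n][d] == s)
            c += abs(staffed - REQUIRED)
    for n in range(NURSES):
        worked = sum(1 for d in range(DAYS) if schedule[n][d] != -1)
        c += 2 * max(0, worked - 5)
    return c

def neighbour(schedule):
    # Reassign one nurse on one day.
    new = [row[:] for row in schedule]
    n, d = random.randrange(NURSES), random.randrange(DAYS)
    new[n][d] = random.choice([-1, 0, 1, 2])
    return new

current = random_schedule()
best = current
T = 10.0
while T > 0.01:
    candidate = neighbour(current)
    delta = cost(candidate) - cost(current)
    if delta <= 0 or random.random() < math.exp(-delta / T):
        current = candidate
        if cost(current) < cost(best):
            best = current
    T *= 0.995      # geometric cooling

print("best schedule cost found:", cost(best))
```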
Ferritin is a key mediator of immune dysregulation, particularly under extreme hyperferritinemia, through direct immune-suppressive and pro-inflammatory effects. We conclude that there is a significant association between ferritin levels and the severity of COVID-19. In this paper we introduce a semi-parametric prediction method that combines neural network (NN) and regression models; thus, two methodologies, the neural network and the regression model, are adopted in designing the model. The data were collected from Dar Al-Tamreed Private Hospital (مستشفى دار التمريض الخاص) for the period 11/7/2021-23/7/2021 and cover 100 persons: of the 50 COVID-19 patients, 12 were female and 38 male, while of the 50 non-COVID persons, 26 were female and 24 male. The input variables of the NN m
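The exact way the NN and regression parts are combined is not stated in the excerpt, so the sketch below shows one plausible semi-parametric combination, a linear regression plus a small neural network fitted to its residuals, on synthetic stand-in data; the variables, sizes, and this particular combination scheme are all assumptions.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in for the patient data (e.g. ferritin level and age
# predicting a severity score); the hospital data are not reproduced here.
rng = np.random.default_rng(3)
X = rng.normal(size=(100, 2))
y = 1.8 * X[:, 0] + np.sin(3.0 * X[:, 1]) + rng.normal(scale=0.2, size=100)

# Parametric part: an ordinary linear regression fitted to the response.
lin = LinearRegression().fit(X, y)

# Non-parametric part: a small neural network fitted to the residuals,
# one plausible way of combining the two model classes.
nn = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
nn.fit(X, y - lin.predict(X))

def predict(X_new):
    # Combined semi-parametric prediction: regression trend + NN correction.
    return lin.predict(X_new) + nn.predict(X_new)

print("combined predictions for the first 3 records:", np.round(predict(X[:3]), 2))
```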
The application of ultrafiltration (UF) and nanofiltration (NF) processes in the treatment of raw produced water has been investigated in the present study. Experiments for both the ultrafiltration and nanofiltration processes were performed in a laboratory unit operated in a cross-flow pattern. Various types of hollow fiber membranes were utilized in this study, such as a polyvinyl chloride (PVC) UF membrane, two different polyethersulfone (PES) NF membranes, and a polyphenylsulfone (PPSU) NF membrane. It was found that the turbidity removal of the treated water is higher than 95% using the UF and NF membranes. The chemical oxygen demand (COD, 160 mg/L) and oil content (26.8 mg/L) found after treatment were according to the allowable limits set
In this paper, the reliability and maintenance scheduling of some medical devices were estimated using one variable, the time variable (failure times), on the assumption that the time variable for all devices follows the same distribution (the Weibull distribution).
The distribution parameters for each device were estimated by the ordinary least squares (OLS) method.
The main objective of this research is to determine the optimal time for preventive maintenance of medical devices. Two methods were adopted to estimate the optimal preventive maintenance time. The first method depends on the maintenance schedule, relying on information on the cost of maintenance and the cost of stopping work and acc
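As a hedged sketch of OLS estimation of Weibull parameters from failure times, the snippet below applies median-rank regression to the linearized Weibull CDF; the failure times shown are illustrative placeholders, not the paper's device data.

```python
import numpy as np

# Hypothetical failure times (hours) for one device; illustrative placeholders.
t = np.sort(np.array([120.0, 340.0, 410.0, 560.0, 700.0, 860.0, 1020.0, 1300.0]))
n = len(t)

# Median ranks (Bernard's approximation) for the empirical CDF F(t_i).
ranks = np.arange(1, n + 1)
F = (ranks - 0.3) / (n + 0.4)

# Linearize the Weibull CDF: ln(-ln(1 - F)) = beta * ln(t) - beta * ln(eta),
# then fit the line by ordinary least squares.
x = np.log(t)
y = np.log(-np.log(1.0 - F))
beta, intercept = np.polyfit(x, y, 1)
eta = np.exp(-intercept / beta)

print(f"estimated shape beta = {beta:.3f}, scale eta = {eta:.1f} hours")
```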