This research investigates the pre- and post-cracking resistance of steel fiber-reinforced concrete specimens with Glass Fiber Reinforced Polymer (GFRP) bars subjected to flexural loading. The purpose is to improve the ductility and cracking resistance of GFRP-reinforced beams, which are prone to early cracking and excessive deflections caused by the low modulus of elasticity of GFRP. Six self-compacting concrete specimens (1500×240×200 mm), incorporating steel fibers of two lengths (25 mm and 40 mm) with varying distribution depths, were tested to assess their structural performance. The results indicate significant enhancements in cracking resistance, stiffness, energy absorption, ductility, and flexural strength. Beams reinforced with 40 mm-long steel fibers exhibited a 23.9%–24.2% increase in ultimate moment capacity relative to the steel-reinforced specimens, whereas those with 25 mm fibers showed smaller increases (2.7%–3.1%). Cracking resistance improved by up to 33.3% in beams with 40 mm-long fibers and by 16.67%–20% in those with 25 mm-long fibers, relative to a non-fibrous GFRP specimen. Additionally, the inclusion of 40 mm hooked-end steel fibers significantly enhanced ultimate deflection, with peak deflections increasing by 30.2%–44.8% compared to steel-reinforced beams. Fibrous GFRP-reinforced beams exhibited up to 154% higher energy absorption at ultimate load than a non-fibrous GFRP beam. All fibrous GFRP-reinforced beams achieved deformation-based ductility indices between 4.2 and 6.9, exceeding the minimum threshold of 4 for adequate deformability. These findings confirm that incorporating 40 mm steel fibers significantly improves the structural behavior of GFRP-reinforced concrete specimens, offering valuable insights for optimizing their design.
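The energy-absorption and deformation-based ductility measures reported above can be illustrated with a short sketch. The load–deflection values, the assumed yield point, and the units are hypothetical placeholders, not data from the study:

```python
import numpy as np

# Hypothetical load-deflection curve for a beam flexural test
# (illustrative values only, not measurements from the study).
deflection = np.array([0.0, 2.0, 5.0, 10.0, 20.0, 35.0])  # mm
load = np.array([0.0, 30.0, 55.0, 70.0, 78.0, 80.0])      # kN

# Energy absorption = area under the load-deflection curve (trapezoidal rule).
energy = float(np.sum((load[:-1] + load[1:]) / 2 * np.diff(deflection)))  # kN*mm

# Deformation-based ductility index: ultimate deflection / yield deflection.
yield_deflection = 5.0               # mm, assumed yield point
ultimate_deflection = float(deflection[-1])
ductility_index = ultimate_deflection / yield_deflection

print(f"Energy absorption: {energy:.1f} kN*mm")
print(f"Ductility index: {ductility_index:.1f}")
```

With these placeholder numbers the beam would comfortably exceed the deformability threshold of 4 mentioned in the abstract.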
The performance efficiency of railway lines is evaluated through a set of indicators and criteria, the most important being transport density, employee productivity, passenger-vehicle production, freight-wagon productivity, and locomotive productivity. This study attempts to estimate the most important of these indicators, the transport-density index, from the other four productivity indicators using artificial neural network techniques. Two neural-network packages were used in this study, Simulnet and Neuframe, and the results of the second were adopted. The training and test data for the neural network used in the study were obtained from the international in
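The prediction task described above, estimating one index from four input indicators with a neural network, can be sketched with a minimal one-hidden-layer network trained by gradient descent. The data here are synthetic stand-ins, not the railway data used in the study, and the network size and learning rate are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: four productivity indicators -> transport density.
X = rng.random((200, 4))
true_w = np.array([0.4, 0.3, 0.2, 0.1])
y = X @ true_w + 0.01 * rng.standard_normal(200)

# One hidden layer with tanh activation, trained by plain gradient descent.
W1 = rng.standard_normal((4, 8)) * 0.5
b1 = np.zeros(8)
W2 = rng.standard_normal(8) * 0.5
b2 = 0.0
lr = 0.1

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

_, pred0 = forward(X)
initial_loss = float(np.mean((pred0 - y) ** 2))

for _ in range(500):
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y
    # Backpropagate the mean-squared-error gradient through both layers.
    grad_W2 = h.T @ err / len(y)
    grad_b2 = err.mean()
    dh = np.outer(err, W2) * (1 - h ** 2)
    grad_W1 = X.T @ dh / len(y)
    grad_b1 = dh.mean(axis=0)
    W2 -= lr * grad_W2; b2 -= lr * grad_b2
    W1 -= lr * grad_W1; b1 -= lr * grad_b1

_, pred = forward(X)
final_loss = float(np.mean((pred - y) ** 2))
print(f"MSE: {initial_loss:.4f} -> {final_loss:.4f}")
```

Packages such as Simulnet and Neuframe wrap essentially this training loop behind a graphical interface.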
Objective: To assess the impact of social support for pregnant women on their pregnancy outcomes. Methodology: A descriptive purposive study was used to assess the impact of social support on pregnancy outcomes. The study was conducted from 22 September 2020 to 15 February 2021. A non-probability (purposive) sample of 100 women was selected. Data were collected through interviews with the mothers in the counseling clinic during the third trimester of pregnancy, as well as after childbirth in the labor wards to assess the outcome of pregnancy. Data were analyzed through descriptive statistics (frequencies and percentages). Results: The most important finding of this study was the positive pregnancy outcome
Tax reconciliation is one of the legal methods used by the financial authority in Iraq, which is carried out with the taxpayer
The research addressed the weakness of tax revenues, which has many causes, including tax evasion. This prompted a search for ways to reduce evasion and thereby increase tax revenue, and tax reconciliation is one of these means.
The research proceeded from the premise that broader use of tax reconciliation would curb tax evasion by taxpayers.
The researchers drew on a range of previous studies and research, books, and other sources related to the research topic; this was done through the theoretical framework and the practical aspect, which included the fin
Albizia lebbeck biomass was used as an adsorbent material in the present study to remove methyl red dye from an aqueous solution. A central composite rotatable design model was used to predict the dye removal efficiency. The optimization was accomplished under a temperature and mixing control system (37 °C) with different particle sizes of 300 and 600 µm. The highest adsorption efficiencies were obtained at lower dye concentrations and lower adsorbent weights. Adsorption times of more than 48 h were found to have a negative effect on the removal efficiency due to secondary metabolite compounds. However, the adsorption time was found to have a positive effect at high dye concentrations and high adsorbent weights. The colour removal effi
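The central composite design mentioned above fits a second-order response-surface model to the experimental factors. A minimal sketch of such a fit via ordinary least squares is shown below; the two factors, their ranges, and the synthetic response values are illustrative assumptions, not the study's measured data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical factors: initial dye concentration (mg/L) and adsorbent mass (g).
conc = rng.uniform(10, 100, 30)
mass = rng.uniform(0.1, 1.0, 30)

# Synthetic removal efficiency (%) following a quadratic trend plus noise.
removal = (90 - 0.3 * conc - 20 * mass + 0.001 * conc**2
           + 8 * mass**2 + 0.05 * conc * mass
           + rng.normal(0, 0.5, 30))

# Second-order model used in central composite designs:
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
A = np.column_stack([np.ones_like(conc), conc, mass,
                     conc**2, mass**2, conc * mass])
coef, *_ = np.linalg.lstsq(A, removal, rcond=None)

pred = A @ coef
r2 = 1 - np.sum((removal - pred) ** 2) / np.sum((removal - removal.mean()) ** 2)
print(f"R^2 of fitted response surface: {r2:.3f}")
```

The fitted coefficients can then be used to locate the factor combination that maximizes predicted removal, which is the "optimization" step the abstract refers to.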
Solid-waste management, particularly of aluminum (Al), is a challenge being confronted around the world. It is therefore valuable to explore methods, such as recycling, that can minimize the exploitation of natural resources. In this study, the use of hazardous Al waste as the main electrodes in the electrocoagulation (EC) process for dye removal from wastewater was discussed. The EC process is considered one of the most efficient, promising, and cost-effective ways of handling various toxic effluents. The effects of current density (10, 20, and 30 mA/cm²), electrolyte concentration (1 and 2 g/L), and initial concentration of Brilliant Blue dye (15 and 30 mg/L) on
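In electrocoagulation, the current density governs how much aluminum the sacrificial anode releases, which Faraday's law estimates as m = I·t·M/(z·F). A short sketch follows; the electrode area and run time are illustrative assumptions, not parameters from the study:

```python
# Faraday's-law estimate of aluminium dissolved from the sacrificial anode
# during electrocoagulation: m = I * t * M / (z * F).

F = 96485.0   # Faraday constant, C/mol
M_AL = 26.98  # molar mass of Al, g/mol
Z = 3         # electrons transferred per Al3+ ion

def aluminium_dissolved(current_density_mA_cm2, area_cm2, time_s):
    """Theoretical mass of Al (g) released by the anode."""
    current_A = current_density_mA_cm2 * area_cm2 / 1000.0
    return current_A * time_s * M_AL / (Z * F)

# Example: 20 mA/cm^2 over an assumed 50 cm^2 electrode for 30 minutes.
mass_g = aluminium_dissolved(20, 50, 30 * 60)
print(f"Al dissolved: {mass_g:.3f} g")
```

This is why raising the current density from 10 to 30 mA/cm² proportionally increases the coagulant dose delivered to the effluent.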
When optimizing the performance of neural network-based chatbots, the choice of optimizer is one of the most important aspects. Optimizers control the adjustment of model parameters such as weights and biases to minimize a loss function during training. Adaptive optimizers such as Adam have become a standard choice and are widely used because their parameter-update magnitudes are invariant to variations in gradient scale, but they often pose generalization problems. Alternatively, Stochastic Gradient Descent (SGD) with momentum and AdamW, an extension of Adam, offer several advantages. This study aims to compare and examine the effects of these optimizers on the chatbot CST dataset. The effectiveness of each optimizer is evaluat
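The key difference of AdamW is that weight decay is decoupled from the gradient-based update rather than folded into the gradient as in Adam with L2 regularization. A minimal sketch of a single AdamW step on a toy quadratic loss is shown below; the hyperparameters are typical defaults, not values from the study:

```python
import numpy as np

def adamw_step(w, grad, m, v, t, lr=0.01, b1=0.9, b2=0.999,
               eps=1e-8, weight_decay=0.01):
    """One AdamW update (decoupled weight decay)."""
    m = b1 * m + (1 - b1) * grad          # first-moment EMA
    v = b2 * v + (1 - b2) * grad**2       # second-moment EMA
    m_hat = m / (1 - b1**t)               # bias corrections
    v_hat = v / (1 - b2**t)
    # Weight decay is applied directly to the weights, not via the gradient.
    w = w - lr * (m_hat / (np.sqrt(v_hat) + eps) + weight_decay * w)
    return w, m, v

# Toy problem: minimize ||w - target||^2.
target = np.array([1.0, -2.0, 0.5])
w = np.zeros(3)
m = np.zeros(3)
v = np.zeros(3)
start = float(np.sum((w - target) ** 2))
for t in range(1, 2001):
    grad = 2 * (w - target)
    w, m, v = adamw_step(w, grad, m, v, t)
end = float(np.sum((w - target) ** 2))
print(f"loss {start:.3f} -> {end:.5f}")
```

Plain Adam with L2 regularization would instead add `weight_decay * w` to `grad` before the moment updates, which scales the decay by the adaptive learning rate and is what AdamW avoids.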
This paper assesses the impact of changes and fluctuations in bank deposits on the money supply in Iraq. The research constructs an Error Correction Model (ECM) using monthly time series data from 2010 to 2015. The analysis begins with the Phillips-Perron unit root test to ascertain the stationarity of the time series and the Engle-Granger cointegration test to examine the existence of a long-term relationship. Nonparametric regression functions are estimated using two methods: smoothing spline and M-smoothing. The results indicate that the M-smoothing approach is the most effective, achieving the shortest adjustment period and the highest adjustment ratio for short-term disturbances, thereby facilitating a return
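Nonparametric regression of the kind compared above fits a smooth curve through the data without assuming a functional form. As a simple stand-in for the smoothing-spline and M-smoothing estimators, the sketch below uses Nadaraya-Watson kernel regression on synthetic data; the series and bandwidth are illustrative, not the study's deposit data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic monthly series: a noisy nonlinear relationship standing in
# for the deposits-money-supply data (72 months, 2010-2015).
x = np.linspace(0, 1, 72)
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(72)

def kernel_smooth(x_train, y_train, x_eval, bandwidth=0.05):
    """Nadaraya-Watson kernel regression with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / bandwidth) ** 2)
    return (w * y_train).sum(axis=1) / w.sum(axis=1)

fitted = kernel_smooth(x, y, x)
mse_raw = float(np.mean((y - np.sin(2 * np.pi * x)) ** 2))
mse_fit = float(np.mean((fitted - np.sin(2 * np.pi * x)) ** 2))
print(f"MSE vs true curve: raw {mse_raw:.4f}, smoothed {mse_fit:.4f}")
```

A smoothing spline would replace the kernel weights with a penalized least-squares fit, and M-smoothing would additionally downweight outlying residuals, but all three produce a smooth estimate of the regression function in the same spirit.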
In this study, the Bessel function of the first kind was used to solve Kepler's equation for a satellite in an elliptical orbit. It is a classical method that gives a direct solution for calculating the eccentric anomaly. The equation was solved over one period (M = 0–360°) with eccentricities of e = 0–1 and numbers of terms N = 1–10. The error in the representation of the first-kind Bessel function was also calculated. The results indicated that for eccentricities of 0.1–0.4 and N = 1–10, the eccentric anomaly values agreed well with the exact solution. Moreover, the obtained eccentric anomaly values were unaffected by increasing the number of terms (N = 6–10) for eccentricities of 0.8 and 0.9. The Bessel
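The classical Bessel-series solution referred to above expands the eccentric anomaly as E = M + Σₙ (2/n)·Jₙ(ne)·sin(nM). A minimal sketch, using SciPy's Bessel function and a moderate eccentricity where the series converges quickly, is:

```python
import numpy as np
from scipy.special import jv  # Bessel function of the first kind, J_n

def eccentric_anomaly(M, e, N=10):
    """Bessel-series solution of Kepler's equation:
    E = M + sum_{n=1}^{N} (2/n) * J_n(n*e) * sin(n*M)."""
    E = M
    for n in range(1, N + 1):
        E = E + (2.0 / n) * jv(n, n * e) * np.sin(n * M)
    return E

# Example: mean anomaly 60 deg, eccentricity 0.2, ten series terms.
M = np.radians(60.0)
e = 0.2
E = eccentric_anomaly(M, e, N=10)

# Check against Kepler's equation M = E - e*sin(E).
residual = abs((E - e * np.sin(E)) - M)
print(f"E = {np.degrees(E):.4f} deg, residual = {residual:.2e}")
```

Because the series coefficients shrink roughly geometrically for small e, ten terms suffice at low eccentricities, while near e = 0.8–0.9 the coefficients decay so slowly that adding terms beyond N = 6 changes little, consistent with the abstract's observation.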