Eco-friendly concrete is produced from the waste streams of many industries. It alleviates concerns about the energy consumption, raw material demand, and production cost of conventional concrete. Several stress-strain models documented in the literature can be used to estimate the ultimate strength of fiber-reinforced concrete components. Unfortunately, there is a lack of data on how non-metallic fibers, such as polypropylene (PP), affect the properties of concrete, especially eco-friendly concrete. This study presents a novel approach to modeling the stress-strain behavior of eco-friendly polypropylene fiber-reinforced concrete (PFRC) using meta-heuristic particle swarm optimization (PSO) applied to 26 different PFRC mixtures. To make the concrete eco-friendly, cement was partially replaced by ground granulated blast furnace slag (GGBFS) at various replacement levels, and the concrete was reinforced with several dosages of PP fiber. Beams and cylinders made from PFRC were examined to assess their performance. The research contributes valuable insights to eco-friendly concrete design by integrating an industrial byproduct (GGBFS) and non-metallic fibers, in line with sustainable construction trends. The study demonstrates that adding sustainable fibers to concrete improves its structural integrity while lessening its environmental impact. Experimental testing validates the proposed model, showing close agreement between the predicted and measured stress-strain behavior. In terms of absolute relative error (ARE), the proposed model achieves the highest frequency of predictions with ARE ≤ 5 % and the lowest frequency with ARE > 15 % across the dataset. The model demonstrates promising accuracy (R-value = 0.9975) and highlights the effectiveness of PSO in parameter optimization. Additionally, replacing OPC with GGBFS reduced CO2 emissions by up to 42 %. A comparative analysis against existing models confirmed the excellent predictive accuracy of the proposed model.
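As an illustration of the optimization approach mentioned above, the following minimal Python sketch calibrates the parameters of an empirical stress-strain curve against measured points using a basic particle swarm. The Popovics-type curve form, the synthetic data, and all bounds and swarm settings are assumptions made for demonstration; they are not taken from the study.

```python
# Illustrative sketch: calibrating an empirical stress-strain curve with
# particle swarm optimization (PSO). The curve form (a Popovics-type
# relation), the synthetic data, and all parameter bounds are assumptions
# made for demonstration; they are not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "measured" stress-strain points (strain, stress in MPa).
strain = np.linspace(0.0005, 0.004, 20)
stress_meas = 40.0 * (strain / 0.002) * 2.2 / (2.2 - 1 + (strain / 0.002) ** 2.2)

def stress_model(eps, fc, eps0, n):
    """Popovics-type stress-strain relation with parameters fc, eps0, n."""
    x = eps / eps0
    return fc * x * n / (n - 1 + x ** n)

def objective(params):
    """Sum of squared errors between model and measured stresses."""
    fc, eps0, n = params
    return np.sum((stress_model(strain, fc, eps0, n) - stress_meas) ** 2)

# Basic PSO with inertia weight and cognitive/social terms.
n_particles, n_iter = 30, 200
lo, hi = np.array([20.0, 0.001, 1.5]), np.array([80.0, 0.004, 4.0])
pos = rng.uniform(lo, hi, size=(n_particles, 3))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([objective(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()

w, c1, c2 = 0.7, 1.5, 1.5
for _ in range(n_iter):
    r1, r2 = rng.random((n_particles, 3)), rng.random((n_particles, 3))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([objective(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("fitted (fc, eps0, n):", gbest)
```

In practice, the measured strain-stress pairs for each PFRC mixture would replace the synthetic arrays above.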
This paper addresses the problem of online autonomous mobile robot path planning, which consists of finding optimal paths or trajectories for an autonomous mobile robot from a starting point to a destination across a flat terrain map represented by a 2-D workspace. An enhanced algorithm for solving the path-planning problem using the Bacterial Foraging Optimization (BFO) algorithm is presented. This nature-inspired metaheuristic, which imitates the foraging behavior of E. coli bacteria, was used to find the optimal path from a starting point to a target point. The proposed algorithm was demonstrated through simulations in different static and dynamic environments. A comparative study was carried out between the developed algorithm …
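For illustration only, the sketch below shows the chemotaxis core of Bacterial Foraging Optimization applied to a single planning step on a 2-D map: bacteria (candidate positions) tumble and swim toward lower cost, where the cost combines distance to the goal with a penalty near a circular obstacle. The map, cost weights, and BFO parameters are assumptions, and the reproduction and elimination-dispersal phases of the full algorithm are omitted.

```python
# Minimal chemotaxis-only sketch of Bacterial Foraging Optimization (BFO)
# for a single path-planning step on a 2-D map. The cost function (distance
# to the goal plus a penalty near a circular obstacle), the obstacle layout,
# and all BFO parameters are illustrative assumptions, not the paper's setup.
import numpy as np

rng = np.random.default_rng(1)

start, goal = np.array([0.0, 0.0]), np.array([10.0, 10.0])
obstacle_c, obstacle_r = np.array([5.0, 5.0]), 2.0

def cost(p):
    """Cost of a candidate position: goal distance + obstacle penalty."""
    d_goal = np.linalg.norm(p - goal)
    d_obs = np.linalg.norm(p - obstacle_c)
    penalty = 50.0 * max(0.0, obstacle_r + 0.5 - d_obs)  # keep a safety margin
    return d_goal + penalty

# Population of bacteria (candidate positions) seeded near the robot.
n_bact, n_chemo, n_swim, step = 20, 30, 4, 0.5
pop = start + rng.normal(scale=1.0, size=(n_bact, 2))

for _ in range(n_chemo):
    for i in range(n_bact):
        j = cost(pop[i])
        # Tumble: pick a random unit direction.
        d = rng.normal(size=2)
        d /= np.linalg.norm(d)
        # Swim: keep moving along d while the cost keeps improving.
        for _ in range(n_swim):
            trial = pop[i] + step * d
            if cost(trial) < j:
                pop[i], j = trial, cost(trial)
            else:
                break

best = min(pop, key=cost)
print("next waypoint toward goal:", best, "cost:", cost(best))
```

Repeating this step from each new robot position, and re-evaluating the cost as obstacles move, would yield an online planner in the spirit of the abstract.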
The global consumption of fossil fuels is increasing, as they represent the main source of energy around the world, and reserves of heavy oil exceed those of light oil; different techniques have therefore been used to reduce the viscosity and increase the mobility of heavy crude oil. This study focuses on experimental tests and modeling, with a Back Feed Forward Artificial Neural Network (BFF-ANN), of the dilution technique for reducing the viscosity of heavy oil collected from the oil fields of southern Iraq, using organic diluents at different weight percentages (5, 10, and 20 wt.%) of n-heptane, toluene, and mixtures of different ratios …
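As a hedged illustration of the modeling side, the sketch below trains a small back-propagation feed-forward network to map diluent type and dosage to viscosity. The synthetic data, feature encoding, and network size are assumptions for demonstration and do not reflect the paper's measurements.

```python
# Illustrative sketch of a back-propagation-trained feed-forward network
# for predicting diluted heavy-oil viscosity from diluent type and dosage.
# The synthetic data, feature encoding, and network size are assumptions
# for demonstration only; the paper's dataset and architecture may differ.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)

# Features: [diluent wt%, is_heptane, is_toluene] (one-hot for diluent type).
wt = rng.choice([5.0, 10.0, 20.0], size=60)
kind = rng.integers(0, 2, size=60)            # 0 = n-heptane, 1 = toluene
X = np.column_stack([wt, kind == 0, kind == 1]).astype(float)

# Synthetic viscosity (cP): decays with diluent dosage, plus noise.
y = 2000.0 * np.exp(-0.12 * wt) * (1.0 - 0.1 * kind) + rng.normal(0, 20, 60)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0),
)
model.fit(X, y)

# Predict viscosity for a 15 wt% toluene dilution (hypothetical query).
print(model.predict([[15.0, 0.0, 1.0]]))
```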
The Internet of Things is a very important subject, especially at present, and it has attracted the attention of researchers and scientists due to its importance in human life: through it, a person can do many things easily, accurately, and in an organized manner. The research addresses important topics, most notably the concept of the Internet of Things, the history of its emergence and development, the reasons for the interest in it and its importance, and its most prominent advantages and characteristics. It sheds light on the architecture of the Internet of Things, its structural components, and its most important elements. The research also covers the most important search engines in the Internet …
Compressing speech reduces data storage requirements and the time needed to transmit digitized speech over long-haul links such as the Internet. To obtain the best performance in speech compression, wavelet transforms require filters that combine a number of desirable properties, such as orthogonality and symmetry. The MCT basis functions are derived from the GHM basis functions using 2-D linear convolution. The fast computation methods introduced here add desirable features to the current transform. We further assess the performance of the MCT in a speech compression application. This paper discusses the effect of using the DWT and the MCT (in one and two dimensions) on speech compression. DWT and MCT performances in terms of compression …
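The following sketch illustrates only the conventional DWT path of such a compression scheme (not the MCT transform described above): a multi-level DWT is applied, the smallest coefficients are discarded, and the signal is reconstructed to estimate the compression ratio and distortion. The signal, wavelet choice, and retention ratio are assumptions for demonstration.

```python
# Minimal sketch of wavelet-based speech compression via coefficient
# thresholding, illustrating only the standard DWT path (not the MCT
# transform described in the paper). The signal, wavelet choice, and
# retention ratio are assumptions for demonstration.
import numpy as np
import pywt

rng = np.random.default_rng(3)

# Synthetic "speech-like" signal: sum of a few sinusoids plus noise.
fs = 8000                                  # sampling rate (Hz)
t = np.arange(0, 1.0, 1.0 / fs)
signal = (np.sin(2 * np.pi * 200 * t)
          + 0.5 * np.sin(2 * np.pi * 800 * t)
          + 0.05 * rng.normal(size=t.size))

# Multi-level DWT, then keep only the largest 10% of coefficients.
coeffs = pywt.wavedec(signal, "db4", level=5)
flat = np.concatenate(coeffs)
thresh = np.quantile(np.abs(flat), 0.90)
compressed = [pywt.threshold(c, thresh, mode="hard") for c in coeffs]

# Reconstruct and report compression ratio / distortion.
recon = pywt.waverec(compressed, "db4")[: signal.size]
kept = sum(int(np.count_nonzero(c)) for c in compressed)
ratio = flat.size / kept
snr = 10 * np.log10(np.sum(signal ** 2) / np.sum((signal - recon) ** 2))
print(f"compression ratio ~{ratio:.1f}:1, SNR ~{snr:.1f} dB")
```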
The penalized least squares method is a popular approach for dealing with high-dimensional data, where the number of explanatory variables is larger than the sample size. Penalized least squares offers high prediction accuracy and performs estimation and variable selection at once. It yields a sparse model, that is, a model with few variables that can be interpreted easily. However, penalized least squares is not robust: it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain a robust penalized least squares method and a robust penalized estimator …
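As a minimal illustration, the sketch below compares an ordinary penalized least squares estimator (the Lasso) with a robust penalized alternative (Huber loss with a ridge-type penalty) on data contaminated by outliers. The synthetic design and penalty settings are assumptions, and the Huber regressor merely stands in for the robust penalized estimator discussed above; a Huberized Lasso combining sparsity and robustness is not available off the shelf in scikit-learn.

```python
# Illustrative comparison of a penalized least squares estimator (Lasso)
# with a robust penalized alternative (Huber loss + L2 penalty) on data
# containing outliers. The synthetic data and penalty settings are
# assumptions for demonstration; they are not the paper's simulation design.
import numpy as np
from sklearn.linear_model import HuberRegressor, Lasso

rng = np.random.default_rng(4)

# High-dimensional design: 50 observations, 100 predictors, 5 active.
n, p = 50, 100
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:5] = [3.0, -2.0, 1.5, 1.0, 2.5]
y = X @ beta + rng.normal(scale=0.5, size=n)

# Contaminate a few responses with large outliers.
y[:5] += rng.choice([-1, 1], size=5) * 25.0

lasso = Lasso(alpha=0.2).fit(X, y)
huber = HuberRegressor(epsilon=1.35, alpha=0.01, max_iter=1000).fit(X, y)

print("Lasso nonzero coefficients:", int(np.count_nonzero(lasso.coef_)))
print("Lasso error on true beta:  ", np.linalg.norm(lasso.coef_ - beta).round(2))
print("Huber error on true beta:  ", np.linalg.norm(huber.coef_ - beta).round(2))
```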
Diabetes is an increasingly common chronic disease, affecting millions of people around the world. Diagnosis of diabetes, its prediction, and its proper treatment and management are essential. Machine learning-based prediction techniques for diabetes data analysis can help in the early detection of the disease and the prediction of its consequences, such as hypo- and hyperglycemia. In this paper, we explored a diabetes dataset collected from the medical records of one thousand Iraqi patients. We applied three classifiers: the multilayer perceptron, KNN, and Random Forest. We conducted two experiments: the first used all 12 features of the dataset, where Random Forest outperformed the others with 98.8 % accuracy; the second used only five attributes …
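A sketch of the comparison workflow described above is shown below; the file name, column names, split, and hyperparameters are hypothetical placeholders rather than the paper's actual data or protocol.

```python
# Sketch of the comparison workflow the abstract describes: training an MLP,
# a KNN, and a Random Forest classifier on a diabetes dataset and comparing
# accuracies. The file name, column names, and train/test split are
# hypothetical placeholders, not the paper's actual data or protocol.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical CSV with 12 feature columns and a "Class" label column.
df = pd.read_csv("diabetes_iraqi_patients.csv")
X, y = df.drop(columns=["Class"]), df["Class"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "MLP": make_pipeline(StandardScaler(), MLPClassifier(max_iter=2000)),
    "KNN": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, "accuracy:", round(model.score(X_te, y_te), 3))
```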
Eye detection is used in many applications, such as pattern recognition, biometrics, surveillance systems, and many others. In this paper, a new method is presented to detect and extract the overall shape of one eye from an image, based on two principles: Helmholtz and Gestalt. According to the Helmholtz principle of perception, an observed geometric shape is perceptually "meaningful" if its expected number of occurrences in an image with a random distribution is very small. To achieve this goal, the Gestalt principle states that humans perceive things either by grouping similar elements or by recognizing patterns. In general, according to the Gestalt principle, humans perceive things through genera…
