Prediction of daily rainfall is important for flood forecasting, reservoir operation, and many other hydrological applications. Artificial intelligence (AI) algorithms are generally used for stochastic rainfall forecasting but are not capable of simulating unseen extreme rainfall events, which have become common due to climate change. In this study, a new model is developed for predicting daily rainfall at different lead times from sea level pressure (SLP), which is physically related to rainfall on land and thus able to predict unseen rainfall events. Daily rainfall on the east coast of Peninsular Malaysia (PM) was predicted using SLP data over the climate domain. Five advanced AI algorithms, namely extreme learning machine (ELM), Bayesian regularized neural networks (BRNNs), Bayesian additive regression trees (BART), extreme gradient boosting (xgBoost), and hybrid neural fuzzy inference system (HNFIS), were used, considering the complex relationship of rainfall with sea level pressure. Principal components of the SLP domain correlated with daily rainfall were used as predictors. The results revealed the efficacy of the AI models in predicting daily rainfall one day ahead. The relative performance of the models revealed the higher performance of BRNN, with a normalized root mean square error (NRMSE) of 0.678, compared with HNFIS (NRMSE = 0.708), BART (NRMSE = 0.784), xgBoost (NRMSE = 0.803), and ELM (NRMSE = 0.915). Visual inspection of predicted rainfall during model validation using density-scatter plots and other novel forms of visual comparison confirmed the ability of BRNN to predict daily rainfall one day ahead reliably.
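The pipeline described above (principal components of an SLP grid as predictors of next-day rainfall, scored by NRMSE) can be sketched as follows. This is an illustrative sketch, not the paper's exact pipeline: the data are synthetic, the grid size and number of components are assumptions, and ordinary least squares stands in for the AI models (BRNN, xgBoost, etc.).

```python
# Hedged sketch: next-day rainfall prediction from principal components
# of a synthetic sea-level-pressure (SLP) field, scored by NRMSE.
import numpy as np

rng = np.random.default_rng(0)

n_days, n_grid = 400, 50                 # assumed synthetic grid: 400 days x 50 points
slp = rng.normal(1013.0, 5.0, size=(n_days, n_grid))
# synthetic rainfall loosely driven by the SLP field's mean anomaly
rain = np.maximum(0.0, -0.5 * (slp.mean(axis=1) - 1013.0)
                  + rng.normal(0.0, 1.0, n_days))

# --- principal components of the SLP domain (the predictors) ---
slp_anom = slp - slp.mean(axis=0)
_, _, vt = np.linalg.svd(slp_anom, full_matrices=False)
n_pc = 5
pcs = slp_anom @ vt[:n_pc].T             # leading PC time series

# --- one-day-ahead prediction: PCs on day t predict rain on day t+1 ---
X, y = pcs[:-1], rain[1:]
split = 300
X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]

# least-squares fit stands in for the paper's AI models
A = np.column_stack([X_tr, np.ones(len(X_tr))])
coef, *_ = np.linalg.lstsq(A, y_tr, rcond=None)
pred = np.column_stack([X_te, np.ones(len(X_te))]) @ coef

# normalized root mean square error, the comparison metric in the study
nrmse = np.sqrt(np.mean((pred - y_te) ** 2)) / y_te.std()
print(round(float(nrmse), 3))
```

The same predictor matrix `X` could be fed to any of the five algorithms compared in the study; only the regressor changes.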
Cryptography is the process of transforming messages to prevent unauthorized access to data. One of the main problems, and an important part of secret-key cryptography, is the key: a higher level of secure communication depends on it. To increase the level of security in any communication, both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weakened by its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong; enhancing the encryption key strengthens the security of the Triple Data Encryption Standard. This paper proposes a combination of two efficient encryption algorithms to …
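The key dependence described above follows from Triple DES's encrypt-decrypt-encrypt (EDE) structure. A minimal sketch of that structure is below, with a toy XOR "cipher" standing in for DES purely to show the shape of the construction; it provides no security, and the key names are illustrative.

```python
# Toy EDE sketch: XOR stands in for DES (NOT secure, illustration only).
def toy_encrypt(block: int, key: int) -> int:
    return block ^ key          # stand-in for DES encryption

def toy_decrypt(block: int, key: int) -> int:
    return block ^ key          # XOR is its own inverse

def triple_ede_encrypt(block: int, k1: int, k2: int, k3: int) -> int:
    # C = E_k3(D_k2(E_k1(P))) -- the 3DES EDE construction
    return toy_encrypt(toy_decrypt(toy_encrypt(block, k1), k2), k3)

def triple_ede_decrypt(block: int, k1: int, k2: int, k3: int) -> int:
    # P = D_k1(E_k2(D_k3(C)))
    return toy_decrypt(toy_encrypt(toy_decrypt(block, k3), k2), k1)

p = 0x0123456789ABCDEF
c = triple_ede_encrypt(p, 0x11, 0x22, 0x33)
assert triple_ede_decrypt(c, 0x11, 0x22, 0x33) == p

# With k1 == k2 the EDE stack degenerates to a single encryption under k3:
# one of the weak key configurations that careful key generation must avoid.
assert triple_ede_encrypt(p, 0x11, 0x11, 0x33) == toy_encrypt(p, 0x33)
```

The degenerate case in the last line illustrates why key generation, not just the cipher itself, determines the effective strength of 3DES.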
Anomaly detection remains a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data to a graph concept frame (CFG). As is well known, DBSCAN groups data points of the same kind into clusters, while points that fall outside the behavior of any cluster are treated as noise or anomalies. DBSCAN can therefore detect abnormal points that lie farther than a set threshold from any cluster (extreme values). However, not all anomalies are of that kind: some data points do not occur repeatedly and are considered abnormal with respect to the known group even though they are not far from it. The analysis showed DBSCAN using the …
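The baseline DBSCAN behavior described above, where points outside any dense cluster are labeled as noise/anomalies, can be sketched in a few lines. The `eps` and `min_pts` values are illustrative; this is the plain algorithm, not the paper's CFG-augmented variant.

```python
# Compact DBSCAN sketch: label -1 marks noise (anomaly candidates).
import math

def dbscan(points, eps, min_pts):
    NOISE, UNSEEN = -1, None
    labels = [UNSEEN] * len(points)

    def neighbors(i):
        return [j for j, q in enumerate(points)
                if math.dist(points[i], q) <= eps]

    cluster = 0
    for i in range(len(points)):
        if labels[i] is not UNSEEN:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = NOISE              # not dense enough: anomaly candidate
            continue
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == NOISE:
                labels[j] = cluster        # border point, reclaimed by the cluster
            if labels[j] is not UNSEEN:
                continue
            labels[j] = cluster
            nbrs = neighbors(j)
            if len(nbrs) >= min_pts:       # core point: keep expanding
                queue.extend(nbrs)
        cluster += 1
    return labels

pts = [(0, 0), (0, 1), (1, 0), (1, 1), (10, 10)]   # one tight cluster + one outlier
labels = dbscan(pts, eps=1.5, min_pts=3)
print(labels)   # the far point (10, 10) is labeled -1
```

The abstract's point is that this distance-based notion of anomaly misses non-repeating points that sit near a known group, which is what the CFG conversion is meant to capture.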
It has increasingly been recognised that future developments in geospatial data handling will centre on geospatial data on the web: Volunteered Geographic Information (VGI). The evaluation of VGI data quality, including positional and shape similarity, has become a recurrent subject in the scientific literature over the last ten years. The OpenStreetMap (OSM) project is the most popular of the leading VGI platforms. It is an online geospatial database that produces and supplies free, editable geospatial datasets for worldwide use. The goal of this paper is to present a comprehensive overview of the quality assurance of OSM data. In addition, the credibility of open-source geospatial data is discussed, highlighting the diff…
Cloud computing (CC) is a fast-growing technology that offers computing, networking, and storage services that can be accessed and used over the internet. Cloud services save users money because they are pay-per-use, and they save time because they are on-demand and elastic, a unique aspect of cloud computing. However, several security issues must be addressed before users store data in the cloud. Because users have no direct control over data outsourced to the cloud, particularly personal and sensitive data (health, finance, military, etc.), and do not know where the data is stored, they must ensure that the cloud stores and maintains the outsourced data appropriately. The study's primary goals are to mak…
Modern civilization increasingly relies on sustainable and eco-friendly data centers as the core hubs of intelligent computing. However, these data centers, while vital, also face heightened vulnerability to hacking due to their role as the convergence points of numerous network connection nodes. Recognizing and addressing this vulnerability, particularly within the confines of green data centers, is a pressing concern. This paper proposes a novel approach to mitigate this threat by leveraging swarm intelligence techniques to detect prospective and hidden compromised devices within the data center environment. The core objective is to ensure sustainable intelligent computing through a colony strategy. The research primarily focuses on the …
Interpretation of potential-field data is significant for subsurface structure characterization. The current study explores the magnetic low lying between the Najaf and Diwaniyah cities in central Iraq. It aims to understand the subsurface structures that may give rise to this anomaly and to provide a better subsurface structural image of the region. The study area is situated in the transition zone known as the Abu Jir Fault Zone. This tectonic boundary is an inherited basement weak zone extending in the NW-SE direction. Gravity and magnetic data processing and enhancement techniques, namely the Total Horizontal Gradient, Tilt Angle, Fast Sigmoid Edge Detection, Improved Logistic, and Theta Map filters, highlight source boundaries and the …
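Two of the filters named above, the Total Horizontal Gradient (THG) and the Tilt Angle (TA), can be sketched on a synthetic grid. The smooth test anomaly and the FFT-based vertical derivative below are assumptions for illustration, not the study's data or exact processing chain.

```python
# Hedged sketch: THG and Tilt Angle filters on a synthetic potential-field grid.
import numpy as np

ny = nx = 64
dx = dy = 1.0
y, x = np.mgrid[0:ny, 0:nx]
# synthetic smooth anomaly centered in the grid (illustrative source)
field = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 100.0)

# horizontal derivatives and the Total Horizontal Gradient
dfy, dfx = np.gradient(field, dy, dx)
thg = np.hypot(dfx, dfy)

# first vertical derivative via the wavenumber domain: dF/dz = IFFT(|k| * FFT(F))
kx = np.fft.fftfreq(nx, dx) * 2 * np.pi
ky = np.fft.fftfreq(ny, dy) * 2 * np.pi
k = np.hypot(*np.meshgrid(kx, ky))
dfz = np.real(np.fft.ifft2(np.fft.fft2(field) * k))

# Tilt Angle: arctan of vertical over total horizontal derivative,
# bounded to (-pi/2, pi/2); its zero crossing tracks source edges
tilt = np.arctan2(dfz, thg)
print(tilt.shape)
```

Because the tilt angle normalizes the vertical derivative by the horizontal gradient, it equalizes the response of shallow and deep sources, which is why it is a common edge-detection choice alongside THG.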
Research Abstract
The study examines how best to implement the concept of lifelong learning as a guiding framework for educational policy in Iraq in general, and in higher education in particular. The study identifies lifelong learning strategies and discusses their importance and main features in facilitating access to distinguished learning opportunities suited to students' lifelong needs; it also discusses the university's role in achieving this goal.
The optimum conditions for production of fibrinolytic protease from the edible mushroom Pleurotus ostreatus grown on a solid medium, the Sus medium, composed of Sus wastes (produced from the extracted medicinal plant Glycyrrhiza glabra), were determined. Addition of 5% soya bean seed meal to the Sus medium gave a maximum fibrinolytic protease activity of 7.7 units/ml. The optimum moisture content of the Sus medium supplemented with 5% soya bean seed meal was 60%, resulting in 7.2 units/ml. Pleurotus ostreatus produced maximum fibrinolytic protease activity when the spawn rate, medium pH, and incubation temperature were 2, 6, and 30°C, respectively. The maximum fibrinolytic protease activity was 7.6 units/ml when incubat…
Abstract
The current research aims to examine the effect of the Adi and Shayer model on the achievement of fifth-grade students and their attitudes toward history. To achieve the research objective, the researcher adopted two null hypotheses: 1) there is no statistically significant difference at the (0.05) level between the average post-achievement test scores of students of the experimental group, who study the history of Europe and modern American history according to the Adi and Shayer model, and those of the control group, who study the same subjects by the traditional method; 2) there is no statistically significant difference at the level (…
Albizia lebbeck biomass was used as an adsorbent in the present study to remove methyl red dye from aqueous solution. A central composite rotatable design model was used to predict the dye removal efficiency. The optimization was accomplished under a temperature- and mixing-controlled system (37°C) with particle sizes of 300 and 600 µm. The highest adsorption efficiencies were obtained at lower dye concentrations and lower adsorbent weights. Adsorption times of more than 48 h were found to have a negative effect on the removal efficiency due to secondary metabolite compounds. However, the adsorption time had a positive effect at high dye concentrations and high adsorbent weights. The colour removal effi…