Optimizing Blockchain Consensus: Incorporating Trust Value in the Practical Byzantine Fault Tolerance Algorithm with Boneh-Lynn-Shacham Aggregate Signature

The consensus algorithm is the core mechanism of blockchain and ensures data consistency among blockchain nodes. The PBFT (Practical Byzantine Fault Tolerance) consensus algorithm is widely used in consortium chains because it tolerates Byzantine faults, but it still suffers from random master node selection and high communication complexity. This study proposes IBFT, an improved consensus algorithm based on node trust values and the BLS (Boneh-Lynn-Shacham) aggregate signature. In IBFT, multi-level indicators are used to calculate a trust value for each node, and the nodes with the highest trust values are selected to take part in network consensus, with the top-ranked node acting as master. The BLS aggregate signature is incorporated into the information interaction between nodes, which reduces communication complexity while keeping node-to-node information exchange secure. Simulation results show that, compared with the PBFT algorithm, IBFT increases transaction throughput by 61% and reduces latency by 13%.
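A minimal sketch of the trust-based selection idea described above, assuming three illustrative indicators (uptime, voting record, latency score) and arbitrary weights; the paper's actual indicators, weights, and the BLS aggregate-signature step are not reproduced here.

```python
from dataclasses import dataclass

@dataclass
class Node:
    node_id: str
    uptime: float         # fraction of rounds the node was reachable (assumed indicator)
    valid_votes: float    # fraction of past votes consistent with committed blocks (assumed)
    latency_score: float  # normalised 0-1, higher means lower network latency (assumed)

def trust_value(node: Node, weights=(0.4, 0.4, 0.2)) -> float:
    """Weighted sum of the multi-level indicators; the weights are illustrative."""
    w_up, w_vote, w_lat = weights
    return w_up * node.uptime + w_vote * node.valid_votes + w_lat * node.latency_score

def select_consensus_set(nodes, committee_size):
    """Keep the highest-trust nodes for consensus; the top-ranked node acts as master."""
    ranked = sorted(nodes, key=trust_value, reverse=True)
    committee = ranked[:committee_size]
    return committee[0], committee

nodes = [
    Node("n1", 0.99, 0.97, 0.80),
    Node("n2", 0.90, 0.99, 0.60),
    Node("n3", 0.75, 0.80, 0.90),
    Node("n4", 0.98, 0.60, 0.70),
]
master, committee = select_consensus_set(nodes, committee_size=3)
print("master:", master.node_id)
print("committee:", [n.node_id for n in committee])
```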

Scopus Crossref
Publication Date
Wed Mar 01 2023
Journal Name
Journal of Engineering
Producing Load Bearing Block Using LECA as Partial Replacement of Coarse Aggregate

The main goal of this study was to produce load-bearing masonry units using the ACI 211.1 mix design with a (1 : 3.2 : 2.5) ratio of (cement : fine aggregate : coarse aggregate) and a slump range of 25-50 mm, conforming to the type A requirements of IQS 1077/1987 for dimensions, absorption, and compressive strength. The ability to use a low cement content (300 kg/m3) was encouraging for keeping the product competitive in market price, since most of this consumption is in wall construction for low-cost buildings. Replacing 10% and 20% of the coarse aggregate volume with LECA to reduce the large weight of the masonry blocks can also be recommended. The types of production of the load-bearing masonry units were A and B for (…
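As a rough illustration of the arithmetic behind the (1 : 3.2 : 2.5) mix and the partial volume replacement, the sketch below computes batch weights per cubic metre; the bulk densities assumed for the coarse aggregate and the LECA are placeholders, not values from the study.

```python
# Illustrative batch calculation, not the paper's exact design.
CEMENT_CONTENT = 300.0        # kg per m3 of concrete (from the abstract)
MIX_RATIO = (1.0, 3.2, 2.5)   # cement : fine aggregate : coarse aggregate, by weight

COARSE_DENSITY = 1600.0       # kg/m3, assumed bulk density of the coarse aggregate
LECA_DENSITY = 600.0          # kg/m3, assumed bulk density of LECA

def batch_quantities(leca_volume_fraction: float) -> dict:
    """Constituent weights when a fraction of the coarse aggregate volume is LECA."""
    cement_r, fine_r, coarse_r = MIX_RATIO
    fine = CEMENT_CONTENT * fine_r / cement_r
    coarse_total = CEMENT_CONTENT * coarse_r / cement_r   # kg before replacement
    coarse_volume = coarse_total / COARSE_DENSITY         # m3 occupied by that aggregate
    leca_volume = coarse_volume * leca_volume_fraction
    return {
        "cement_kg": CEMENT_CONTENT,
        "fine_aggregate_kg": fine,
        "coarse_aggregate_kg": (coarse_volume - leca_volume) * COARSE_DENSITY,
        "leca_kg": leca_volume * LECA_DENSITY,
    }

for fraction in (0.10, 0.20):
    print(f"{int(fraction * 100)}% LECA:", batch_quantities(fraction))
```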

Crossref (1)
Crossref
Publication Date
Thu Feb 01 2024
Journal Name
Journal of Engineering
Rutting Prediction of Asphalt Mixtures Containing Treated and Untreated Recycled Concrete Aggregate

Rutting is a crucial aspect of the mechanical performance of asphalt mixtures and was the primary focus of this study. Various portions of the virgin coarse aggregate were replaced with recycled concrete aggregate, either treated or untreated, at rates ranging from 25% to 100% in constant increments of 25%. The treatment of the recycled concrete aggregate consisted of soaking in acetic acid, followed by a short mechanical process inside a Los Angeles machine without the balls. The research used two primary tests: the standard Marshall test to identify the optimum asphalt content and the volumetric characteristics of the asphalt mixtures. The other one w…
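The sketch below only enumerates the mixture matrix implied by the abstract (a control mix plus treated and untreated RCA at 25-100% replacement in 25% steps); the labels are illustrative and not taken from the paper.

```python
from itertools import product

replacement_rates = [25, 50, 75, 100]   # percent of virgin coarse aggregate replaced
treatments = ["untreated", "treated"]   # treated = acetic-acid soak + Los Angeles drum, no balls

# One control mix plus every treatment/rate combination.
mixtures = [{"treatment": None, "rca_percent": 0, "label": "control"}]
for treatment, rate in product(treatments, replacement_rates):
    mixtures.append({
        "treatment": treatment,
        "rca_percent": rate,
        "label": f"{treatment}-RCA-{rate}%",
    })

for mix in mixtures:
    print(mix["label"])
```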

Crossref (3)
Crossref
Publication Date
Fri Jan 01 2016
Journal Name
AIP Conference Proceedings
Application of simulated annealing to solve multi-objectives for aggregate production planning

Aggregate production planning (APP) is one of the most significant and complicated problems in production planning. It aims to set overall production levels for each product category to meet fluctuating or uncertain future demand, and to make decisions concerning hiring, firing, overtime, subcontracting, and inventory carrying levels. In this paper, we present a simulated annealing (SA) approach to multi-objective linear programming for solving APP; SA is considered a good tool for imprecise optimization problems. The proposed model minimizes total production and workforce costs. The proposed SA is compared with particle swarm optimization (PSO), and the results show that the proposed SA is effective in reducing total production costs and req…
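A minimal simulated annealing sketch on a toy single-product production plan; the demand figures, cost coefficients, neighbourhood move, and cooling schedule are all assumptions for illustration and do not reproduce the paper's multi-objective model.

```python
import math
import random

random.seed(0)

demand = [100, 150, 120, 170]   # units required per period (assumed)
PROD_COST = 5.0                 # cost per unit produced (assumed)
HOLD_COST = 1.0                 # cost per unit carried to the next period (assumed)
SHORT_COST = 20.0               # penalty per unit of unmet demand (assumed)

def total_cost(plan):
    """Production cost plus inventory holding and shortage penalties over all periods."""
    cost, inventory = 0.0, 0.0
    for produced, needed in zip(plan, demand):
        inventory += produced - needed
        cost += PROD_COST * produced
        cost += HOLD_COST * max(inventory, 0) + SHORT_COST * max(-inventory, 0)
    return cost

def neighbour(plan):
    """Perturb the production level of one randomly chosen period."""
    candidate = plan[:]
    i = random.randrange(len(candidate))
    candidate[i] = max(0, candidate[i] + random.randint(-20, 20))
    return candidate

def simulated_annealing(initial, temp=100.0, cooling=0.995, iters=2000):
    current, best = initial[:], initial[:]
    for _ in range(iters):
        candidate = neighbour(current)
        delta = total_cost(candidate) - total_cost(current)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current = candidate
            if total_cost(current) < total_cost(best):
                best = current[:]
        temp = max(temp * cooling, 1e-9)
    return best, total_cost(best)

plan, cost = simulated_annealing([120, 120, 120, 120])
print("best plan:", plan, "total cost:", round(cost, 1))
```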

Scopus (14)
Crossref (6)
Scopus Clarivate Crossref
Publication Date
Fri Oct 30 2020
Journal Name
Journal of Economics and Administrative Sciences
Estimate The Survival Function By Using The Genetic Algorithm

Survival analysis is the analysis of data in the form of times from a time origin until the occurrence of an end event. In medical research, the time origin is the date on which an individual or patient is registered in a study, such as a clinical trial comparing two or more treatments, and the endpoint is the death of the patient or the disappearance of the individual. The data resulting from this process are called survival times; if the end event is not death, the resulting data are called time-to-event data. That is, survival analysis is a set of statistical steps and procedures for analyzing data in which the variable of interest is the time until an event. It could be d…
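As an illustration of combining a genetic algorithm with survival data (not the paper's estimator), the sketch below fits the rate of an exponential survival model S(t) = exp(-lambda * t) to a small right-censored sample by maximizing the log-likelihood with a simple GA; the data and GA settings are assumptions.

```python
import math
import random

random.seed(1)

# (time, event) pairs: event = 1 if the endpoint occurred, 0 if censored (assumed data)
data = [(2.0, 1), (3.5, 1), (4.0, 0), (5.5, 1), (7.0, 0), (8.5, 1), (10.0, 0)]

def log_likelihood(rate):
    """Exponential model: density term for events, survival term for censored times."""
    if rate <= 0:
        return -math.inf
    ll = 0.0
    for t, event in data:
        ll += (math.log(rate) - rate * t) if event else (-rate * t)
    return ll

def genetic_estimate(pop_size=30, generations=60, mutation_scale=0.05):
    population = [random.uniform(0.01, 1.0) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=log_likelihood, reverse=True)
        parents = population[: pop_size // 2]              # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = (a + b) / 2                            # arithmetic crossover
            child += random.gauss(0, mutation_scale)       # Gaussian mutation
            children.append(max(child, 1e-6))
        population = parents + children
    return max(population, key=log_likelihood)

rate_hat = genetic_estimate()
print(f"estimated rate: {rate_hat:.3f}, estimated S(5) = {math.exp(-rate_hat * 5):.3f}")
```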

Crossref (1)
Crossref
Publication Date
Fri Dec 01 2023
Journal Name
Bulletin of Electrical Engineering and Informatics
A comparative study of Gaussian mixture algorithm and K-means algorithm for efficient energy clustering in MWSN

Wireless sensor networks (WSNs) are one of the key technologies in Internet of Things (IoT) networks. Since WSNs have finite energy sources, there is ongoing research into new strategies for minimizing power consumption and into enhancing traditional techniques. In this paper, a novel Gaussian mixture model (GMM) algorithm is proposed for energy saving in mobile wireless sensor networks (MWSNs). Performance evaluation of the clustering process with the GMM algorithm shows a remarkable energy saving in the network of up to 92%. In addition, a comparison with another clustering strategy based on the K-means algorithm shows that the developed method outperforms K-means, saving ener…
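A sketch of the comparison idea only: cluster simulated node positions with a Gaussian mixture model and with K-means, then compare the mean node-to-centre distance as a crude proxy for transmission energy. The field size, cluster count, and proxy metric are assumptions rather than the paper's energy model.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(42)
positions = rng.uniform(0, 100, size=(200, 2))   # 200 nodes in a 100 m x 100 m field

def mean_distance_to_centre(points, labels, centres):
    """Average Euclidean distance from each node to its cluster centre."""
    return np.mean(np.linalg.norm(points - centres[labels], axis=1))

k = 5
gmm = GaussianMixture(n_components=k, random_state=0).fit(positions)
gmm_score = mean_distance_to_centre(positions, gmm.predict(positions), gmm.means_)

kmeans = KMeans(n_clusters=k, n_init=10, random_state=0).fit(positions)
km_score = mean_distance_to_centre(positions, kmeans.labels_, kmeans.cluster_centers_)

print(f"GMM mean distance to cluster centre:     {gmm_score:.2f} m")
print(f"K-means mean distance to cluster centre: {km_score:.2f} m")
```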

Scopus (5)
Crossref (4)
Scopus Crossref
Publication Date
Sat Jan 01 2011
Journal Name
Software Engineering and Computer Systems
Practical Adoptions of T-Way Strategies for Interaction Testing

Scopus (9)
Crossref (2)
Scopus Crossref
Publication Date
Sat Mar 20 2021
Journal Name
Agroforestry Systems
Hydraulic lift: processes, methods, and practical implications for society

Scopus (21)
Crossref (20)
Scopus Clarivate Crossref
Publication Date
Mon Apr 03 2023
Journal Name
International Journal of Online and Biomedical Engineering (iJOE)
An Integrated Grasshopper Optimization Algorithm with Artificial Neural Network for Trusted Nodes Classification Problem

A Wireless Body Area Network (WBAN) is a tool that improves real-time patient health observation in hospitals and asylums, and especially at home. WBANs have grown in popularity in recent years due to their critical role and vast range of medical applications. Because of the sensitive nature of the patient information transmitted through a WBAN, a high level of security is required to guarantee the safe movement of data between sensor nodes and across WBAN networks. This research introduces a novel technique named Integrated Grasshopper Optimization Algorithm with Artificial Neural Network (IGO-ANN) for distinguishing the trusted nodes in WBAN networks by means of a classifica…
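Only the classification half of the IGO-ANN idea is sketched below: a small neural network labels synthetic WBAN nodes as trusted or untrusted, with the grasshopper optimization step replaced by a plain search over one hyperparameter. The node features (packet delivery ratio, residual energy, forwarding delay) and the labelling rule are assumptions.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(7)
n = 600
# Synthetic node features: packet delivery ratio, residual energy, forwarding delay.
X = np.column_stack([
    rng.uniform(0.5, 1.0, n),   # packet delivery ratio
    rng.uniform(0.0, 1.0, n),   # residual energy (normalised)
    rng.uniform(0.0, 1.0, n),   # forwarding delay (normalised)
])
# Toy rule: a node is "trusted" when delivery is high and delay is low.
y = ((X[:, 0] > 0.8) & (X[:, 2] < 0.6)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

best_acc, best_size = 0.0, None
for hidden in (4, 8, 16):        # plain search standing in for the metaheuristic tuner
    clf = MLPClassifier(hidden_layer_sizes=(hidden,), max_iter=2000, random_state=0)
    clf.fit(X_train, y_train)
    acc = clf.score(X_test, y_test)
    if acc > best_acc:
        best_acc, best_size = acc, hidden

print(f"best hidden layer size: {best_size}, test accuracy: {best_acc:.2f}")
```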

Scopus Clarivate Crossref
Publication Date
Wed Jun 01 2016
Journal Name
Journal of Economics and Administrative Sciences
Compared with Genetic Algorithm Fast – MCD – Nested Extension and Neural Network Multilayer Back propagation

This study uses nonparametric robust methods to estimate location and scatter, relying on the minimum covariance determinant of a multivariate regression model. Because of the presence of outlying values, the increase in sample size, and the multivariate nature of the regression model, it becomes difficult to find the median location.

The genetic algorithm Fast-MCD-Nested Extension is used and compared with a multilayer back-propagation neural network in terms of the accuracy of the results and the speed of finding the median location, while the best sample is determined by relying on the smallest distance (Mahalanobis distance) has the stu…
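Not the paper's genetic Fast-MCD-Nested Extension: the sketch below uses scikit-learn's standard FAST-MCD estimator to obtain a robust location and scatter, then flags outliers by their squared Mahalanobis distance against a chi-square cut-off. The simulated data and the cut-off level are assumptions.

```python
import numpy as np
from scipy.stats import chi2
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(3)
clean = rng.multivariate_normal(mean=[0, 0], cov=[[1, 0.4], [0.4, 1]], size=180)
outliers = rng.uniform(6, 9, size=(20, 2))          # contaminating points
X = np.vstack([clean, outliers])

mcd = MinCovDet(random_state=0).fit(X)              # robust location and scatter via FAST-MCD
d2 = mcd.mahalanobis(X)                             # squared robust Mahalanobis distances
cutoff = chi2.ppf(0.975, df=X.shape[1])             # common chi-square threshold

print("robust location:", np.round(mcd.location_, 2))
print("points flagged as outliers:", int(np.sum(d2 > cutoff)))
```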

Crossref (1)
Crossref
Publication Date
Thu Jun 20 2019
Journal Name
Baghdad Science Journal
Using Backpropagation to Predict Drought Factor in Keetch-Byram Drought Index

Forest fires continue to rise during the dry season and are difficult to stop. High temperatures in the dry season can increase the drought index, which in turn can allow the forest to burn at any time, so the government should conduct surveillance throughout the dry season. Continuous surveillance without a focus on a particular time is ineffective and inefficient, because preventive measures are carried out without knowledge of the potential fire risk. In the Keetch-Byram Drought Index (KBDI), the Drought Factor formulation calculates only today's drought based on current weather conditions and yesterday's drought index. However, to find out the drought factor for the day after, the data…
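A sketch of the prediction setup only: a backpropagation-trained network maps today's weather and drought index to a next-day drought factor. The synthetic records and the toy target rule stand in for real KBDI data and are not the Keetch-Byram formula.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(11)
n = 500
temp = rng.uniform(22, 38, n)         # daily maximum temperature, degrees C (assumed)
rain = rng.uniform(0, 30, n)          # daily rainfall, mm (assumed)
kbdi_today = rng.uniform(0, 200, n)   # today's drought index (assumed scale)

# Toy target for illustration only: the drought factor rises with heat and prior
# dryness and falls with rainfall, plus noise. This is NOT the KBDI formula.
drought_factor = 0.05 * temp + 0.01 * kbdi_today - 0.08 * rain + rng.normal(0, 0.2, n)

X = np.column_stack([temp, rain, kbdi_today])
X_train, X_test, y_train, y_test = train_test_split(
    X, drought_factor, test_size=0.25, random_state=0)

# Backpropagation-trained feed-forward network with feature scaling.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=3000, random_state=0))
model.fit(X_train, y_train)
print(f"R^2 on held-out days: {model.score(X_test, y_test):.2f}")
```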

Clarivate Crossref