Background/Objectives: This research aims to develop a modified image representation framework for Content-Based Image Retrieval (CBIR) based on the gray-scale input image, Zernike Moments (ZMs) properties, the Local Binary Pattern (LBP), the Y color space, the Slantlet Transform (SLT), and the Discrete Wavelet Transform (DWT). Methods/Statistical analysis: This study surveyed and analysed three standard datasets: WANG V1.0, WANG V2.0, and Caltech 101. The Caltech 101 set contains images of objects belonging to 101 classes, with approximately 40-800 images per category. The framework proposed in this study seeks to describe and operationalize a CBIR system with automated attribute extraction based on a CNN architecture. Findings: The results obtained with the investigated CBIR system, alongside the benchmarked results, clearly indicate that the suggested technique performed best, with an overall accuracy of 88.29%, compared with the other datasets adopted in the experiments. These results indicate that the suggested method is effective on all of the datasets. Improvements/Applications: The study revealed that the multiple image representation was redundant for extraction accuracy, and its findings indicate that automatically extracted features are capable of reliably generating accurate outcomes.
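As an illustration of one of the hand-crafted descriptors named above, the following minimal sketch computes a uniform LBP histogram from a gray-scale image with scikit-image; the parameter choices (8 neighbours, radius 1) and the helper function are illustrative assumptions, not the authors' exact CBIR pipeline.

```python
# Minimal sketch: a gray-scale uniform-LBP histogram as one CBIR descriptor.
# Assumes scikit-image and NumPy; P=8, R=1 and the "uniform" method are
# illustrative choices, not the paper's settings.
import numpy as np
from skimage import io, color, img_as_ubyte
from skimage.feature import local_binary_pattern

def lbp_descriptor(image_path, points=8, radius=1):
    image = io.imread(image_path)
    if image.ndim == 3:                      # convert RGB to gray scale
        image = color.rgb2gray(image)
    image = img_as_ubyte(image)              # 8-bit gray-scale image
    codes = local_binary_pattern(image, points, radius, method="uniform")
    n_bins = points + 2                      # uniform patterns plus one "non-uniform" bin
    hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
    return hist                              # normalised histogram used as the feature vector

# Retrieval would then rank database images by a distance between such
# histograms, e.g. np.linalg.norm(query_hist - candidate_hist).
```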
This paper presents a simulation of the Linguistic Fuzzy Trust Model (LFTM) over oscillating Wireless Sensor Networks (WSNs), in which the goodness of the servers belonging to the network can change over time. It also compares the outcomes achieved by the LFTM model over oscillating WSNs with those obtained by applying the model over static WSNs, where the servers always maintain the same goodness, in terms of the selection percentage of trustworthy servers (the accuracy of the model) and the average path length. In addition, the paper compares the LFTM with the Bio-inspired Trust and Reputation Model for Wireless Sensor Network
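A toy sketch of how the two reported metrics, the selection percentage of trustworthy servers and the average path length, could be accumulated over simulation rounds is shown below; the random goodness oscillation and the server-selection rule are placeholders, not the LFTM algorithm itself.

```python
# Toy sketch of the two metrics reported for the model: the percentage of
# rounds in which a trustworthy server is selected (accuracy) and the average
# path length. The random choices below are placeholders, not LFTM.
import random

def simulate(rounds=1000, servers=50, goodness_flip_prob=0.1):
    good = [random.random() < 0.5 for _ in range(servers)]
    hits, total_path = 0, 0
    for _ in range(rounds):
        # oscillating WSN: server goodness may change over time
        good = [(not g) if random.random() < goodness_flip_prob else g
                for g in good]
        chosen = random.randrange(servers)     # stand-in for trust-based selection
        path_len = random.randint(1, 6)        # stand-in for the routing path length
        hits += good[chosen]
        total_path += path_len
    return 100.0 * hits / rounds, total_path / rounds

accuracy, avg_path = simulate()
print(f"selection accuracy: {accuracy:.1f}%, average path length: {avg_path:.2f}")
```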
Energy and memory limitations are considerable constraints of sensor nodes in wireless sensor networks (WSNs). The limited energy supplied to network nodes causes WSNs to face crucial functional limitations; therefore, the problem of limited energy on sensor nodes can only be addressed by using that energy efficiently. In this research work, an energy-balancing routing scheme for in-network data aggregation, referred to as the Energy-aware and load-Balancing Routing scheme for Data Aggregation (EBR-DA), is presented. EBR-DA aims to provide energy-efficient multiple-hop routing to the destination on the basis of the quality of the links between the source and destination. In
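The following sketch illustrates the general idea of energy-aware, load-balancing next-hop selection by weighing a neighbour's residual energy against its link quality; the scoring formula, the weight alpha, and the data structures are assumptions for illustration, not the EBR-DA specification.

```python
# Illustrative next-hop selection for energy-aware, load-balancing routing.
# The weighted score and alpha=0.5 are assumed for the sketch, not taken
# from the EBR-DA paper.
from dataclasses import dataclass

@dataclass
class Neighbour:
    node_id: int
    residual_energy: float   # joules remaining at the neighbour
    link_quality: float      # e.g. packet reception ratio in [0, 1]

def select_next_hop(neighbours, alpha=0.5):
    """Return the neighbour with the best weighted energy/link-quality score."""
    if not neighbours:
        return None
    max_e = max(n.residual_energy for n in neighbours) or 1.0
    score = lambda n: alpha * (n.residual_energy / max_e) + (1 - alpha) * n.link_quality
    return max(neighbours, key=score)

hop = select_next_hop([Neighbour(1, 2.0, 0.9), Neighbour(2, 3.5, 0.6)])
print(hop.node_id)
```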
Purpose: To use the L25 Taguchi orthogonal array to optimize the three main solvothermal parameters that affect the synthesis of metal-organic framework-5 (MOF-5). Methods: The L25 Taguchi methodology was used to study the parameters that affect the degree of crystallinity (DOC) of MOF-5. The parameters comprised the synthesis temperature, the synthesis duration, and the ratio of the solvent, N,N-dimethylformamide (DMF), to the reactants. For each parameter, the volume of DMF was varied while keeping the weight of the reactants constant. The weights of 1,4-benzenedicarboxylate (BDC) and Zn(NO3)2·6H2O used were 0.390 g and 2.166 g, respectively. For each parameter investigated, five different levels were used. The MOF-5 samples were synthesi
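The sketch below builds a 25-run, three-factor, five-level orthogonal design with a Latin-square rule and computes the Taguchi larger-is-better signal-to-noise ratio for the degree of crystallinity; the level indices and DOC numbers are placeholders, not the paper's experimental settings.

```python
# Sketch of an L25-style design for three 5-level factors (temperature,
# duration, DMF-to-reactant ratio). The DOC values are placeholders.
import math

levels = list(range(5))                                       # level indices 0..4 per factor
runs = [(a, b, (a + b) % 5) for a in levels for b in levels]  # 25 runs; any two columns are balanced

def sn_larger_is_better(responses):
    """Taguchi larger-is-better S/N ratio for a list of replicate DOC values."""
    return -10.0 * math.log10(sum(1.0 / (y * y) for y in responses) / len(responses))

# Placeholder DOC values, one replicate per run (not measured data).
doc = {run: 50.0 + 5.0 * sum(run) for run in runs}

# Mean S/N per level of the first factor (e.g. temperature) points to its best setting.
for lvl in levels:
    sns = [sn_larger_is_better([doc[r]]) for r in runs if r[0] == lvl]
    print("factor-1 level", lvl, "mean S/N:", round(sum(sns) / len(sns), 2))
```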
In data mining, classification is a form of data analysis that can be used to extract models describing important data classes. Two well-known algorithms used in data-mining classification are the Backpropagation Neural Network (BNN) and Naïve Bayesian (NB). This paper investigates the performance of these two classification methods on the Car Evaluation dataset. A model was built for each algorithm and the results were compared. Our experimental results indicate that the BNN classifier yields higher accuracy than the NB classifier, but it is less efficient because it is time-consuming and difficult to analyse due to its black-box implementation.
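A rough version of this comparison can be sketched with scikit-learn, using MLPClassifier as a stand-in for the backpropagation neural network and CategoricalNB for Naïve Bayes; the file name car.data, the column names, and the hyper-parameters are assumptions, not details taken from the paper.

```python
# Sketch of the BNN vs NB comparison on the UCI Car Evaluation data using
# scikit-learn stand-ins. The CSV path, column names and hyper-parameters
# are assumed for illustration.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import OrdinalEncoder
from sklearn.naive_bayes import CategoricalNB
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

cols = ["buying", "maint", "doors", "persons", "lug_boot", "safety", "class"]
data = pd.read_csv("car.data", names=cols)          # assumed local copy of the dataset

X = OrdinalEncoder().fit_transform(data[cols[:-1]])  # categorical features -> integer codes
y = data["class"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("BNN", MLPClassifier(hidden_layer_sizes=(32,), max_iter=500)),
                  ("NB", CategoricalNB())]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", round(accuracy_score(y_te, clf.predict(X_te)), 3))
```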
Nowadays, mobile communication networks have become a consistent part of our everyday life, transferring huge amounts of data between communicating devices, which leads to new challenges. According to the Cisco Networking Index, more than 29.3 billion networked devices will be connected to the network during the year 2023. It is obvious that the existing infrastructure of current networks will not be able to support all of the generated data because of bandwidth limits and processing and transmission overhead. To cope with these issues, future mobile communication networks must meet strict requirements to reduce the amount of transferred data and to decrease latency and computation costs. One of the essential challenging tasks in this subject
As a result of numerous applications and low installation costs, wireless sensor networks (WSNs) have expanded enormously. The main concern in the WSN environment is to lower the energy consumption of nodes while preserving an acceptable level of service quality. Using multiple mobile sinks to reduce the nodes' energy consumption has been considered an efficient strategy. In such networks, the dynamic network topology created by the sinks' mobility makes it a challenging task to deliver data to the sinks. Thus, in order to provide efficient data dissemination, the sensor nodes have to readjust their routes to the current position of the mobile sinks. The route re-adjustment process could result in a significant m
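The route re-adjustment idea can be illustrated with a small sketch in which each sensor re-selects the nearest mobile sink whenever a sink announces a new position; the Euclidean-distance metric and the names used are stand-ins, not the protocol's actual routing criterion.

```python
# Illustrative route re-adjustment toward mobile sinks: when a sink moves,
# each sensor re-selects the closest sink. Euclidean distance is a stand-in
# for whatever metric the actual dissemination protocol uses.
import math

def closest_sink(node_pos, sink_positions):
    """Return the id of the mobile sink currently nearest to the node."""
    return min(sink_positions, key=lambda sid: math.dist(node_pos, sink_positions[sid]))

sinks = {"sink_a": (10.0, 5.0), "sink_b": (80.0, 40.0)}
node = (15.0, 12.0)
print(closest_sink(node, sinks))      # route toward sink_a

sinks["sink_a"] = (90.0, 90.0)        # sink moves: routes must be readjusted
print(closest_sink(node, sinks))      # now route toward sink_b
```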
Progress in computer networks and the emergence of new technologies in this field have helped produce new protocols and frameworks that provide new computer-network-based services. E-government services, a modernized version of conventional government, are created through the steady evolution of technology in addition to the growing need of societies for numerous services. Government services are deeply related to citizens' daily lives; therefore, it is important to evolve with technological developments and to move from the traditional methods of managing government work to cutting-edge technical approaches that improve the effectiveness of government systems in providing services to citizens. Blockchain technology is amon
... Show MoreIn this paper, an algorithm through which we can embed more data than the
regular methods under spatial domain is introduced. We compressed the secret data
using Huffman coding and then this compressed data is embedded using laplacian
sharpening method.
We used Laplace filters to determine the effective hiding places, then based on
threshold value we found the places with the highest values acquired from these filters
for embedding the watermark. In this work our aim is increasing the capacity of
information which is to be embedded by using Huffman code and at the same time
increasing the security of the algorithm by hiding data in the places that have highest
values of edges and less noticeable.
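A minimal sketch of the location-selection and embedding step is given below: a Laplacian filter highlights edge pixels, positions whose absolute response exceeds a threshold are ranked strongest-first, and one secret bit is placed in each selected pixel's least significant bit. The threshold value, the use of scipy.ndimage.laplace, and the example bits standing in for Huffman-coded data are assumptions for illustration, not the paper's exact procedure.

```python
# Sketch of Laplacian-based selection of hiding places followed by LSB
# embedding. The threshold and the secret bits are placeholders; Huffman
# compression of the secret is assumed to have been done beforehand.
import numpy as np
from scipy.ndimage import laplace

def embedding_positions(cover, threshold=20.0):
    """Return (row, col) positions ordered by decreasing Laplacian magnitude."""
    response = np.abs(laplace(cover.astype(np.float64)))
    rows, cols = np.where(response > threshold)
    order = np.argsort(-response[rows, cols])          # strongest edges first
    return list(zip(rows[order], cols[order]))

def embed_bits(cover, bits, threshold=20.0):
    stego = cover.copy()
    positions = embedding_positions(cover, threshold)
    for bit, (r, c) in zip(bits, positions):           # one bit per selected pixel LSB
        stego[r, c] = (stego[r, c] & 0xFE) | bit
    return stego

cover = (np.random.rand(64, 64) * 255).astype(np.uint8)
secret_bits = [1, 0, 1, 1, 0, 0, 1]                    # stand-in for Huffman-coded data
stego = embed_bits(cover, secret_bits)
```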
The perform