Background/Objectives: This research proposes a modified image representation framework for Content-Based Image Retrieval (CBIR) built on gray-scale input images, Zernike Moments (ZMs) properties, Local Binary Pattern (LBP), the Y color space, the Slantlet Transform (SLT), and the Discrete Wavelet Transform (DWT). Methods/Statistical analysis: This study surveyed and analysed three standard datasets: WANG V1.0, WANG V2.0, and Caltech 101. Caltech 101 features images of objects belonging to 101 classes, with approximately 40-800 images per category. The framework proposed in the study describes and operationalizes a CBIR system through an automated feature extraction system based on a CNN architecture. Findings: The results obtained by the investigated CBIR system, set against the benchmark results, clearly indicate that the proposed technique performed best, with an overall accuracy of 88.29%, as opposed to the other datasets adopted in the experiments. These outstanding results clearly indicate that the proposed method was effective for all the datasets. Improvements/Applications: The study found that the multiple image representation was redundant for extraction accuracy, and the findings indicated that automatically retrieved features are capable of and reliable in generating accurate outcomes.
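The abstract does not specify the CNN architecture or the retrieval pipeline, so the following is only a minimal sketch of CNN-based feature extraction and similarity ranking for CBIR; the choice of ResNet-18, the gray-scale preprocessing step, and cosine-similarity ranking are illustrative assumptions, not the paper's method.

```python
# Minimal CBIR sketch: extract CNN features for a query image and rank a
# gallery by cosine similarity. ResNet-18 stands in for whatever CNN the
# paper actually uses (an assumption); torchvision is required.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = torch.nn.Identity()           # keep the 512-d pooled features
model.eval()

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.Grayscale(num_output_channels=3),  # gray-scale input, as in the abstract
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def extract(path: str) -> torch.Tensor:
    """Return an L2-normalized feature vector for one image."""
    x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        f = model(x).squeeze(0)
    return f / f.norm()

def rank(query: str, gallery: list[str]) -> list[tuple[str, float]]:
    """Rank gallery images by cosine similarity to the query."""
    q = extract(query)
    scored = [(p, float(q @ extract(p))) for p in gallery]
    return sorted(scored, key=lambda s: s[1], reverse=True)
```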
Multi-point forming (MPF) is an advanced flexible manufacturing technology that stems from the idea of separating the whole die into small punches whose heights can be adjusted individually. Applying this idea to the traditional rigid blank-holder yields the flexible blank-holder (FBH) concept. In this work, the performance of a multi-point die is investigated with pins in a square matrix and a suitable blank holder. Each pin in the punch holder can undergo significant movement according to the die height, and a different load is applied to it through a spring, depending on the spring stiffness. The results show a roughly 90% reduction in setup time with respect to the traditional single-point incremental forming process, and also show duri…
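The abstract states only that each pin carries a load applied through a spring and governed by its stiffness, which is Hooke's law; a minimal worked form of that relation, with symbols assumed for illustration:

```latex
% Load on pin i (Hooke's law; symbols are illustrative assumptions):
%   k_i -- stiffness of the spring under pin i
%   z_0 -- unloaded reference height, z_i -- pin height set by the die surface
F_i = k_i \, (z_0 - z_i)
```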
A botnet is a malicious activity that tries to disrupt service traffic on a server or network and causes great harm to the network. In recent years, botnets have become one of the constantly evolving threats. An intrusion detection system (IDS) is one type of solution used to detect network anomalies, and it has played an increasing role in computer security and information systems. It tracks different events on a computer to decide whether an intrusion has occurred, and it is used to build strategic decisions for security purposes. The current paper…
This paper presents a simulation of the Linguistic Fuzzy Trust Model (LFTM) over oscillating Wireless Sensor Networks (WSNs), in which the goodness of the servers belonging to the network can change over time. It also compares the outcomes achieved by the LFTM model over oscillating WSNs with those obtained by applying the model over static WSNs, where the servers always maintain the same goodness, in terms of the selection percentage of trustworthy servers (the accuracy of the model) and the average path length. The paper additionally compares the LFTM with the Bio-inspired Trust and Reputation Model for Wireless Sensor Network…
Energy and memory limitations are considerable constraints of sensor nodes in wireless sensor networks (WSNs). The limited energy supplied to network nodes causes WSNs to face crucial functional limitations; therefore, the problem of limited energy on sensor nodes can only be addressed by using it efficiently. In this research work, an energy-balancing routing scheme for in-network data aggregation is presented, referred to as the Energy-aware and load-Balancing Routing scheme for Data Aggregation (EBR-DA). EBR-DA aims to provide energy-efficient multi-hop routing to the destination on the basis of the quality of the links between the source and destination. In…
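The abstract says EBR-DA picks multi-hop routes from link quality and energy balance but does not give its cost function; the weighted next-hop score below (residual energy blended with link quality via a weight `alpha`) is purely an illustrative assumption, not the scheme's actual metric.

```python
# Illustrative next-hop choice for an energy-balancing routing scheme:
# score each neighbour by residual energy and link quality and forward
# to the best one. The weighting is an assumption, not EBR-DA's metric.
from dataclasses import dataclass

@dataclass
class Neighbor:
    node_id: int
    residual_energy: float   # joules remaining
    link_quality: float      # e.g. packet delivery ratio in [0, 1]

def pick_next_hop(neighbors: list[Neighbor], alpha: float = 0.5) -> Neighbor:
    """Blend normalized residual energy with link quality; higher is better."""
    e_max = max(n.residual_energy for n in neighbors)
    return max(
        neighbors,
        key=lambda n: alpha * n.residual_energy / e_max
                      + (1 - alpha) * n.link_quality,
    )

hop = pick_next_hop([Neighbor(1, 4.2, 0.9), Neighbor(2, 7.5, 0.6)])
```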
In this paper, an algorithm that can embed more data than regular spatial-domain methods is introduced. The secret data are first compressed using Huffman coding, and the compressed data are then embedded using the Laplacian sharpening method. Laplace filters are used to determine the effective hiding places; based on a threshold value, the places with the highest values acquired from these filters are selected for embedding the watermark. The aim of this work is to increase the capacity of the embedded information by using Huffman coding while at the same time increasing the security of the algorithm by hiding the data in the places with the highest edge values, where changes are less noticeable. The perform…
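As a rough illustration of the pipeline just described (Huffman-compress the payload, then embed at strong Laplacian-edge locations), here is a hedged sketch; the 3x3 Laplacian kernel, LSB embedding, and the highest-response selection rule are assumptions, since the abstract does not give the exact filter, threshold policy, or embedding rule.

```python
# Sketch of the described pipeline: Huffman-code the secret, locate strong
# edges with a Laplacian filter, and embed the bits at the highest-response
# pixels. LSB embedding and the 3x3 kernel are assumptions for illustration.
import heapq
from collections import Counter
import numpy as np
from scipy.ndimage import convolve

LAPLACIAN = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]])

def huffman_bits(data: bytes) -> str:
    """Return the Huffman-coded bitstring of `data`."""
    heap = [[w, i, [sym, ""]] for i, (sym, w) in enumerate(Counter(data).items())]
    heapq.heapify(heap)
    i = len(heap)
    while len(heap) > 1:
        lo, hi = heapq.heappop(heap), heapq.heappop(heap)
        for pair in lo[2:]:
            pair[1] = "0" + pair[1]
        for pair in hi[2:]:
            pair[1] = "1" + pair[1]
        heapq.heappush(heap, [lo[0] + hi[0], i] + lo[2:] + hi[2:])
        i += 1
    code = {sym: bits for sym, bits in heap[0][2:]}
    return "".join(code[b] for b in data)

def embed(cover: np.ndarray, secret: bytes) -> np.ndarray:
    """Hide the compressed secret in the LSBs of the strongest edge pixels."""
    bits = huffman_bits(secret)
    assert len(bits) <= cover.size, "payload exceeds cover capacity"
    response = np.abs(convolve(cover.astype(float), LAPLACIAN))
    # Highest-response pixels first; take just enough to hold the payload.
    order = np.argsort(response, axis=None)[::-1][: len(bits)]
    stego = cover.copy()
    flat = stego.reshape(-1)
    payload = np.array([int(b) for b in bits], dtype=cover.dtype)
    flat[order] = (flat[order] & 0xFE) | payload
    return stego
```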
The usefulness of remote sensing techniques in environmental engineering and other sciences lies in saving time, cost, and effort, and in collecting more accurate information under a monitoring mechanism. In this research, a number of statistical models were used to determine the best relationships between each water quality parameter and the mean reflectance values generated for different channels of a radiometer operated to simulate the Thematic Mapper satellite image. Among these models are the regression models, which enable us to ascertain and utilize a relation between a variable of interest, called the dependent variable, and one or more independent variables.
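The study fits regressions between each water quality parameter and mean band reflectance; a minimal sketch of one such simple linear regression follows. The variable names and the numeric values are placeholders for illustration only; the actual bands, parameters, and data come from the study.

```python
# Simple linear regression of one water quality parameter on the mean
# reflectance of one radiometer band; all values below are placeholders.
import numpy as np

mean_reflectance = np.array([0.12, 0.18, 0.25, 0.31, 0.40])  # one TM-like band
turbidity_ntu    = np.array([3.1, 5.0, 7.9, 10.2, 13.8])     # dependent variable

# Ordinary least squares fit: turbidity ~ slope * reflectance + intercept
slope, intercept = np.polyfit(mean_reflectance, turbidity_ntu, deg=1)
predicted = slope * mean_reflectance + intercept

# Coefficient of determination R^2 as a goodness-of-fit check
ss_res = np.sum((turbidity_ntu - predicted) ** 2)
ss_tot = np.sum((turbidity_ntu - turbidity_ntu.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
```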
In data mining, classification is a form of data analysis that can be used to extract models describing important data classes. Two of the well-known algorithms used in data mining classification are the Backpropagation Neural Network (BNN) and Naïve Bayesian (NB). This paper investigates the performance of these two classification methods on the Car Evaluation dataset. A model was built for each algorithm and the results were compared. Our experimental results indicated that the BNN classifier yields higher accuracy than the NB classifier, but it is less efficient because it is time-consuming and difficult to analyse due to its black-box nature.
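A minimal sketch of the comparison the paper describes, using scikit-learn stand-ins: MLPClassifier (trained by backpropagation) for the BNN and CategoricalNB for Naïve Bayes. The file path, encoding scheme, and train/test split are assumptions, not the paper's experimental setup.

```python
# Compare a backpropagation neural network with Naive Bayes on the UCI
# Car Evaluation dataset (all features categorical). Stand-in models.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import OrdinalEncoder
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import CategoricalNB
from sklearn.metrics import accuracy_score

cols = ["buying", "maint", "doors", "persons", "lug_boot", "safety", "class"]
df = pd.read_csv("car.data", names=cols)           # assumed local UCI file

X = OrdinalEncoder().fit_transform(df[cols[:-1]])  # index-encode categories
y = df["class"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("BNN", MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000)),
                  ("NB",  CategoricalNB())]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```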
Regression analysis is the foundation stone of statistical knowledge, and it mostly depends on the ordinary least squares method. As is well known, however, that method requires several conditions to hold in order to operate accurately; otherwise its results can be unreliable, and the absence of certain conditions makes it impossible to complete the analysis. Among those conditions is the multicollinearity problem, and we are concerned here with detecting that problem among the independent variables using the Farrar-Glauber test. In addition to the requirement that the data be linear, the failure of the last condition led to resorting to the…
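The abstract names the Farrar-Glauber test for detecting multicollinearity; its first stage is a chi-square test on the determinant of the correlation matrix of the regressors. A minimal sketch of that statistic, with placeholder data for illustration:

```python
# First stage of the Farrar-Glauber test: a chi-square statistic on the
# determinant of the correlation matrix of the independent variables.
# X rows are observations, columns are regressors; data are placeholders.
import numpy as np
from scipy import stats

def farrar_glauber_chi2(X: np.ndarray) -> tuple[float, float]:
    n, k = X.shape
    R = np.corrcoef(X, rowvar=False)      # correlation matrix of regressors
    chi2 = -(n - 1 - (2 * k + 5) / 6) * np.log(np.linalg.det(R))
    dof = k * (k - 1) / 2
    p_value = stats.chi2.sf(chi2, dof)
    return chi2, p_value                  # small p => multicollinearity present

rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
X = np.column_stack([x1, x1 + 0.05 * rng.normal(size=100), rng.normal(size=100)])
chi2, p = farrar_glauber_chi2(X)          # near-collinear columns => tiny p
```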
The objective of this work is to design and implement a cryptography system that enables the sender to send a message through any channel (even an insecure one) and the receiver to decrypt the received message, without allowing any intruder to break the system and extract the secret information. In this work, we implement an interaction between a feedforward neural network and a stream cipher, so the secret message is encrypted by an unsupervised neural network method in addition to the first encryption process, which is performed by the stream cipher method. The security of any cipher system depends on the security of the related keys (those used by the encryption and decryption processes) and their corresponding le…
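The abstract pairs a stream cipher stage with an unsupervised feedforward-network stage but gives neither construction; the sketch below shows only a generic XOR stream cipher with a hash-based keystream, an illustrative stand-in rather than the paper's cipher, and the neural stage is omitted entirely.

```python
# Generic XOR stream cipher stage: derive a keystream from a shared key and
# nonce, then XOR it with the message. SHA-256 in counter mode is only an
# illustrative stand-in for the paper's keystream generator.
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def xor_cipher(message: bytes, key: bytes, nonce: bytes) -> bytes:
    ks = keystream(key, nonce, len(message))
    return bytes(m ^ k for m, k in zip(message, ks))

ct = xor_cipher(b"secret message", b"shared key", b"nonce-01")
pt = xor_cipher(ct, b"shared key", b"nonce-01")   # decryption is the same op
```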