Data Aggregation in Wireless Sensor Networks Using Modified Voronoi Fuzzy Clustering Algorithm

A data-centric technique, data aggregation via a modified fuzzy clustering algorithm combined with a Voronoi diagram, called the modified Voronoi Fuzzy Clustering Algorithm (VFCA), is presented in this paper. In the modified algorithm, the sensed area is divided into a number of Voronoi cells by applying a Voronoi diagram, and these cells are clustered by the fuzzy C-means (FCM) method to reduce the transmission distance. An appropriate cluster head (CH) is then elected for each cluster. Three parameters are used for this election: the node energy, the distance between the CH and its neighboring sensors, and the packet-loss value. Furthermore, data aggregation is employed at each CH to reduce the amount of transmitted data, which extends the network lifetime and reduces the traffic that may accumulate in the buffer of the sink node. Each cluster head collects data from its members and forwards them to the sink node. A comparative study between the modified VFCA and the LEACH protocol shows that the modified VFCA is more efficient than LEACH in terms of network lifetime and average energy consumption. Another comparative study between the modified VFCA and the K-means clustering algorithm shows that the modified VFCA is more efficient than K-means in terms of packets transmitted to the sink node, buffer utilization, packet loss, and running time. The simulation was developed and tested in MATLAB R2010a on a computer with the following specifications: Windows 7 (32-bit operating system), Core i7 CPU, 4 GB RAM, and a 1 TB hard disk.
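As a rough illustration of two of the steps described in the abstract, the Python sketch below clusters sensor positions with fuzzy C-means and then elects one cluster head per cluster from energy, distance, and packet-loss values. It is only a sketch: the Voronoi partitioning step is omitted, the scoring weights and value ranges are assumptions rather than the paper's formula, and the original simulation was written in MATLAB rather than Python.

import numpy as np

def fuzzy_c_means(points, n_clusters, m=2.0, n_iter=100, seed=0):
    # Plain fuzzy C-means: returns cluster centres and the membership matrix.
    rng = np.random.default_rng(seed)
    u = rng.random((len(points), n_clusters))
    u /= u.sum(axis=1, keepdims=True)               # rows sum to 1
    for _ in range(n_iter):
        um = u ** m
        centres = (um.T @ points) / um.sum(axis=0)[:, None]
        d = np.linalg.norm(points[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        u = inv / inv.sum(axis=1, keepdims=True)    # standard FCM membership update
    return centres, u

def elect_cluster_heads(points, energy, packet_loss, u,
                        w_energy=0.5, w_dist=0.3, w_loss=0.2):
    # One head per cluster; the weighted score is an assumed combination of the
    # three criteria named in the abstract, not the paper's exact formula.
    labels, heads = u.argmax(axis=1), {}
    for k in range(u.shape[1]):
        members = np.flatnonzero(labels == k)
        if members.size == 0:
            continue
        pts = points[members]
        mean_dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2).mean(axis=1)
        score = w_energy * energy[members] - w_dist * mean_dist - w_loss * packet_loss[members]
        heads[k] = members[score.argmax()]
    return heads

rng = np.random.default_rng(1)
nodes = rng.random((100, 2)) * 100.0                # sensor positions in a 100 m x 100 m field
energy = rng.uniform(0.5, 1.0, 100)                 # residual energy per node (assumed units)
loss = rng.uniform(0.0, 0.1, 100)                   # observed packet-loss rate per node
centres, u = fuzzy_c_means(nodes, n_clusters=5)
print(elect_cluster_heads(nodes, energy, loss, u))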
Publication Date
Thu Aug 01 2019
Journal Name
Journal Of Economics And Administrative Sciences
Some Nonparametric Estimators for Right-Censored Survival Data

The use of parametric models and their associated estimation methods requires that many preliminary conditions be satisfied for those models to represent the population under study adequately. This has prompted researchers to look for more flexible models, namely nonparametric ones, and many researchers are interested in the survival (permanence) function and its nonparametric estimation methods.

For the purpose of statistical inference about the distribution of lifetimes when the data are censored, the experimental section of this work compares nonparametric estimators of the survival function, with the existence …
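The abstract does not name the estimators being compared, but the Kaplan-Meier (product-limit) estimator is the standard nonparametric estimator of the survival function for right-censored data; the short Python sketch below, with made-up example values, shows how it is computed.

import numpy as np

def kaplan_meier(times, events):
    # Kaplan-Meier estimate of the survival function for right-censored data.
    # times  : observed times (failure or censoring)
    # events : 1 if the failure was observed, 0 if the observation was censored
    times, events = np.asarray(times, float), np.asarray(events, int)
    surv, out = 1.0, []
    for t in np.unique(times[events == 1]):
        at_risk = np.sum(times >= t)                 # subjects still under observation at t
        deaths = np.sum((times == t) & (events == 1))
        surv *= 1.0 - deaths / at_risk               # product-limit update
        out.append((t, surv))
    return out

print(kaplan_meier([3, 5, 5, 8, 10, 12, 15], [1, 1, 0, 1, 0, 1, 1]))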

Publication Date
Sun Feb 25 2024
Journal Name
Baghdad Science Journal
Facial Emotion Images Recognition Based On Binarized Genetic Algorithm-Random Forest

Most human facial emotion recognition systems are assessed solely on accuracy, even though other performance criteria, such as sensitivity, precision, F-measure, and G-mean, are also considered important in the evaluation process. Moreover, the most common problem that must be resolved in facial emotion recognition systems is feature extraction: traditional manual feature extraction methods are not able to extract features efficiently; in other words, they produce a redundant set of insignificant features, which degrades classification performance. In this work, a new system for recognizing human facial emotions from images is proposed. The HOG (Histograms of Oriented Gradients) …
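The title and excerpt mention HOG features and a Random Forest classifier; the Python sketch below shows that basic pipeline with scikit-image and scikit-learn, leaving out the binarized-GA feature selection step. The 48 x 48 grayscale face crops, the random placeholder data, and the seven emotion classes are assumptions for illustration, not details taken from the paper.

import numpy as np
from skimage.feature import hog
from sklearn.ensemble import RandomForestClassifier

def hog_features(images):
    # HOG descriptor for each grayscale face crop
    return np.array([hog(img, orientations=9, pixels_per_cell=(8, 8),
                         cells_per_block=(2, 2)) for img in images])

rng = np.random.default_rng(0)
faces = rng.random((40, 48, 48))                 # placeholder face crops
labels = rng.integers(0, 7, 40)                  # e.g. seven basic emotion classes

X = hog_features(faces)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
print(clf.score(X, labels))                      # training accuracy on the toy data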

Publication Date
Sat Jul 31 2021
Journal Name
Iraqi Journal Of Science
A Decision Tree-Aware Genetic Algorithm for Botnet Detection

In this paper, the botnet detection problem is defined as a feature selection problem, and the genetic algorithm (GA) is used to search the entire feature space for the most significant combination of features. Furthermore, the Decision Tree (DT) classifier is used as the objective function that directs the proposed GA toward combinations of features that correctly classify activities as normal traffic or botnet attacks. Two datasets, namely UNSW-NB15 and the Canadian Institute for Cybersecurity Intrusion Detection System 2017 dataset (CICIDS2017), are used for evaluation. The results reveal that the proposed DT-aware GA can effectively find the relevant features from …
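A minimal Python sketch of this kind of wrapper feature selection is given below: each individual is a bit mask over the features, and its fitness is the cross-validated accuracy of a decision tree trained on the selected columns. It uses a synthetic scikit-learn dataset as a stand-in for UNSW-NB15/CICIDS2017, and the population size, operators, and rates are assumptions, not the paper's settings.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=30, n_informative=8, random_state=0)
rng = np.random.default_rng(0)

def fitness(mask):
    # DT accuracy on the selected feature subset, i.e. the GA's objective function
    if mask.sum() == 0:
        return 0.0
    return cross_val_score(DecisionTreeClassifier(random_state=0),
                           X[:, mask.astype(bool)], y, cv=3).mean()

def evolve(pop_size=20, generations=15, p_mut=0.05):
    pop = rng.integers(0, 2, (pop_size, X.shape[1]))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        # tournament selection of parents
        parents = pop[[max(rng.choice(pop_size, 3, replace=False), key=lambda i: scores[i])
                       for _ in range(pop_size)]]
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):           # single-point crossover on pairs
            cut = rng.integers(1, X.shape[1])
            children[i, cut:], children[i + 1, cut:] = (parents[i + 1, cut:].copy(),
                                                        parents[i, cut:].copy())
        children[rng.random(children.shape) < p_mut] ^= 1   # bit-flip mutation
        pop = children
    scores = np.array([fitness(ind) for ind in pop])
    return pop[scores.argmax()], scores.max()

mask, acc = evolve()
print("selected features:", np.flatnonzero(mask), "cv accuracy:", round(acc, 3))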

Publication Date
Tue Nov 01 2016
Journal Name
2016 International Conference On Advances In Electrical, Electronic And Systems Engineering (ICAEES)
Efficient routing algorithm for VANETs based on distance factor

There has been a great deal of research into the considerable challenge of managing traffic at road junctions; its application to vehicular ad hoc networks (VANETs) has proved to be of great interest in the developed world. Dynamic topology is one of the vital challenges facing VANETs; as a result, routing packets to their destination successfully and efficiently is not a simple undertaking. This paper presents MDORA, an efficient and uncomplicated algorithm enabling intelligent wireless vehicular communications. MDORA is a robust routing algorithm that facilitates reliable routing through communication between vehicles. As a position-based routing technique, the MDORA algorithm uses vehicles' precise locations to establish the …
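For illustration, the small Python function below shows the general shape of greedy position-based forwarding: among in-range neighbours, choose the one that makes the most progress toward the destination. This is a generic sketch only; MDORA's actual distance-factor metric and route discovery are not described in the excerpt, and the radio range is an assumed value.

import math

def next_hop(current, destination, neighbours, radio_range=250.0):
    # Positions are (x, y) tuples in metres.
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    to_dest = dist(current, destination)
    best, best_progress = None, 0.0
    for node in neighbours:
        if dist(current, node) > radio_range:
            continue                              # neighbour is out of radio range
        progress = to_dest - dist(node, destination)
        if progress > best_progress:              # keep the neighbour closest to the destination
            best, best_progress = node, progress
    return best                                   # None: no neighbour makes forward progress

print(next_hop((0, 0), (1000, 0), [(200, 50), (150, -30), (300, 0)]))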

Publication Date
Thu Apr 25 2019
Journal Name
Engineering And Technology Journal
Improvement of Harris Algorithm Based on Gaussian Scale Space

Features are descriptions of the image content, such as corners, blobs, or edges. Corners are among the most important features for describing an image; therefore, there are many algorithms for detecting them, such as Harris, FAST, and SUSAN. Harris is an efficient and accurate corner detection method; it is rotation invariant, but it is not scale invariant. This paper presents an efficient Harris corner detector that is invariant to scale, an improvement achieved by applying a Gaussian function at different scales. The experimental results illustrate that using the Gaussian (linear) scale space is very useful for dealing with this weakness of Harris.
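A minimal Python sketch of the idea, using scikit-image: run the Harris detector on several Gaussian-smoothed levels of the image and keep the scale at which each corner responds. The choice of sigma values and the peak-detection settings are assumptions for illustration, not the paper's parameters.

from scipy.ndimage import gaussian_filter
from skimage import data
from skimage.feature import corner_harris, corner_peaks

def multiscale_harris(image, sigmas=(1.0, 2.0, 4.0), min_distance=5):
    corners = []
    for s in sigmas:
        level = gaussian_filter(image.astype(float), sigma=s)   # one level of the Gaussian scale space
        response = corner_harris(level)                         # Harris response at this scale
        for r, c in corner_peaks(response, min_distance=min_distance):
            corners.append((c, r, s))                           # (x, y, scale at which it fired)
    return corners

print(len(multiscale_harris(data.camera())))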

Publication Date
Tue Oct 01 2013
Journal Name
2013 IEEE International Conference On Systems, Man, And Cybernetics
AWSS: An Algorithm for Measuring Arabic Word Semantic Similarity

Publication Date
Sun Jan 01 2012
Journal Name
International Journal Of Cyber-security And Digital Forensics (IJCSDF)
Genetic Algorithm Approach for Risk Reduction of Information Security

Nowadays, information systems constitute a crucial part of organizations; by losing security, these organizations also lose plenty of competitive advantages. The core of information security (InfoSecu) is risk management. There is a great deal of research work, and there are standards, in information security risk management (ISRM), including NIST 800-30 and ISO/IEC 27005. However, only a few research works focus on InfoSecu risk reduction, while the standards explain general principles and guidelines and do not provide implementation details regarding ISRM; as such, reducing InfoSecu risks in uncertain environments is painstaking. Thus, this paper applies a genetic algorithm (GA) to InfoSecu risk reduction under uncertainty. Finally, the …

Publication Date
Thu Aug 31 2023
Journal Name
Journal Of Kufa For Mathematics And Computer
Hiding a Secret Message Encrypted by S-DES Algorithm

Nowadays, it is quite usual to transmit data through the internet, making safe online communication essential; transmitting data over internet channels requires maintaining its confidentiality and protecting the integrity of the transmitted data from unauthorized individuals. The two most common techniques for providing security are cryptography and steganography. Cryptography converts data from a readable format into an unreadable one, while steganography hides sensitive information in digital media, including images, audio, and video. In the proposed system, both encryption and hiding techniques are utilized. This study presents encryption using the S-DES algorithm, which generates a new key in each cycle …
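As an illustration of the hiding half of such a system, the Python sketch below embeds a ciphertext (here just a stand-in byte string, not real S-DES output) in the least-significant bits of a grayscale cover image and recovers it again. LSB embedding is assumed purely for illustration; the excerpt does not say which hiding method the proposed system actually uses.

import numpy as np

def embed_lsb(cover, payload):
    # Overwrite the least-significant bit of the first len(payload)*8 pixels.
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = cover.flatten()                       # flatten() copies, so the cover is untouched
    if bits.size > flat.size:
        raise ValueError("payload too large for this cover image")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(cover.shape)

def extract_lsb(stego, n_bytes):
    # Read the LSBs back and repack them into bytes.
    return np.packbits(stego.flatten()[:n_bytes * 8] & 1).tobytes()

cover = np.random.default_rng(0).integers(0, 256, (64, 64), dtype=np.uint8)
ciphertext = b"\x3a\x91\x07"                     # stand-in for the S-DES output
stego = embed_lsb(cover, ciphertext)
assert extract_lsb(stego, len(ciphertext)) == ciphertext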

Publication Date
Mon May 11 2020
Journal Name
Baghdad Science Journal
A Cryptosystem for Database Security Based on TSFS Algorithm

The implementation of the TSFS (Transposition, Substitution, Folding, and Shifting) algorithm as an encryption algorithm for database security had limitations in its character set and in the number of keys used. The proposed cryptosystem is based on enhancing the phases of the TSFS encryption algorithm by computing the determinant of the key matrices, which affects how the algorithm's phases are carried out. These changes showed high security for the database against different types of attacks by achieving the goals of both confusion and diffusion.
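The abstract says the enhancement computes the determinant of the key matrices but does not give the exact rule, so the Python fragment below only illustrates the usual reason such a determinant is checked in matrix-based ciphers: the matrix must be invertible modulo the alphabet size, otherwise the transformation cannot be undone at decryption time.

import math
import numpy as np

def key_matrix_is_valid(key, modulus=26):
    # The key matrix is usable only if its determinant is nonzero and coprime
    # with the modulus, i.e. the matrix is invertible mod `modulus`.
    det = int(round(np.linalg.det(np.asarray(key, dtype=float))))
    return det != 0 and math.gcd(det % modulus, modulus) == 1

print(key_matrix_is_valid([[3, 3], [2, 5]]))     # det = 9, gcd(9, 26) = 1  -> True
print(key_matrix_is_valid([[2, 4], [1, 2]]))     # det = 0                  -> False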

Publication Date
Wed Jan 01 2020
Journal Name
AIP Conference Proceedings
Developing a lightweight cryptographic algorithm based on DNA computing

This work aims to develop a secure lightweight cipher algorithm for constrained devices. Secure communication among constrained devices is a critical issue during data transmission from client to server devices. Lightweight cipher algorithms are defined as a security solution for constrained devices that require low-complexity computation and little memory. In contrast, most lightweight algorithms suffer from a trade-off between complexity and speed when trying to produce a robust cipher. The PRESENT cipher has been successfully used as a lightweight cryptographic algorithm, as it surpasses other ciphers in terms of the low complexity of the operations its computation requires. The mathematical model of …
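PRESENT is a substitution-permutation network, and the toy Python round below only illustrates that general structure (key mixing, a 4-bit S-box layer, then a bit permutation) on a 16-bit block. The S-box values are PRESENT's published 4-bit S-box, but the block size, permutation, and single round here are simplifications for illustration; this is neither the PRESENT specification nor the paper's DNA-based variant.

SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]                 # PRESENT's 4-bit S-box
PERM = [0, 4, 8, 12, 1, 5, 9, 13, 2, 6, 10, 14, 3, 7, 11, 15]   # toy 16-bit bit permutation

def spn_round(block, round_key):
    block ^= round_key                                    # key mixing
    # substitution layer: the S-box applied to each 4-bit nibble
    block = sum(SBOX[(block >> (4 * i)) & 0xF] << (4 * i) for i in range(4))
    # permutation layer: bit i moves to position PERM[i]
    out = 0
    for i in range(16):
        out |= ((block >> i) & 1) << PERM[i]
    return out

print(hex(spn_round(0x1234, 0xFFFF)))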
