The Honeywords approach has proven to be an effective tool for strengthening password security. The proposed system combines the Meerkat Clan Algorithm (MCA) with WordNet to generate honeywords, using WordNet as a data source to improve the authenticity and diversity of the generated decoys. Generation proceeds in consecutive stages: tokenizing the password, forming alphabet tokens with the MCA, handling digit tokens, creating special-character tokens, and consolidating the tokens into honeywords. MCA performance is tuned through careful parameter selection. The experimental findings show high precision and strong performance in proposing words with similar meanings, predicting digit values, and producing distinctive symbols; this is attributable to the quality of the data, the judicious choice of algorithms, and iterative refinement of the results. Reaching the required accuracy and functionality depends on systematic experimentation, thorough testing of the system, and successive improvement. The empirical findings confirm that the MCA produces a varied and well-protected set of honeywords, most notably for alphabet tokens, which are generated independently and exhibit strong security characteristics. The analysis of correctness rates illustrates these results: for the password "Lion1999*" the average honeyword-generation accuracy reaches 0.729847632111541, for "house2000" it is 0.761325846711256, and over a sample of 100 passwords the mean accuracy is 0.7073897168887518. Together, these findings highlight the effectiveness of the MCA in generating honeywords with improved security characteristics.
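A minimal sketch of the token-based generation flow described above is given below, assuming a tiny hard-coded synonym table as a stand-in for the WordNet lookup and random substitution as a stand-in for the MCA selection; the function names (`tokenize`, `mutate_token`, `generate_honeywords`) are illustrative and not taken from the paper.

```python
import random
import re

# Hypothetical stand-in for the WordNet + MCA stage described in the paper:
# in the real system, candidate alphabet tokens come from WordNet and the
# Meerkat Clan Algorithm selects the fittest substitutions.
SYNONYMS = {"lion": ["tiger", "panther", "cougar"],
            "house": ["villa", "cabin", "lodge"]}

def tokenize(password):
    """Split a password into alphabet, digit, and special-character tokens."""
    return re.findall(r"[A-Za-z]+|\d+|[^A-Za-z\d]+", password)

def mutate_token(token):
    """Replace a token with a plausible decoy of the same category."""
    if token.isalpha():
        decoy = random.choice(SYNONYMS.get(token.lower(), [token]))
        return decoy.capitalize() if token[0].isupper() else decoy
    if token.isdigit():
        # keep the same length so the decoy looks like a real digit token
        return "".join(random.choice("0123456789") for _ in token)
    return "".join(random.choice("!@#$%^&*") for _ in token)

def generate_honeywords(password, k=5):
    """Assemble k honeywords by mutating each token and re-concatenating."""
    tokens = tokenize(password)
    return ["".join(mutate_token(t) for t in tokens) for _ in range(k)]

print(generate_honeywords("Lion1999*"))
```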
High frequency (HF) communications play an important role in long-distance wireless communication. This band is more valuable than VHF and UHF in this respect, because HF signals can cover longer distances with a single hop. Operating costs are low, since HF offers over-the-horizon communication without repeaters, so it can serve as a backup for satellite communications in emergency conditions. One of the main problems in HF communications is predicting the propagation direction and the frequency of optimum transmission (FOT) that must be used at a given time. This paper introduces a new technique based on an Oblique Ionosonde Station (OIS) to overcome this problem at low cost and in a simpler way. This technique uses the …
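For context, a rough sketch of how an operating frequency can be estimated from an ionospheric sounding is shown below; it uses the classical secant law and the common rule of thumb that the FOT is about 85% of the MUF, and it is not the OIS-based technique proposed in the paper (the function names and example values are illustrative).

```python
import math

def muf_secant_law(fo_f2_mhz, incidence_angle_deg):
    """Classical secant law: MUF = foF2 * sec(angle of incidence at the layer)."""
    return fo_f2_mhz / math.cos(math.radians(incidence_angle_deg))

def fot(muf_mhz, factor=0.85):
    """The FOT is commonly taken as roughly 85% of the MUF."""
    return factor * muf_mhz

# Example: critical frequency foF2 of 7 MHz, oblique incidence of 60 degrees.
muf = muf_secant_law(7.0, 60.0)
print(f"MUF ≈ {muf:.1f} MHz, FOT ≈ {fot(muf):.1f} MHz")
```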
In this work we present a technique to extract heart contours from noisy echocardiograph images. Our technique is based on improving the image before applying contour detection, in order to reduce the heavy noise and obtain better image quality. To do this, we combine several pre-processing techniques (filtering, morphological operations, and contrast adjustment) to avoid unclear edges and enhance the low contrast of echocardiograph images; after applying these techniques, traditional edge detection methods yield legible detection of the heart boundaries and valve movement.
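A minimal sketch of such a pre-processing chain is shown below, assuming OpenCV in Python; the file name and parameter values (median kernel size, CLAHE settings, Canny thresholds) are illustrative placeholders, not the values used in the paper.

```python
import cv2

# Hypothetical input file; parameters below are illustrative only.
img = cv2.imread("echo_frame.png", cv2.IMREAD_GRAYSCALE)
if img is None:
    raise SystemExit("could not read echo_frame.png")

# 1. Filtering: a median filter suppresses the heavy speckle noise.
den = cv2.medianBlur(img, 5)

# 2. Morphological opening/closing removes small artefacts and fills gaps.
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
morph = cv2.morphologyEx(den, cv2.MORPH_OPEN, kernel)
morph = cv2.morphologyEx(morph, cv2.MORPH_CLOSE, kernel)

# 3. Contrast adjustment: CLAHE boosts the low contrast of echo images.
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(morph)

# 4. Traditional edge detection on the enhanced image.
edges = cv2.Canny(enhanced, 50, 150)
cv2.imwrite("echo_edges.png", edges)
```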
Producing pseudo-random numbers (PRN) with high performance is an important issue that attracts many researchers today. This paper proposes pseudo-random number generator models that integrate a Hopfield Neural Network (HNN) with a fuzzy logic system to improve the randomness of the Hopfield pseudo-random generator. The fuzzy logic system is introduced to control the update of the HNN parameters. The proposed model is compared with three state-of-the-art baselines; analysis of the results using the National Institute of Standards and Technology (NIST) statistical tests and the ENT test shows that the proposed model is statistically significant in comparison to the baselines, which demonstrates the competency of the neuro-fuzzy based model to produce …
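Purely as a structural illustration, the sketch below extracts bits from an asynchronous Hopfield-style update while a crude crisp rule, standing in for the fuzzy controller, adjusts a gain parameter; it is not statistically strong, does not reproduce the paper's fuzzy system, and every name in it is hypothetical.

```python
import numpy as np

class HopfieldFuzzyPRNG:
    """Structural sketch only: bits from an asynchronous Hopfield-style update,
    with a crude rule standing in for the paper's fuzzy parameter controller."""

    def __init__(self, n=32, seed=1):
        rng = np.random.default_rng(seed)
        w = rng.standard_normal((n, n))
        self.W = (w + w.T) / 2.0                 # symmetric weights
        np.fill_diagonal(self.W, 0.0)            # zero self-coupling
        self.state = np.where(rng.standard_normal(n) > 0, 1.0, -1.0)
        self.bias = rng.standard_normal(n) * 0.1
        self.gain = 1.0                          # parameter the controller tunes

    def _controller(self, activity):
        # Stand-in for the fuzzy system: if too many neurons agree, weaken the
        # coupling; if too few, strengthen it (a crisp rule, not real fuzzy logic).
        if activity > 0.6:
            self.gain *= 0.9
        elif activity < 0.4:
            self.gain *= 1.1

    def next_bits(self, count=64):
        n = len(self.state)
        bits = []
        for t in range(count):
            i = t % n                            # deterministic neuron scan
            h = self.gain * (self.W[i] @ self.state) + self.bias[i]
            self.state[i] = 1.0 if h >= 0 else -1.0
            self._controller(float(np.mean(self.state > 0)))
            bits.append(int(self.state[i] > 0))
        return bits

print("".join(map(str, HopfieldFuzzyPRNG().next_bits(64))))
```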
Computer systems and networks are used in almost every aspect of our daily life; as a result, the security threats to computers and networks have also increased significantly. Traditionally, password-based user authentication is widely used to authenticate legitimate users, but this method has many loopholes such as password sharing, shoulder surfing, brute-force attacks, dictionary attacks, guessing, phishing, and many more. The aim of this paper is to enhance the password authentication method by presenting keystroke dynamics with a back-propagation neural network as a transparent layer of user authentication. Keystroke dynamics is one of the well-known and inexpensive behavioral biometric technologies, which identifies …
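A minimal sketch of this idea is shown below: dwell and flight times are extracted from key press/release timestamps and fed to a back-propagation-trained multilayer perceptron (scikit-learn's MLPClassifier is used here as a convenient stand-in); the timing samples and network size are made up for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def keystroke_features(key_down, key_up):
    """Dwell times (how long each key is held) and flight times (gaps between keys)."""
    down, up = np.asarray(key_down), np.asarray(key_up)
    dwell = up - down
    flight = down[1:] - up[:-1]
    return np.concatenate([dwell, flight])

# Hypothetical timing samples (seconds) for the same passphrase:
# each row is one typing session; label 1 = legitimate user, 0 = impostor.
X = np.array([
    keystroke_features([0.00, 0.31, 0.58, 0.91], [0.09, 0.40, 0.66, 1.00]),
    keystroke_features([0.00, 0.29, 0.55, 0.88], [0.08, 0.38, 0.64, 0.97]),
    keystroke_features([0.00, 0.45, 0.95, 1.60], [0.20, 0.70, 1.25, 1.95]),
    keystroke_features([0.00, 0.50, 1.05, 1.70], [0.22, 0.74, 1.30, 2.05]),
])
y = np.array([1, 1, 0, 0])

# Back-propagation-trained MLP acting as the transparent second layer
# on top of the password check.
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, y)
print(clf.predict(X))
```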
In this research a proposed technique is used to enhance the performance of the frame difference technique for extracting moving objects from a video file. One of the main causes of performance degradation is the presence of noise, which may lead to incorrect identification of moving objects, so it is necessary to find a way to diminish this noise effect. Traditional average and median spatial filters can handle such situations, but here the focus is on the spectral domain, using Fourier and wavelet transforms to reduce the noise effect. Experiments and statistical features (entropy, standard deviation) show that these transforms can overcome such problems in an elegant way.
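One possible spectral-domain treatment is sketched below: each frame is low-pass filtered in the Fourier domain before differencing, which suppresses high-frequency noise in the difference image; the video file name, cut-off fraction, and threshold are illustrative assumptions, and a wavelet-based variant would follow the same structure.

```python
import numpy as np
import cv2

def fft_lowpass(gray, keep_fraction=0.2):
    """Suppress high-frequency noise by zeroing the outer part of the spectrum."""
    f = np.fft.fftshift(np.fft.fft2(gray.astype(np.float32)))
    rows, cols = gray.shape
    mask = np.zeros_like(f)
    r, c = int(rows * keep_fraction), int(cols * keep_fraction)
    mask[rows // 2 - r: rows // 2 + r, cols // 2 - c: cols // 2 + c] = 1
    return np.abs(np.fft.ifft2(np.fft.ifftshift(f * mask)))

# Hypothetical video file and threshold; real values would come from experiments.
cap = cv2.VideoCapture("traffic.avi")
ok, prev = cap.read()
if not ok:
    raise SystemExit("could not read traffic.avi")
prev = fft_lowpass(cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = fft_lowpass(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    diff = np.abs(gray - prev)                       # frame difference
    moving = (diff > 25).astype(np.uint8) * 255      # binary moving-object mask
    print(int(moving.sum() // 255), "moving pixels")
    prev = gray
cap.release()
```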
A Multiple System Biometric System Based on ECG Data
In this paper, a fast lossless image compression method is introduced for compressing medical images. It is based on splitting the image into blocks according to their nature, using polynomial approximation to decompose the image signal, and then applying run-length coding to the residue part of the image, which represents the error caused by the polynomial approximation. Finally, Huffman coding is applied as a last stage to encode the polynomial coefficients and the run-length codes. The test results indicate that the suggested method achieves promising performance.
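The core idea can be sketched as follows for a single block: a low-order polynomial is fitted to the block, the rounded fit is subtracted to obtain an integer residue, and the residue is run-length coded; the block values, fit order, and helper names are illustrative, and the final Huffman stage is only indicated in a comment.

```python
import numpy as np

def fit_block(block, order=1):
    """Least-squares polynomial fit along the flattened block signal."""
    y = block.astype(np.float64).ravel()
    x = np.arange(y.size)
    coeffs = np.polyfit(x, y, order)
    approx = np.rint(np.polyval(coeffs, x))
    residue = (y - approx).astype(np.int32)      # error of the approximation
    return coeffs, approx, residue

def run_length_encode(values):
    """Simple (value, count) run-length coding of the residue."""
    runs, prev, count = [], values[0], 1
    for v in values[1:]:
        if v == prev:
            count += 1
        else:
            runs.append((int(prev), count))
            prev, count = v, 1
    runs.append((int(prev), count))
    return runs

block = np.array([[100, 101, 102, 103],
                  [104, 105, 106, 107],
                  [108, 109, 110, 111],
                  [112, 113, 114, 115]], dtype=np.uint8)

coeffs, approx, residue = fit_block(block)
runs = run_length_encode(residue)
# In the full method the coefficients and the run-length pairs would then be
# entropy-coded with Huffman coding.
print(coeffs, runs)
```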