A novel median filter based on the crow search optimization algorithm (OMF) is proposed to reduce random salt-and-pepper noise and improve the quality of RGB color and grayscale images. The fundamental idea of the approach is that the crow optimization algorithm first detects noisy pixels and then replaces them with an optimal median value chosen by maximizing a fitness function. Finally, the standard measures peak signal-to-noise ratio (PSNR), Structural Similarity (SSIM), absolute error, and mean square error are used to evaluate the performance of the suggested filters (the original and the improved median filter) in removing noise from images. The simulation is carried out in MATLAB R2019b, and the results show that the improved median filter with the crow optimization algorithm is more effective than the original median filter and several recent methods; they show that the suggested process is robust in reducing error and removing noise thanks to the optimized choice of the median candidate. The results yield a mean square error of at most 1.38, an absolute error of at most 0.22, a Structural Similarity (SSIM) of 0.9856, and a PSNR above 46 dB. Thus, the percentage of improvement achieved is 25%.
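To make the pipeline concrete, the sketch below illustrates the general idea of detecting extreme-valued (salt-and-pepper) pixels and replacing only those with a median whose window size is chosen by a fitness criterion. The grid search over window sizes, the 0/255 noise test for 8-bit grayscale images, and the use of a clean reference for PSNR are illustrative assumptions standing in for the crow search optimizer described in the paper.

```python
import numpy as np
from scipy.ndimage import median_filter
from skimage.metrics import peak_signal_noise_ratio

def denoise_salt_pepper(noisy, reference, window_sizes=(3, 5, 7)):
    """Sketch: detect salt-and-pepper pixels in an 8-bit grayscale image, then
    replace only those pixels with the median of the window whose size gives the
    best PSNR (a simple stand-in fitness for the crow search optimizer)."""
    noise_mask = (noisy == 0) | (noisy == 255)            # extreme values assumed to be noise
    best_psnr, best_result = -np.inf, noisy
    for w in window_sizes:                                 # grid search stands in for crow search
        medians = median_filter(noisy, size=w)
        candidate = np.where(noise_mask, medians, noisy)   # only noisy pixels are replaced
        score = peak_signal_noise_ratio(reference, candidate)
        if score > best_psnr:
            best_psnr, best_result = score, candidate
    return best_result, best_psnr
```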
Automatic recognition of individuals is very important in the modern era, and biometric techniques have emerged as an answer to the problem of automatic individual recognition. This paper presents a technique for pupil detection that combines simple morphological operations with the Hough Transform (HT). The circular regions of the eye and pupil are segmented by the morphological filter together with the Hough Transform (HT), and the local iris area is converted into a rectangular block for the purpose of computing inconsistencies in the image. The method is implemented and tested on the Chinese Academy of Sciences (CASIA V4) iris image database (249 persons) and the IIT Delhi (IITD) iris…
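A minimal sketch of this kind of pipeline is shown below, assuming OpenCV: morphological closing and median blurring suppress specular reflections and eyelashes before a circular Hough Transform localizes the pupil. The kernel size and Hough parameters are illustrative assumptions, not the paper's values.

```python
import cv2
import numpy as np

def detect_pupil(eye_gray):
    """Sketch: morphological smoothing followed by a circular Hough Transform.
    Returns (x, y, radius) of the strongest circle, or None if nothing is found."""
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
    # Closing fills bright specular spots inside the dark pupil region.
    smoothed = cv2.morphologyEx(eye_gray, cv2.MORPH_CLOSE, kernel)
    smoothed = cv2.medianBlur(smoothed, 5)
    circles = cv2.HoughCircles(smoothed, cv2.HOUGH_GRADIENT, dp=1, minDist=50,
                               param1=100, param2=30, minRadius=15, maxRadius=80)
    if circles is None:
        return None
    x, y, r = np.round(circles[0, 0]).astype(int)   # strongest circle = pupil candidate
    return x, y, r
```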
Estimating the semantic similarity between short texts plays an increasingly prominent role in many fields related to text mining and natural language processing, especially with the large increase in the volume of textual data produced daily. Traditional approaches that calculate the degree of similarity between two texts based on the words they share do not perform well with short texts, because two similar texts may be written in different terms by employing synonyms. As a result, short texts should be compared semantically. In this paper, a semantic similarity measurement method between texts is presented which combines knowledge-based and corpus-based semantic information to build a semantic network that repre…
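As a rough illustration of mixing the two information sources, the sketch below combines a knowledge-based score (WordNet path similarity via NLTK) with a corpus-based score (TF-IDF cosine similarity). The equal weighting and the specific measures are assumptions for illustration, not the paper's construction of the semantic network.

```python
from itertools import product
from nltk.corpus import wordnet as wn          # requires nltk.download("wordnet")
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def knowledge_sim(t1, t2):
    """Knowledge-based component: average of the best WordNet path similarity per word pair."""
    scores = []
    for w1, w2 in product(t1.lower().split(), t2.lower().split()):
        pairs = [s1.path_similarity(s2) or 0.0
                 for s1, s2 in product(wn.synsets(w1), wn.synsets(w2))]
        scores.append(max(pairs) if pairs else 0.0)
    return sum(scores) / len(scores) if scores else 0.0

def combined_sim(t1, t2, alpha=0.5):
    """Weighted mix of knowledge-based and corpus-based (TF-IDF cosine) similarity."""
    tfidf = TfidfVectorizer().fit_transform([t1, t2])
    corpus_part = cosine_similarity(tfidf[0], tfidf[1])[0, 0]
    return alpha * knowledge_sim(t1, t2) + (1 - alpha) * corpus_part
```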
In regression testing, test case prioritization (TCP) is a technique for ordering the available test cases. TCP techniques can improve fault detection performance, which is measured by the Average Percentage of Faults Detected (APFD). History-based TCP is one of the TCP techniques that considers past execution data to prioritize test cases. The assignment of equal priority to test cases is a common problem for most TCP techniques; however, this problem has not been explored in history-based TCP techniques. To solve it, most researchers resort to random sorting of test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement…
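For reference, APFD for a prioritized suite of n tests detecting m faults is APFD = 1 − (TF1 + … + TFm)/(n·m) + 1/(2n), where TFi is the position of the first test that reveals fault i. The small sketch below computes it; the example numbers are made up.

```python
def apfd(first_reveal_positions, num_tests):
    """Average Percentage of Faults Detected for one prioritized test ordering.
    first_reveal_positions: 1-based position of the first test that reveals each fault."""
    m, n = len(first_reveal_positions), num_tests
    return 1 - sum(first_reveal_positions) / (n * m) + 1 / (2 * n)

# Hypothetical example: 10 tests, 4 faults first revealed by tests at positions 2, 1, 7, 3.
print(apfd([2, 1, 7, 3], 10))   # -> 0.725
```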
Research indicates that the second half of the twentieth century saw great interest in the service industry from both government and private organizations, and the service industry has become the bedrock of plans for achieving economic and social development. From this standpoint, specialists and researchers have recognized the importance of transport modes, including rail, which should be available between populated civil areas as services organized by the competent authorities, so as to achieve the active participation of citizens in the economic and social development of the region. Here the term services means economic activities whose outputs are not concrete, such as acceptance of the situation and satisfaction with it, or satisfacti…
Software Defined Network (SDN) is a new technology that separates the control plane from the data plane. SDN provides automation and programmability faster than a traditional network, and it supports Quality of Service (QoS) for video surveillance applications. One of the most significant issues in video surveillance is how to find the best path for routing packets between the source (IP cameras) and the destination (monitoring center). A video surveillance system requires fast transmission, reliable delivery, and high QoS. To improve the QoS and achieve the optimal path, the SDN architecture is used in this paper. In addition, different routing algorithms are used with different steps. First, we eva…
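As an illustration of controller-side path selection, the sketch below runs Dijkstra over a small assumed topology with per-link delay weights using networkx. The node names and delay values are hypothetical, and the paper's routing algorithms and QoS metrics may differ.

```python
import networkx as nx

# Sketch: QoS-aware path selection as an SDN controller might perform it.
# Link weights are illustrative delay values in milliseconds (assumed).
topology = nx.Graph()
topology.add_weighted_edges_from([
    ("camera", "s1", 2), ("s1", "s2", 5), ("s1", "s3", 1),
    ("s2", "monitor", 1), ("s3", "monitor", 4),
], weight="delay")

# Dijkstra over the delay metric picks the lowest-latency camera-to-monitor path.
path = nx.shortest_path(topology, "camera", "monitor", weight="delay")
latency = nx.shortest_path_length(topology, "camera", "monitor", weight="delay")
print(path, latency)   # ['camera', 's1', 's3', 'monitor'] with total delay 7
```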
The aim of this paper is to present a new methodology for finding the private key of RSA. A new initial value, generated from a new equation, is selected to speed up the process. After this value is found, a brute force attack is used to discover the private key. In addition, for the proposed equation, the multiplier of the Euler totient function used to find both the public key and the private key is assigned as 1; it follows that the equation that estimates the new initial value is suitable for this small multiplier. The experimental results show that if all prime factors of the modulus are larger than 3 and the multiplier is 1, the distance between the initial value and the private key…
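The sketch below illustrates the general idea under stated assumptions: with multiplier k = 1 the key equation e·d = φ(n) + 1 gives a natural starting estimate d0 = (φ(n) + 1)/e, from which a brute force search increments d until e·d ≡ 1 (mod φ(n)). This starting formula is an assumption for illustration, not necessarily the paper's initial-value equation.

```python
def brute_force_private_key(e, p, q, start=None):
    """Sketch: search for d with (e*d) % phi == 1, beginning from an initial estimate.
    With multiplier k = 1, e*d = phi + 1, so d0 = (phi + 1) // e is a natural start."""
    phi = (p - 1) * (q - 1)
    d = start if start is not None else (phi + 1) // e
    while (e * d) % phi != 1:
        d += 1
    return d

# Toy example with small primes: n = 3233 = 61 * 53, e = 17  ->  d = 2753.
print(brute_force_private_key(17, 61, 53))
```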
Confocal microscope imaging has become popular in biotechnology labs. Confocal imaging technology utilizes fluorescence optics, where laser light is focused onto a specific spot at a defined depth in the sample. A considerable number of images are produced regularly during the course of research, and these images require unbiased quantification methods if the analyses are to be meaningful. Increasing efforts to tie reimbursement to outcomes will likely increase the need for objective data when analyzing confocal microscope images in the coming years. Visual quantification of confocal images with the naked human eye is an essential but often underreported outcome measure, due to the time required for manual counting and e…
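As a generic example of unbiased quantification (not the paper's specific method), the short sketch below counts fluorescent objects in a single channel by Otsu thresholding and connected-component labeling with scikit-image.

```python
from skimage.filters import threshold_otsu
from skimage.measure import label

def count_fluorescent_objects(channel):
    """Generic automated count: Otsu threshold, then count connected components.
    'channel' is assumed to be a single-channel fluorescence image as a NumPy array."""
    mask = channel > threshold_otsu(channel)
    labeled = label(mask)
    return labeled.max()          # number of distinct labeled objects
```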
The main intention of this study was to investigate the development of a new optimization technique, based on the differential evolution (DE) algorithm, for linear frequency modulation radar signal de-noising. As the standard DE algorithm is a fixed-length optimizer, it is not suitable for solving signal de-noising problems that call for variability. A modified crossover scheme called rand-length crossover was designed to fit the proposed variable-length DE, and the new algorithm is referred to as the random variable-length crossover differential evolution (rvlx-DE) algorithm. The measurement results demonstrate a highly efficient capability for target detection in terms of frequency response and peak forming that was isola…
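For orientation, the sketch below shows one generation of the classic fixed-length DE (rand/1 mutation, binomial crossover, greedy selection). The rvlx-DE proposed in the paper replaces this crossover with its random variable-length scheme, which is not reproduced here; F and CR values are illustrative.

```python
import numpy as np

def de_generation(pop, fitness, bounds, F=0.8, CR=0.9, rng=None):
    """One generation of classic fixed-length DE (rand/1/bin), minimizing 'fitness'."""
    if rng is None:
        rng = np.random.default_rng()
    n, dim = pop.shape
    new_pop = pop.copy()
    for i in range(n):
        a, b, c = pop[rng.choice([j for j in range(n) if j != i], 3, replace=False)]
        mutant = np.clip(a + F * (b - c), bounds[0], bounds[1])   # DE/rand/1 mutation
        cross = rng.random(dim) < CR                              # binomial crossover mask
        cross[rng.integers(dim)] = True                           # ensure at least one gene crosses
        trial = np.where(cross, mutant, pop[i])
        if fitness(trial) < fitness(pop[i]):                      # greedy selection
            new_pop[i] = trial
    return new_pop
```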
Regarding computer system security, intrusion detection systems are fundamental components for discriminating attacks at an early stage. They monitor and analyze network traffic, looking for abnormal behaviors or attack signatures in order to detect intrusions early. However, many challenges arise in developing a flexible and efficient network intrusion detection system (NIDS) that handles unforeseen attacks with a high detection rate. In this paper, a deep neural network (DNN) approach is proposed for an anomaly detection NIDS. Dropout is the regularization technique used with the DNN model to reduce overfitting. The experiments were applied to the NSL-KDD dataset. A SoftMax output layer is used with a cross-entropy loss funct…
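A minimal Keras sketch in this spirit is shown below; the layer sizes, dropout rate, and the 41-feature / 5-class input-output shapes are assumptions (typical for NSL-KDD preprocessing), not the paper's exact architecture.

```python
import tensorflow as tf

# Sketch of a dropout-regularized DNN classifier for anomaly-based intrusion detection.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(41,)),  # 41 assumed features
    tf.keras.layers.Dropout(0.3),                    # dropout to reduce overfitting
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(5, activation="softmax"),  # softmax over assumed traffic classes
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
```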
The denoising of a natural image corrupted by Gaussian noise is a classic problem in signal and image processing. Much work has been done in the field of wavelet thresholding, but most of it has focused on statistical modeling of wavelet coefficients and the optimal choice of thresholds. This paper describes a new method for suppressing noise in an image by fusing the stationary wavelet denoising technique with an adaptive Wiener filter. The Wiener filter is applied to the image reconstructed from the approximation coefficients only, while the thresholding technique is applied to the detail coefficients of the transform, and the final denoised image is obtained by combining the two results. The proposed method was applied by usin…
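A minimal sketch of this kind of fusion, assuming PyWavelets and SciPy, is given below: the stationary wavelet transform is taken, the detail bands are soft-thresholded, the approximation band is passed through an adaptive Wiener filter, and the inverse transform combines the two. The universal threshold and the db4 wavelet are assumptions, not necessarily the paper's choices.

```python
import numpy as np
import pywt
from scipy.signal import wiener

def swt_wiener_denoise(image, wavelet="db4", level=1, thresh=None):
    """Sketch: SWT, soft-threshold the detail bands, Wiener-filter the approximation
    band, then invert. Image side lengths must be divisible by 2**level for swt2."""
    coeffs = pywt.swt2(image, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1][1][2])) / 0.6745       # noise estimate from diagonal band
    t = thresh if thresh is not None else sigma * np.sqrt(2 * np.log(image.size))
    fused = []
    for cA, (cH, cV, cD) in coeffs:
        cA = wiener(cA, mysize=5)                              # adaptive Wiener on approximation
        cH, cV, cD = (pywt.threshold(c, t, mode="soft") for c in (cH, cV, cD))
        fused.append((cA, (cH, cV, cD)))
    return pywt.iswt2(fused, wavelet)
```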