The investigation of signature validation is crucial to the field of personal authenticity, and biometrics-based systems have been developed to support information security. A person's signature, an essential biometric trait, can be used to verify their identity. In this study, a mechanism for automatically verifying signatures is proposed. The study highlights the offline properties of handwritten signatures and aims to verify whether a handwritten signature is genuine or forged using computer-based machine learning techniques. The main goal of developing such systems is to verify people through the validity of their signatures. In this research, a set of 70 signature images was used. Image preprocessing steps were performed on them, and their features were extracted using the median filter. The eigenvectors and eigenvalues were then calculated using the PCA algorithm, and the backpropagation neural network algorithm was applied for training and testing, where the performance reached 6.7995e−07 after 82 epochs and the accuracy was 99.98%.
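A minimal Python sketch of the described pipeline is given below. The filter size, PCA dimensionality, network shape, and train/test split are illustrative assumptions, and random arrays stand in for the 70 signature images.

```python
# Minimal sketch of the described pipeline (hypothetical parameters throughout):
# median-filter preprocessing, PCA projection, and a backpropagation classifier.
import numpy as np
from scipy.ndimage import median_filter
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

def preprocess(images):
    """Denoise each grayscale signature image and flatten it to a vector."""
    return np.array([median_filter(img, size=3).ravel() for img in images])

# Placeholder data standing in for the 70 signature images (genuine=1, forged=0).
rng = np.random.default_rng(0)
images = rng.random((70, 64, 64))
labels = rng.integers(0, 2, 70)

X = preprocess(images)
X_train, X_test = X[:56], X[56:]            # 80/20 split, also an assumption
y_train, y_test = labels[:56], labels[56:]

pca = PCA(n_components=20).fit(X_train)     # eigenvectors/eigenvalues of the training covariance
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(pca.transform(X_train), y_train)    # backpropagation training
print("test accuracy:", clf.score(pca.transform(X_test), y_test))
```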
Dam operation and management have become more complex in recent years because of the need to consider hydraulic structure sustainability and environmental protection. An earthfill dam that includes a powerhouse system is a significant multipurpose hydraulic structure. Understanding the effects of running hydropower plant turbines on the dam body is one of the major safety concerns for earthfill dams. In this research, the dynamic analysis of an earthfill dam integrated with a hydropower plant system containing six vertical Kaplan turbines (i.e., the Haditha dam) is investigated. In the first stage of the study, ANSYS-CFX was used to represent one vertical Kaplan turbine unit by designing a three-dimensional (3-D) finite element (FE) …
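For context, dynamic FE analysis of this kind solves the standard semi-discrete equation of motion; this general form is supplied as background and is not quoted from the truncated abstract.

```latex
% Semi-discrete FE equation of motion: M, C, K are the assembled mass,
% damping, and stiffness matrices; F(t) is the time-varying load, here the
% hydrodynamic forcing transferred from the running Kaplan turbines.
\[
M\,\ddot{u}(t) + C\,\dot{u}(t) + K\,u(t) = F(t)
\]
```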
Intrusion detection systems (IDS) are useful tools that help security administrators in the demanding task of securing the network and alerting on any potentially harmful event. An IDS can be classified as either misuse-based or anomaly-based, depending on the detection methodology. A misuse IDS can recognise known attacks based on their signatures; the main disadvantage of these systems is that they cannot detect new attacks. An anomaly IDS, in contrast, depends on normal behaviour; the main advantage of this approach is its ability to discover new attacks, while its main drawback is a high false alarm rate. Therefore, a hybrid IDS combines misuse and anomaly detection and acts as a solution to overcome the dis…
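The hybrid design can be illustrated with a minimal sketch: a flow is first checked against known attack signatures (misuse stage); if none match, an anomaly score against a learned normal profile decides. All signatures, features, and thresholds below are hypothetical.

```python
# Minimal sketch of a hybrid IDS (hypothetical signatures and thresholds):
# misuse detection first, then anomaly detection for unknown traffic.
from dataclasses import dataclass

@dataclass
class Flow:
    payload: str
    packets_per_sec: float

KNOWN_SIGNATURES = {"' OR 1=1", "../../etc/passwd"}   # toy misuse signatures
NORMAL_RATE_MEAN, NORMAL_RATE_STD = 50.0, 10.0        # learned "normal" profile

def classify(flow: Flow) -> str:
    # Misuse stage: a signature match catches known attacks.
    if any(sig in flow.payload for sig in KNOWN_SIGNATURES):
        return "known attack"
    # Anomaly stage: flag flows far from normal behaviour (z-score test).
    z = abs(flow.packets_per_sec - NORMAL_RATE_MEAN) / NORMAL_RATE_STD
    return "anomaly (possible new attack)" if z > 3.0 else "normal"

print(classify(Flow("GET /index.html", 48.0)))        # -> normal
print(classify(Flow("id=' OR 1=1", 45.0)))            # -> known attack
print(classify(Flow("GET /index.html", 500.0)))       # -> anomaly (possible new attack)
```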
The study presents a modification of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) update (H-version), based on the determinant property of the inverse Hessian matrix (the second derivative of the objective function), via an update of the vector s (the difference between the next solution and the current solution), such that the determinant of the next inverse Hessian matrix equals the determinant of the current inverse Hessian matrix at every iteration. Consequently, the sequence of inverse Hessian matrices generated by the method never approaches a near-singular matrix, so the program never breaks down before the minimum value of the objective function is obtained. Moreover, the new modification of the BFGS update (H-vers…
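For context, the standard H-version BFGS update that this modification builds on is sketched below, together with the determinant-preserving condition stated in the abstract; the specific rescaling of s_k is not given in the truncated text.

```latex
% Standard H-version BFGS update, with s_k = x_{k+1} - x_k (the step) and
% y_k the corresponding change in the gradient of the objective function.
\[
H_{k+1} = \left(I - \rho_k s_k y_k^{\top}\right) H_k \left(I - \rho_k y_k s_k^{\top}\right)
          + \rho_k s_k s_k^{\top},
\qquad \rho_k = \frac{1}{y_k^{\top} s_k}.
\]
% The modification updates s_k so that the update preserves the determinant:
\[
\det(H_{k+1}) = \det(H_k) \quad \text{at every iteration } k.
\]
```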
A three-stage learning algorithm for a deep multilayer perceptron (DMLP), with effective weight initialisation based on a sparse auto-encoder, is proposed in this paper; it aims to overcome the difficulties of training deep neural networks with limited training data in a high-dimensional feature space. In the first stage, unsupervised learning using a sparse auto-encoder obtains the initial weights of the feature extraction layers of the DMLP. In the second stage, error back-propagation trains the DMLP while the weights obtained in the first stage are kept fixed for its feature extraction layers. In the third stage, all the weights of the DMLP obtained in the second stage are refined by error back-propagation. Network structures an…
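A minimal PyTorch sketch of the three stages is shown below. Layer sizes, the sparsity penalty, learning rates, and iteration counts are illustrative assumptions, with random tensors standing in for real data.

```python
# Minimal sketch of the three-stage scheme (all sizes are assumptions).
import torch
import torch.nn as nn

enc = nn.Linear(784, 128)                 # feature-extraction layer of the DMLP
dec = nn.Linear(128, 784)                 # decoder used only for pre-training
head = nn.Linear(128, 10)                 # classification layer of the DMLP

X = torch.rand(64, 784)                   # placeholder data
y = torch.randint(0, 10, (64,))

# Stage 1: unsupervised pre-training of `enc` as a sparse auto-encoder
# (an L1 penalty on the hidden code stands in for the sparsity constraint).
opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=1e-3)
for _ in range(100):
    code = torch.relu(enc(X))
    loss = nn.functional.mse_loss(dec(code), X) + 1e-3 * code.abs().mean()
    opt.zero_grad(); loss.backward(); opt.step()

# Stage 2: supervised training of the upper layers with `enc` frozen.
for p in enc.parameters():
    p.requires_grad_(False)
opt = torch.optim.Adam(head.parameters(), lr=1e-3)
for _ in range(100):
    loss = nn.functional.cross_entropy(head(torch.relu(enc(X))), y)
    opt.zero_grad(); loss.backward(); opt.step()

# Stage 3: fine-tune all DMLP weights together by back-propagation.
for p in enc.parameters():
    p.requires_grad_(True)
opt = torch.optim.Adam(list(enc.parameters()) + list(head.parameters()), lr=1e-4)
for _ in range(50):
    loss = nn.functional.cross_entropy(head(torch.relu(enc(X))), y)
    opt.zero_grad(); loss.backward(); opt.step()
```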
In this study, an efficient compression system is introduced. It is based on a wavelet transform and two types of three-dimensional (3-D) surface representation: cubic Bezier interpolation (CBI) and 1st-order polynomial approximation. Each is applied to a different scale of the image: CBI is applied over wide areas of the image to prune the components showing large-scale variation, while the 1st-order polynomial is applied to small areas of the residue component (i.e., what remains after subtracting the cubic Bezier surface from the image) to prune the locally smooth components and obtain a better compression gain. The produced cubic Bezier surface is subtracted from the image signal to get the residue component. Then, t…
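The 1st-order polynomial stage can be illustrated with a minimal sketch: fit a plane to each residue block by least squares, keep only the three coefficients, and code the remaining residue. The block size and data below are assumptions.

```python
# Minimal sketch of the 1st-order polynomial stage: fit a plane
# z = a*x + b*y + c to a residue block and subtract it.
import numpy as np

def plane_fit(block):
    """Return coefficients (a, b, c) of the least-squares plane over a 2-D block."""
    h, w = block.shape
    ys, xs = np.mgrid[0:h, 0:w]
    A = np.column_stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    coeffs, *_ = np.linalg.lstsq(A, block.ravel(), rcond=None)
    return coeffs                      # 3 numbers summarise h*w samples

rng = np.random.default_rng(1)
residue_block = rng.random((8, 8))     # placeholder residue data
a, b, c = plane_fit(residue_block)

ys, xs = np.mgrid[0:8, 0:8]
approx = a * xs + b * ys + c           # reconstructed smooth component
leftover = residue_block - approx      # what remains to be coded after the fit
print("max leftover:", np.abs(leftover).max())
```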
In this paper, we present a multiple-bit error correction coding scheme based on an extended Hamming product code combined with type-II HARQ, using shared resources, for on-chip interconnects. The shared resources reduce the hardware complexity of the encoder and decoder compared with the existing three-stage iterative decoding method for on-chip interconnects. The proposed decoding method achieves 20% and 28% reductions in area and power consumption, respectively, with only a small increase in decoder delay compared to the existing three-stage iterative decoding scheme for multiple-bit error correction. The proposed code also achieves an excellent improvement in residual flit error rate and a reduction of up to 58% in total power consumption compared with the other err…
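The extended-Hamming building block can be illustrated with a minimal sketch; the (8,4) code and matrices below are illustrative, not the product-code construction of the paper, and the retransmission step is a simplification of the full type-II HARQ protocol.

```python
# Minimal sketch of an extended-Hamming (SECDED) building block: single-bit
# errors are corrected; double-bit errors are detected and would trigger a
# retransmission request, simplifying the type-II HARQ behaviour.
import numpy as np

G = np.array([[1,0,0,0, 0,1,1],   # systematic generator of Hamming(7,4)
              [0,1,0,0, 1,0,1],
              [0,0,1,0, 1,1,0],
              [0,0,0,1, 1,1,1]])
H = np.array([[0,1,1,1, 1,0,0],   # matching parity-check matrix
              [1,0,1,1, 0,1,0],
              [1,1,0,1, 0,0,1]])

def encode(data4):
    cw = data4 @ G % 2
    return np.append(cw, cw.sum() % 2)        # extended (overall-parity) bit

def decode(cw8):
    syndrome = H @ cw8[:7] % 2
    if not syndrome.any():
        return cw8[:4], "ok"
    if cw8.sum() % 2 == 1:                    # odd overall parity: single error
        pos = int(np.flatnonzero((H.T == syndrome).all(axis=1))[0])
        fixed = cw8.copy(); fixed[pos] ^= 1
        return fixed[:4], "corrected"
    return None, "double error -> request retransmission"

cw = encode(np.array([1, 0, 1, 1]))
cw[2] ^= 1                                    # inject a single-bit error
print(decode(cw))                             # -> (array([1, 0, 1, 1]), 'corrected')
```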