The offline handwritten signature is an image-based behavioral biometric. Its central problem is verification accuracy, because an individual seldom signs the same signature twice; this is referred to as intra-user variability. This research aims to improve the recognition accuracy of offline signatures. The proposed method combines signature length normalization with the histogram of oriented gradients (HOG) descriptor to improve accuracy. For verification, a deep-learning technique using a convolutional neural network (CNN) is exploited to build the reference model for future prediction. Experiments are conducted on 4,000 genuine and 2,000 skilled-forgery signature samples collected from 200 individuals; this database is publicly distributed under the name SIGMA and was collected from Malaysian individuals. The experimental results are reported in terms of both error measures, the False Accept Rate (FAR) and the False Reject Rate (FRR), which reach 4.15% and 1.65% respectively; the overall accuracy reaches 97.1%. A comparison shows that the proposed methodology outperforms state-of-the-art works that use the same SIGMA database.
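As a rough illustration of the HOG step, the descriptor can be sketched in plain NumPy; the cell size and orientation-bin count below are illustrative assumptions, not the parameters used in the paper:

```python
import numpy as np

def hog_features(img, cell=8, bins=9):
    """Histogram of oriented gradients over non-overlapping cells.

    Cell size and bin count are illustrative choices, not the
    paper's actual parameters.
    """
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180  # unsigned orientation
    h, w = img.shape
    feats = []
    for i in range(0, h - cell + 1, cell):
        for j in range(0, w - cell + 1, cell):
            m = mag[i:i + cell, j:j + cell].ravel()
            a = ang[i:i + cell, j:j + cell].ravel()
            # orientation histogram weighted by gradient magnitude
            hist, _ = np.histogram(a, bins=bins, range=(0, 180), weights=m)
            n = np.linalg.norm(hist)
            feats.append(hist / n if n > 0 else hist)
    return np.concatenate(feats)

# toy "signature" image: a single diagonal stroke on a 16x16 canvas
img = np.eye(16) * 255
f = hog_features(img)  # 2x2 cells of 9 bins -> 36-dimensional vector
```

A vector like `f` would then serve as the input feature representation passed to the CNN-based verifier.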
In this work, surface roughness was measured with an optical technique (the laser speckle technique) by treating the statistical properties of the speckle pattern as a problem of computer image texture analysis. Four calibration relationships were used to cover a wide measurement range with the same laser speckle technique: the first is based on the intensity contrast of the speckle, the second on analysis of the speckle binary image, the third on the size of the speckle pattern spot, and the last on the energy feature of the gray-level co-occurrence matrices of the speckle pattern. By these calibration relationships, the surface roughness of an object surface can be evaluated within the
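The first and last of those calibration quantities, the speckle intensity contrast and the GLCM energy, can be sketched as follows; the number of quantization levels and the horizontal pixel-pair offset are illustrative assumptions:

```python
import numpy as np

def speckle_contrast(I):
    """Intensity contrast C = std(I) / mean(I) of the speckle image."""
    I = I.astype(float)
    return I.std() / I.mean()

def glcm_energy(img, levels=8):
    """Energy (angular second moment) of a gray-level co-occurrence
    matrix built from horizontally adjacent pixel pairs.
    'levels' is an illustrative quantization choice."""
    q = (img.astype(float) / img.max() * (levels - 1)).astype(int)
    glcm = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[a, b] += 1
    p = glcm / glcm.sum()          # normalize to a joint probability
    return (p ** 2).sum()          # energy feature

# fully developed speckle has exponentially distributed intensity,
# for which the contrast is close to 1
rng = np.random.default_rng(0)
speckle = rng.exponential(scale=50.0, size=(64, 64))
c = speckle_contrast(speckle)
e = glcm_energy(speckle)
```

Each such texture statistic would then be mapped to a roughness value through its own empirically fitted calibration curve.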
In this paper, a 3D simulation of the global coronal magnetic field, which uses the observed line-of-sight component of the photospheric magnetic field from MDI/SOHO, was carried out using the potential field model. The obtained results improve the theoretical models of the coronal magnetic field: they represent suitable lower boundary conditions (Bx, By, Bz) at the base of the linear force-free and nonlinear force-free models, and provide a less computationally expensive method than other models. Generally, a very high-speed computer and a special configuration are needed to solve such a problem, as well as the problem of viewing the streamlines of the magnetic field. For high accuracy, a special mathematical treatment was adopted to solve the computation comp
Background: Sprite coding is a very effective technique for clarifying the background video object. Sprite generation is an open issue because foreground objects reduce the precision of camera motion estimation and blur the created sprite. Objective: In this paper, a quick and basic static method for sprite area detection in video data is presented. Two statistical measures are applied, the mean and the standard deviation of every pixel (over a whole group of video frames), to determine whether the pixel is part of the selected static sprite range or not. A binary map array is built that marks sprite pixels as 1 and non-sprite pixels as 0. Likewise, a hole and gap filling strategy was utilized to re
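The per-pixel mean/standard-deviation test over a group of frames can be sketched as below; the threshold value and the synthetic clip are illustrative assumptions, and the hole-and-gap-filling step is omitted:

```python
import numpy as np

def sprite_mask(frames, std_thresh=5.0):
    """Per-pixel statistics over a group of frames: pixels whose
    temporal standard deviation stays below the threshold are marked
    static sprite (1), others non-sprite (0). The threshold is an
    illustrative choice."""
    stack = np.stack([f.astype(float) for f in frames])
    std = stack.std(axis=0)
    return (std < std_thresh).astype(np.uint8)

# synthetic clip: a static background with a bright block sweeping right
frames = []
for t in range(10):
    f = np.full((32, 32), 100.0)
    f[10:14, 3 * t:3 * t + 4] = 255.0   # moving foreground object
    frames.append(f)
mask = sprite_mask(frames)
# background pixels stay 1, the object's path is flagged 0
```

Pixels crossed by the moving object vary over time and are excluded, which is exactly what keeps the foreground from blurring the generated sprite.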
This research is concerned with a new type of ferrocement characterized by its lower density and enhanced thermal insulation. Lightweight ferrocement plates have many advantages: low weight, low cost, thermal insulation, and environmental conservation. This work contains two experimental groups: the first varies the number of ferrocement layers, and the second varies the aggregate-to-cement ratio. The experiments were made to determine the optimum proportion of cement and lightweight aggregate (recycled thermostone). A low w/c ratio of 0.4 was used with a superplasticizer conforming to ASTM C494 Type G. The compressive strength of the mortar mixes is 20-25 MPa. The work also involved the determination of thermal properties. The thermal conductivity value of thi
This work was conducted to study the extraction of eucalyptus oil from natural plants (Eucalyptus camaldulensis leaves) using the water distillation method with a Clevenger apparatus. The effects of the main operating parameters were studied: time to reach equilibrium, temperature (70 to 100 °C), solvent-to-solid ratio (4:1 to 8:1 (v/w)), agitation speed (0 to 900 rpm), and particle size (0.5 to 2.5 cm) of the fresh leaves, to find the processing conditions that achieve the maximum oil yield. The results showed that an agitation speed of 900 rpm, a temperature of 100 °C, a solvent-to-solid ratio of 5:1 (v/w), and a particle size of 0.5 cm for 160 minutes gave the highest oil yield (46.25 wt.%). The extracted oil was examined by HPLC.
Buildings such as malls, offices, airports, and hospitals have nowadays become very complicated, which increases the need for a solution that helps people find their locations inside these buildings. GPS or cell signals are commonly used for positioning in outdoor environments but are not accurate indoors. Smartphones have become a common presence in our daily life, and the existing infrastructure of Wi-Fi access points available in most buildings motivated this work to build a hybrid mechanism that combines AP fingerprints with smartphone barometer sensor readings to accurately determine the user's position on a building floor relative to well-known landmarks.
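A minimal sketch of such a hybrid mechanism, assuming a nearest-neighbour fingerprint match for the horizontal position and the standard hypsometric approximation for the barometer reading; the floor height, AP names, and radio map here are hypothetical:

```python
import math

def floor_from_pressure(p_hpa, p_ground_hpa, floor_height_m=3.0):
    """Map the barometric pressure difference to a floor index via the
    standard hypsometric approximation; the per-floor height is an
    illustrative assumption."""
    alt_m = 44330.0 * (1.0 - (p_hpa / p_ground_hpa) ** (1.0 / 5.255))
    return round(alt_m / floor_height_m)

def nearest_fingerprint(scan, radio_map):
    """Nearest-neighbour match of an AP RSSI scan against a fingerprint
    radio map {location: {ap: rssi_dBm}}; APs missing from either side
    default to a -100 dBm floor value."""
    def dist(fp):
        aps = set(scan) | set(fp)
        return math.sqrt(sum((scan.get(a, -100) - fp.get(a, -100)) ** 2
                             for a in aps))
    return min(radio_map, key=lambda loc: dist(radio_map[loc]))

# hypothetical radio map with two calibrated reference points
radio_map = {
    "lobby":   {"ap1": -40, "ap2": -70},
    "hall_2F": {"ap1": -75, "ap2": -45},
}
loc = nearest_fingerprint({"ap1": -42, "ap2": -68}, radio_map)
floor = floor_from_pressure(1012.8, 1013.25)  # ~3.7 m above ground level
```

The barometer resolves the floor (which RSSI alone often cannot), while the fingerprint match places the user relative to the calibrated landmarks on that floor.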
This paper shows an approach for electrocardiogram (ECG) signal processing based on linear and nonlinear adaptive filtering using the Recursive Least Squares (RLS) algorithm to remove two kinds of noise that affect the ECG signal: high-frequency noise (HFN) and low-frequency noise (LFN). Simulation is performed in Matlab. The ECG, HFN, and LFN signals used in this study were downloaded from ftp://ftp.ieee.org/uploads/press/rangayyan/, and the filtering was carried out using an adaptive finite impulse response (FIR) filter, which gave better results than infinite impulse response (IIR) filters did.
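A minimal sketch of RLS-based adaptive noise cancellation with an FIR structure, assuming the noisy ECG and a correlated noise reference are both available; the filter order, forgetting factor, and synthetic signals below are illustrative, not the paper's data:

```python
import numpy as np

def rls_filter(d, x, order=4, lam=0.99, delta=0.01):
    """RLS adaptive FIR filter: estimates the noise component of the
    desired signal d from the reference x; the prediction error e is
    the cleaned signal. Order and forgetting factor are illustrative."""
    w = np.zeros(order)
    P = np.eye(order) / delta          # initial inverse correlation matrix
    e = np.zeros(len(d))
    for n in range(len(d)):
        u = np.array([x[n - k] if n - k >= 0 else 0.0
                      for k in range(order)])   # tapped delay line
        k_gain = P @ u / (lam + u @ P @ u)
        e[n] = d[n] - w @ u                     # a-priori error
        w = w + k_gain * e[n]
        P = (P - np.outer(k_gain, u @ P)) / lam
    return e

rng = np.random.default_rng(1)
t = np.arange(2000)
ecg_like = np.sin(2 * np.pi * t / 200)          # stand-in for the ECG
noise_ref = rng.standard_normal(2000)           # noise reference input
# noise reaching the ECG lead: a short FIR coloring of the reference
noise = np.convolve(noise_ref, [0.5, 0.3], mode="same")
cleaned = rls_filter(ecg_like + noise, noise_ref)
```

Because the reference is correlated with the additive noise but not with the ECG, the filter converges toward the noise path and the error output recovers the clean waveform.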
This paper presents a hybrid approach for solving the null-values problem; it hybridizes rough set theory with an intelligent swarm algorithm. The proposed approach is a supervised learning model: a large set of complete data, called the learning data, is used to find the decision rule sets that are then used in solving the incomplete-data problem. The intelligent swarm algorithm is used for feature selection, with the bees algorithm as the heuristic search combined with rough set theory as the evaluation function. Another feature-selection algorithm, ID3, is also presented; it works as a statistical algorithm rather than an intelligent one. A comparison is made between the two approaches in their performance for null-value estimation.
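The rough-set evaluation function with which such a bees-algorithm search could score candidate feature subsets can be sketched as the standard dependency degree; the toy decision table below is hypothetical:

```python
from collections import defaultdict

def dependency(data, decisions, features):
    """Rough-set dependency degree gamma(features -> decision): the
    fraction of objects whose feature-value combination determines a
    single decision class (the positive region). Serves as the
    subset-evaluation (fitness) function for the search."""
    seen = defaultdict(set)    # feature tuple -> decision classes observed
    count = defaultdict(int)   # feature tuple -> number of objects
    for row, d in zip(data, decisions):
        key = tuple(row[f] for f in features)
        seen[key].add(d)
        count[key] += 1
    pos = sum(count[k] for k, ds in seen.items() if len(ds) == 1)
    return pos / len(data)

# hypothetical learning table: two attributes and a decision class
table = [
    {"a": 0, "b": 0}, {"a": 0, "b": 1},
    {"a": 1, "b": 0}, {"a": 1, "b": 1},
]
dec = ["no", "no", "yes", "yes"]
g_a = dependency(table, dec, ["a"])   # attribute a alone decides the class
g_b = dependency(table, dec, ["b"])   # attribute b alone does not
```

A swarm search would generate candidate subsets (here `["a"]` vs `["b"]`) and keep those with the highest dependency degree, so rules induced from the selected features stay consistent on the learning data.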