Authentication is the process of determining whether someone or something is, in fact, who or what it declares itself to be. As dependence upon computers and computer networks grows, the need for user authentication has increased. A user's claimed identity can be verified by one of several methods; one of the most popular is knowledge-based authentication (something the user knows), such as a password or Personal Identification Number (PIN). Biometrics is the science and technology of authenticating a living individual by his or her physiological or behavioral attributes. Keystroke authentication is a behavioral access-control technique that identifies legitimate users by their typing behavior. The objective of this paper is to provide user authentication based on keystroke dynamics in order to prevent unauthorized access to the system. A Naive Bayes Classifier (NBC) is applied for keystroke authentication using unigraph and digraph keystroke features: the unigraph Dwell Time (DT), the digraph Down-Down Time (DDT), and the combination of the two (DT and DDT). The results show that the combined features (DT and DDT) produce a lower error rate than using DT or DDT alone.
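The classifier described above can be sketched with a minimal Gaussian Naive Bayes on synthetic timing features. The data, timing distributions, and the two-class genuine/impostor setup below are illustrative assumptions, not the paper's dataset; only the feature names (DT, DDT) come from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic timing features in milliseconds (hypothetical values):
# column 0 = DT (dwell time), column 1 = DDT (down-down time).
genuine = rng.normal(loc=[95, 180], scale=[10, 25], size=(50, 2))
impostor = rng.normal(loc=[130, 240], scale=[15, 30], size=(50, 2))

X = np.vstack([genuine, impostor])
y = np.array([0] * 50 + [1] * 50)  # 0 = genuine, 1 = impostor

def fit(X, y):
    """Per-class mean and variance of each feature (Gaussian NBC)."""
    return {c: (X[y == c].mean(axis=0), X[y == c].var(axis=0))
            for c in np.unique(y)}

def predict(params, X):
    """Classify by the larger Gaussian log-likelihood (uniform priors)."""
    classes = np.array(list(params.keys()))
    scores = [(-0.5 * np.sum(np.log(2 * np.pi * var)
                             + (X - mu) ** 2 / var, axis=1))
              for mu, var in params.values()]
    return classes[np.argmax(scores, axis=0)]

params = fit(X, y)
acc = (predict(params, X) == y).mean()
print(f"training accuracy on combined DT+DDT features: {acc:.2f}")
```

Using both columns mirrors the abstract's finding that the DT+DDT combination separates users better than either timing feature alone.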
Regression Discontinuity (RD) is a design in which a definite group is exposed to the effect of a treatment. Its uniqueness lies in dividing the study population into two groups according to a specific threshold, or cutoff point, determined in advance by the terms and requirements of the study. Attention was therefore focused on the issue of workers' retirement and on proposing a scenario for granting an end-of-service reward to fill the gap (the discontinuity point) where one had not been granted. The regression discontinuity method has been used to study and estimate the effect of the end-of-service reward at the cutoff for insured workers, as well as t…
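A sharp RD estimate of the kind described can be sketched on synthetic data: fit a separate line on each side of the cutoff and take the jump between the two fitted values at the cutoff itself. The running variable, cutoff value, and true effect size below are all invented for illustration, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: running variable x (e.g. years of service),
# treatment assigned when x >= cutoff, true discontinuity of 5.0.
cutoff = 20.0
x = rng.uniform(10, 30, size=400)
treated = x >= cutoff
y = 2.0 + 0.3 * x + 5.0 * treated + rng.normal(0.0, 1.0, size=400)

def fitted_at_cutoff(xs, ys):
    """Value of a least-squares line, fit to (xs, ys), at the cutoff."""
    slope, intercept = np.polyfit(xs, ys, 1)
    return slope * cutoff + intercept

# Sharp RD estimator: difference of the two side-wise fits at the cutoff.
effect = (fitted_at_cutoff(x[treated], y[treated])
          - fitted_at_cutoff(x[~treated], y[~treated]))
print(f"estimated discontinuity at cutoff: {effect:.2f}")
```

With enough observations near the cutoff, the estimate should recover the simulated jump of 5.0; in practice the fit is usually restricted to a bandwidth around the cutoff.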
Abstract: Two different shapes of offset optical fiber, based on coreless fiber, were studied for refractive index (RI)/concentration measurement and compared. These are the U- and S-shapes; both structures were formed by splicing one segment of coreless fiber (CF) between two single-mode fibers (SMF) as lead-in/lead-out, with the same offset displacement (12.268 µm) on both sides. The results show that the highest sensitivity, 98.768 nm/RIU, was achieved with the novel S-shape; to our knowledge this has not been reported before, and it outperforms the U-shape, which achieved 85.628 nm/RIU. This research demonstrates that the offset form has a significant effect on the sensitivity of the sensor. Addi…
The aim of this research is to construct a cognitive-behavioral program, based on the theory of Meichenbaum, for reducing emotional sensitivity among intermediate-school students. To achieve the aims of the research, two hypotheses were formulated and an experimental design with equivalent groups was chosen. The research population and its sample were determined. A test of negative emotional sensitivity constructed by the researcher was adopted; the test contains (20) items whose validity and reliability were established by presenting it to a group of arbitrators and experts in education and psychology. An educational program was then constructed based on the theory of Meichenbaum. The test was applied to a sample of (60) second-i…
A frequently used approach to denoising is shrinkage of the coefficients of the noisy signal's representation in a transform domain. This paper proposes an algorithm based on a hybrid transform (a stationary wavelet transform followed by a slantlet transform): the slantlet transform is applied to the approximation subband of the stationary wavelet transform, and the BlockShrink thresholding technique is applied to the hybrid-transform coefficients. This technique decides the optimal block size and threshold for every wavelet subband by minimizing Stein's unbiased risk estimate (SURE). The proposed algorithm was implemented in MATLAB R2010a and tested on natural images contaminated by white Gaussian noise. Numerical results show that our algorithm co…
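The SURE-driven threshold selection at the heart of this family of methods can be sketched on a single synthetic subband. This is a simplified scalar soft-thresholding version, not the paper's BlockShrink block-wise rule or its hybrid SWT/slantlet transform; the sparse coefficient vector and unit noise variance are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in for one transform subband: sparse clean coefficients
# plus white Gaussian noise with known sigma = 1.
clean = np.zeros(512)
clean[:32] = rng.uniform(3.0, 6.0, size=32)
noisy = clean + rng.normal(0.0, 1.0, size=512)

def sure(t, x):
    """Stein's unbiased risk estimate for soft thresholding at t (sigma=1)."""
    n = x.size
    return (n - 2 * np.sum(np.abs(x) <= t)
            + np.sum(np.minimum(np.abs(x), t) ** 2))

def soft(x, t):
    """Soft-threshold: shrink magnitudes by t, zeroing anything below."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

# Choose the threshold that minimizes SURE over a candidate grid.
grid = np.linspace(0.0, 4.0, 81)
t_opt = grid[np.argmin([sure(t, noisy) for t in grid])]

mse_raw = np.mean((noisy - clean) ** 2)
mse_den = np.mean((soft(noisy, t_opt) - clean) ** 2)
print(f"threshold={t_opt:.2f}, MSE before={mse_raw:.3f}, after={mse_den:.3f}")
```

BlockShrink extends this idea by thresholding coefficients in blocks rather than one at a time, with SURE selecting both the block size and the threshold per subband.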