Computer systems and networks are used in almost every aspect of daily life, and the security threats to them have increased significantly. Password-based user authentication is commonly used to verify a legitimate user; however, this method has many weaknesses, such as password sharing, brute-force attacks, dictionary attacks, and guessing. Keystroke dynamics is a well-known and inexpensive behavioral biometric technology that authenticates a user based on an analysis of his or her typing rhythm. Intrusion thereby becomes more difficult, because both the password and the typing rhythm must match the correct keystroke patterns. This thesis considers static keystroke dynamics as a transparent layer for user authentication. A Back-Propagation Neural Network (BPNN) and a Probabilistic Neural Network (PNN) are used as classifiers to discriminate between authentic and impostor users. Furthermore, four keystroke-dynamics features, namely Dwell Time (DT), Flight Time (FT), Up-Up Time (UUT), and a combination of DT and FT, are extracted to verify whether users can be properly authenticated. Two datasets (keystroke-1 and keystroke-2) are used to show the applicability of the proposed keystroke-dynamics user-authentication system. The best results, with the lowest false rates and highest accuracy, are obtained when using UUT compared with the DT and FT features, and are comparable to the combination of DT and FT. This is because UUT is a single direct feature that implicitly contains the other two features, DT and FT, which gives it greater capability to discriminate authentic users from impostors. In addition, authenticating with UUT alone instead of the combination of DT and FT reduces the complexity and computational time of the neural network compared with using the combined DT and FT features.
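As a minimal sketch of the three timing features named above (the event format and function name are illustrative assumptions, not taken from the thesis), DT, FT, and UUT can be computed from per-key press/release timestamps:

```python
# Sketch: keystroke timing features from (press_time, release_time) pairs.
# The event representation below is an illustrative assumption.

def keystroke_features(events):
    """events: list of (press_time, release_time) per key, in seconds."""
    dt = [r - p for p, r in events]                # Dwell Time: hold duration of each key
    ft = [events[i + 1][0] - events[i][1]          # Flight Time: release of key i to press of key i+1
          for i in range(len(events) - 1)]
    uut = [events[i + 1][1] - events[i][1]         # Up-Up Time: release of key i to release of key i+1
           for i in range(len(events) - 1)]
    return dt, ft, uut

dt, ft, uut = keystroke_features([(0.00, 0.10), (0.25, 0.40), (0.55, 0.62)])
```

Note that UUT(i) = FT(i) + DT(i+1), which is the sense in which UUT implicitly contains the other two features.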
Recognizing speech emotions is an important subject in pattern recognition. This work studies the effect of extracting the minimum possible number of features on a speech emotion recognition (SER) system. Three experiments are performed to find the approach that gives the best accuracy. The first extracts only three features from the emotional speech samples: zero-crossing rate (ZCR), mean, and standard deviation (SD); the second extracts only the first 12 Mel-frequency cepstral coefficient (MFCC) features; and the last applies feature fusion between the mentioned features. In all experiments, the features are classified using five types of classification techniques, which are the Random Forest (RF),
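The three simple features of the first experiment can be sketched as below, computed over one frame of raw audio samples (the frame representation is an illustrative assumption; the paper's exact framing and normalization are not specified):

```python
# Sketch: ZCR, mean, and standard deviation of one audio frame.
import math

def simple_features(frame):
    """frame: list of raw audio samples."""
    n = len(frame)
    mean = sum(frame) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in frame) / n)
    # Zero-crossing rate: fraction of adjacent sample pairs with a sign change.
    zcr = sum(1 for a, b in zip(frame, frame[1:]) if a * b < 0) / (n - 1)
    return zcr, mean, sd
```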
Iraq has a huge network of pipelines transporting crude oil and final hydrocarbon products as well as potable water. These networks are exposed to extensive damage from underground corrosion unless suitable protection techniques are used. In this paper we collect cathodic-protection information for pipelines in practical fields (Oil Group in Al Doura) to build a database for understanding and optimizing the design, which is made by simulation of the environmental factors and cathodic-protection variables. Soil resistivity was measured using the Wenner four-terminal method at the survey sites; soil pH investigations recorded for these selected fields were within 7-8; and the anode voltages and their related currents were recorded for
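For reference, the standard Wenner four-terminal relation converts a measured voltage/current ratio into apparent soil resistivity; the sketch below uses illustrative values, not data from the survey:

```python
# Sketch: apparent soil resistivity from a Wenner four-electrode measurement,
#   rho = 2 * pi * a * (V / I)
# with equal electrode spacing a (metres), injected current I, measured voltage V.
import math

def wenner_resistivity(a_m, v_volts, i_amps):
    """Apparent resistivity in ohm-metres for a Wenner array of spacing a_m."""
    return 2.0 * math.pi * a_m * (v_volts / i_amps)
```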
The proliferation of cellular networks has enabled users to track locations through various positioning tools, and location information is continuously captured from mobile phones. A prototype was created that detects location using the two invariant models for the Global System for Mobile communications (GSM) and the Universal Mobile Telecommunications System (UMTS). The smartphone application, on an Android platform, runs location sensing as a background process, and the localization method is based on cell phones. The proposed application is associated with a remote server and is used to track a smartphone without permissions or internet access. The mobile stores location information in a database (SQLite), then transfers it into location AP
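A hedged sketch of the SQLite storage step mentioned above (the table name and columns are illustrative assumptions, not the application's actual schema):

```python
# Sketch: persisting cell-based location fixes in SQLite before upload.
import sqlite3

def open_store(path=":memory:"):
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS fixes (
                      ts REAL, cell_id TEXT, lat REAL, lon REAL)""")
    return db

def record_fix(db, ts, cell_id, lat, lon):
    # Parameterized insert; one row per captured location fix.
    db.execute("INSERT INTO fixes VALUES (?, ?, ?, ?)", (ts, cell_id, lat, lon))
    db.commit()
```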
Used automobile oils were subjected to filtration to remove solid material, and to dehydration by vacuum distillation under moderate pressure to remove water, gasoline, and light components; the dehydrated waste oil was then subjected to extraction using liquid solvents. Two solvents, n-butanol and n-hexane, were used to extract base oil from used automobile oil so that the expensive base oil can be reused.
The base oil recovered with the n-butanol solvent gives an 88.67% reduction in carbon residue, a 75.93% reduction in ash content, 93.73% oil recovery, 95% solvent recovery, and a viscosity index of 100.62, at a 5:1 solvent-to-used-oil ratio and an extraction temperature of 40 °C, while using the n-hexane solvent gives (6
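The percentage figures quoted above follow from simple ratios; as a small illustrative sketch (function names and sample values are assumptions, not the study's data):

```python
# Sketch: how reduction and recovery percentages of this kind are computed.

def percent_reduction(before, after):
    """Reduction of a property (e.g. carbon residue) relative to its initial value."""
    return 100.0 * (before - after) / before

def percent_recovery(recovered, fed):
    """Recovered amount (e.g. base oil or solvent) relative to the amount fed."""
    return 100.0 * recovered / fed
```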
Investigating human mobility patterns is a highly interesting field in the 21st century, attracting vast attention from multi-disciplinary scientists in physics, economics, social science, computer science, engineering, etc., based on the concept that relates human mobility patterns to their communications. Hence, the necessity for a rich repository of data has emerged, and the most powerful solution is the use of GSM network data, which gives millions of Call Detail Records gathered from urban regions. However, the available data still have shortcomings, because they indicate spatio-temporal information only at the moments of mobile communication activity. In th
In this paper we deal with the problem of ciphering and make use of group isomorphism to construct a public-key cipher system, constructing (1) an ElGamal algorithm and (2) a key-exchange algorithm.
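A toy sketch of textbook ElGamal over the multiplicative group mod a prime (the tiny parameters below are for illustration only; this is not the paper's construction, which uses a group isomorphism, and real systems use large primes):

```python
# Toy ElGamal encryption over Z_p* (demo-sized parameters).
import random

P, G = 467, 2  # small public prime and base (illustrative, insecure)

def keygen():
    x = random.randrange(2, P - 1)        # private key
    return x, pow(G, x, P)                # (private, public = g^x mod p)

def encrypt(pub, m):
    k = random.randrange(2, P - 1)        # ephemeral secret
    return pow(G, k, P), (m * pow(pub, k, P)) % P

def decrypt(priv, c1, c2):
    s = pow(c1, priv, P)                  # shared secret g^(x*k) mod p
    return (c2 * pow(s, P - 2, P)) % P    # divide by s via Fermat inverse
```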
In this paper, we used four classification methods to classify objects and compare among these methods: K-Nearest Neighbors (KNN), Stochastic Gradient Descent learning (SGD), Logistic Regression (LR), and Multi-Layer Perceptron (MLP). We used the MCOCO dataset for classifying and detecting the objects; the dataset images were randomly divided into training and testing sets at a ratio of 7:3, respectively. The randomly selected training and testing images were converted from color to gray level, then enhanced using the histogram-equalization method and resized to 20 x 20. Principal component analysis (PCA) was used for feature extraction, and finally the four classification metho
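The histogram-equalization step in the preprocessing pipeline can be sketched in pure Python for an 8-bit gray image (the paper's exact implementation is not specified; this is the standard CDF-remapping form):

```python
# Sketch: histogram equalization of an 8-bit grayscale image given as a
# flat list of pixel values in [0, levels-1].

def equalize(gray, levels=256):
    n = len(gray)
    hist = [0] * levels
    for v in gray:
        hist[v] += 1
    # Cumulative distribution of pixel values.
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    # Remap each pixel through the normalized cumulative histogram.
    return [round((cdf[v] - cdf_min) / (n - cdf_min) * (levels - 1)) for v in gray]
```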
Background: This study aimed to determine the gender of a sample of Iraqi adults using the mesio-distal width of the mandibular canines, the inter-canine width, and the standard mandibular canine index, and to determine the percentage of dimorphism as an aid in forensic dentistry. Materials and methods: The sample included 200 sets of study models belonging to 200 subjects (100 males and 100 females) aged 17-23 years. The mesio-distal crown dimension was measured manually, from the contact points for the mandibular canines (both sides), in addition to the inter-canine width, using a digital vernier. Descriptive statistics were obtained for the measurements for both genders; a paired-sample t-test was used to evaluate the side difference of
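The percentage of dimorphism for a tooth dimension is commonly computed as the relative excess of the male mean over the female mean; as a small sketch (the sample value is illustrative, not the study's result):

```python
# Sketch: sexual dimorphism percentage for a tooth dimension,
#   dimorphism % = (male mean / female mean - 1) * 100

def dimorphism_percent(male_mean, female_mean):
    return (male_mean / female_mean - 1.0) * 100.0
```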
In this paper, a visible image-watermarking algorithm based on the biorthogonal wavelet transform is proposed. The watermark (logo), a binary image, can be embedded in the host gray image by using the coefficient bands of the host image transformed into the biorthogonal transform domain. The logo can be embedded in the top-left corner or spread over the whole host image. A scaling value (α) in the frequency domain is introduced to control the perceptibility of the watermarked image. Experimental results show that this watermarking algorithm gives a visible logo with no losses in the recovery process of the original image; the calculated PSNR values support this. Good robustness against attempts to remove the watermark was s
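A minimal sketch of the α-scaled embedding and recovery steps is given below. The biorthogonal wavelet transform itself is omitted for brevity; `band` stands in for one coefficient band of the transformed host image, which is an illustrative simplification of the paper's scheme:

```python
# Sketch: additive alpha-scaled embedding of a binary logo into one
# coefficient band, and exact recovery when the original band is known.

def embed(band, logo_bits, alpha):
    """band: list of transform coefficients; logo_bits: matching 0/1 list."""
    return [c + alpha * b for c, b in zip(band, logo_bits)]

def recover_logo(watermarked, band, alpha):
    """Invert the embedding given the original coefficient band."""
    return [round((w - c) / alpha) for w, c in zip(watermarked, band)]
```

A larger α makes the logo more visible; a smaller α preserves more of the host image's PSNR.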
In this paper we find the exact solution of Burgers' equation after reducing it to a Bernoulli equation. We compare this solution with that given by Kaya, who used the Adomian decomposition method; the solution given by Chakrone, who used the variational iteration method (VIM); and the solution given by Eq. (5) in the paper of M. Javidi. We observe that our solution is better than theirs.
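As a brief sketch of such a reduction, using the standard traveling-wave substitution (the paper's exact steps may differ):

```latex
u_t + u\,u_x = \nu\,u_{xx}, \qquad u(x,t) = U(\xi),\ \ \xi = x - ct,
```

which, after substituting and integrating once (with the integration constant taken as zero), gives the Bernoulli equation

```latex
\nu U' = \tfrac{1}{2}U^{2} - cU .
```

The substitution $W = 1/U$ linearizes it, yielding the traveling-wave (shock) solution

```latex
U(\xi) = \frac{2c}{1 + K e^{c\xi/\nu}},
```

where $K > 0$ is fixed by the initial condition.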