Computer systems and networks are used in almost every aspect of daily life, and as a result the security threats to them have increased significantly. Password-based authentication is commonly used to verify a legitimate user, but it has many weaknesses, such as password sharing, brute-force attacks, dictionary attacks, and guessing. Keystroke dynamics is a well-known, inexpensive behavioral biometric technology that authenticates a user by analyzing his or her typing rhythm. Intrusion then becomes more difficult, because both the password and the typing rhythm must match the stored keystroke patterns. This thesis considers static keystroke dynamics as a transparent layer for user authentication. A Back Propagation Neural Network (BPNN) and a Probabilistic Neural Network (PNN) are used as classifiers to discriminate between authentic and impostor users. Four keystroke-dynamics features are extracted to verify whether users can be properly authenticated: Dwell Time (DT), Flight Time (FT), Up-Up Time (UUT), and a combination of DT and FT. Two datasets (keystroke-1 and keystroke-2) are used to show the applicability of the proposed keystroke-dynamics user authentication system. The lowest false rates and highest accuracy are obtained with UUT, which outperforms DT and FT individually and is comparable to the combination of DT and FT. This is because UUT is a single direct feature that implicitly contains both DT and FT, giving it greater capability to discriminate authentic users from impostors. In addition, authenticating with UUT alone instead of the DT-FT combination reduces the complexity and computational time of the neural network.
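The three timing features can be sketched as follows. This is a minimal illustration, not the thesis's code; the event format and function names are assumptions.

```python
# Hypothetical sketch: extracting keystroke-dynamics timing features from
# (key, press_time, release_time) events, in typing order, times in seconds.
# DT = hold time of a key; FT = release -> next press; UUT = release -> next release.

def extract_features(events):
    """Return (DT, FT, UUT) lists for a sequence of keystroke events."""
    dt = [release - press for _, press, release in events]
    ft = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    uut = [events[i + 1][2] - events[i][2] for i in range(len(events) - 1)]
    return dt, ft, uut

sample = [("p", 0.00, 0.12), ("a", 0.25, 0.34), ("s", 0.50, 0.61)]
dt, ft, uut = extract_features(sample)
# Note that UUT(i) = FT(i) + DT(i+1): UUT implicitly contains the other two
# features, which is the property the abstract relies on.
```

The resulting feature vectors would then be fed to the BPNN or PNN classifier.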
The basic solution to the problems posed by the huge size of digital images is to employ image compression techniques that reduce image size for efficient storage and fast transmission. In this paper, a new pixel-based scheme is proposed for grayscale image compression. It is a hybrid technique that combines spatial-domain modeling based on minimum residual with the transform-domain Discrete Wavelet Transform (DWT), and it mixes lossless and lossy coding to ensure high performance in terms of both compression ratio and quality. The proposed technique has been applied to a set of standard test images, and the results obtained are encouraging compared with the Joint Photographic Experts Group (JPEG) standard.
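As an illustrative sketch only (not the paper's exact scheme), one level of the Haar wavelet transform on a row of pixels is the simplest instance of the DWT used in such hybrid pipelines; keeping the transform exact is the lossless part, while quantizing or discarding detail coefficients is where the lossy gain comes from.

```python
# One level of the 1-D Haar DWT: split pixels into approximation (low-pass)
# and detail (high-pass) coefficients, with an exact inverse.

def haar_dwt_1d(row):
    """Split a length-2n sequence into approximation and detail halves."""
    approx = [(row[2 * i] + row[2 * i + 1]) / 2 for i in range(len(row) // 2)]
    detail = [(row[2 * i] - row[2 * i + 1]) / 2 for i in range(len(row) // 2)]
    return approx, detail

def haar_idwt_1d(approx, detail):
    """Exact inverse: lossless reconstruction when no coefficient is discarded."""
    row = []
    for a, d in zip(approx, detail):
        row.extend([a + d, a - d])
    return row

a, d = haar_dwt_1d([80, 82, 100, 40])
assert haar_idwt_1d(a, d) == [80, 82, 100, 40]
```

In a real 2-D scheme the transform is applied along rows and columns and repeated on the approximation band for several levels.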
Television white spaces (TVWSs) refer to the unused parts of the spectrum in the very high frequency (VHF) and ultra-high frequency (UHF) bands. TVWSs are frequencies licensed to primary users (PUs) that are not currently in use and are therefore available to secondary users (SUs). There are several ways of implementing TVWS in communications, one of which is the use of a TVWS database (TVWSDB). The primary purpose of a TVWSDB is to protect PUs from interference by SUs. Several geolocation databases are available for this purpose; however, it is unclear whether those databases have a prediction feature that would give the TVWSDB the capability of decreasing the number of inquiries from SUs. With this in mind, the authors present a reinforcement learning-based approach.
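The idea that prediction can decrease inquiries can be illustrated with a deliberately simple hypothetical front end (not the authors' algorithm): once the answer for a location has been observed to be stable, repeat SU inquiries are served from the learned table instead of reaching the database.

```python
# Hypothetical illustration: a predictive front end for a TVWS geolocation
# database. All class and parameter names are assumptions for this sketch.

class PredictiveTVWSDB:
    def __init__(self, db_lookup, min_observations=3):
        self.db_lookup = db_lookup          # the real geolocation-database query
        self.min_observations = min_observations
        self.history = {}                   # location -> list of past answers
        self.db_queries = 0                 # how many inquiries reached the DB

    def free_channels(self, location):
        past = self.history.setdefault(location, [])
        if len(past) >= self.min_observations and all(p == past[0] for p in past):
            return past[0]                  # stable answer: predict, skip the DB
        answer = self.db_lookup(location)   # otherwise fall back to the database
        self.db_queries += 1
        past.append(answer)
        return answer

db = PredictiveTVWSDB(lambda loc: {"loc-A": [21, 27, 39]}.get(loc, []))
for _ in range(10):
    db.free_channels("loc-A")
# Only the first 3 of the 10 inquiries reach the database.
```

A reinforcement-learning approach would replace the fixed stability rule with a learned policy, but the goal of reducing database load is the same.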
Image recognition is one of the most important applications of information processing. In this paper, a comparison of 3-level image recognition techniques is carried out using the discrete wavelet transform (DWT) and the stationary wavelet transform (SWT) in all eight orderings: stationary-stationary-stationary (sss), stationary-stationary-wavelet (ssw), stationary-wavelet-stationary (sws), stationary-wavelet-wavelet (sww), wavelet-stationary-stationary (wss), wavelet-stationary-wavelet (wsw), wavelet-wavelet-stationary (wws), and wavelet-wavelet-wavelet (www). The techniques are compared according to the peak signal-to-noise ratio (PSNR), root mean square error (RMSE), compression ratio (CR), and the coding noise e(n) of each third-level technique.
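The comparison metrics named above have standard definitions, sketched below for 8-bit grayscale images represented as nested lists (the function names are this sketch's, not the paper's).

```python
# Standard image-quality metrics: RMSE, PSNR (MAX = 255 for 8-bit images),
# and compression ratio as original size over compressed size.
import math

def rmse(original, reconstructed):
    diffs = [(a - b) ** 2 for row_a, row_b in zip(original, reconstructed)
             for a, b in zip(row_a, row_b)]
    return math.sqrt(sum(diffs) / len(diffs))

def psnr(original, reconstructed, max_val=255):
    e = rmse(original, reconstructed)
    return float("inf") if e == 0 else 20 * math.log10(max_val / e)

def compression_ratio(original_bits, compressed_bits):
    return original_bits / compressed_bits

a = [[100, 100], [100, 100]]
b = [[101, 99], [100, 100]]
# rmse(a, b) is sqrt(2/4), roughly 0.707, giving a PSNR of about 51.1 dB.
```

Higher PSNR and CR with lower RMSE indicate a better technique, which is how the eight orderings are ranked.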
Advances in digital technology and the World Wide Web have led to an increase in the number of digital documents used for purposes such as publishing and digital libraries. This phenomenon highlights the need for effective techniques to support the search and retrieval of text. One of the most needed tasks is clustering, which automatically categorizes documents into meaningful groups. Clustering is an important task in data mining and machine learning, and its accuracy depends strongly on the choice of text representation method. Traditional methods represent documents as bags of words using term frequency-inverse document frequency (TF-IDF) weighting. This method ignores the relationships between words.
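The traditional bag-of-words TF-IDF representation mentioned above can be sketched minimally as follows, using the standard tf * log(N/df) weighting (function names are this sketch's own).

```python
# Minimal bag-of-words TF-IDF: each document becomes a {term: weight} dict.
import math
from collections import Counter

def tfidf(docs):
    """docs: list of token lists. Returns one {term: weight} dict per document."""
    n = len(docs)
    df = Counter(term for doc in docs for term in set(doc))  # document frequency
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        vectors.append({t: (count / len(doc)) * math.log(n / df[t])
                        for t, count in tf.items()})
    return vectors

docs = [["search", "retrieval", "text"], ["clustering", "text", "mining"]]
vecs = tfidf(docs)
# "text" appears in every document, so its weight is log(2/2) = 0; word order
# and word relationships are discarded entirely, which is the limitation that
# motivates richer representations.
```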
Agent-based modeling is currently used extensively to analyze complex systems. Its growth has been driven by its ability to convey distinct levels of interaction within a complex, detailed environment. At the same time, agent-based models tend to become progressively more complex, so powerful modeling and simulation techniques are needed to address this rise in complexity. In recent years, a number of platforms for developing agent-based models have been developed. In practice, however, most agent-based models use a discrete representation of the environment and a single level of interaction; two or three levels are rarely considered. The key issue is that modellers' work in these areas is not assisted by simulation platforms.
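The typical structure being described, a discrete grid environment with one level of interaction, can be sketched generically (all names here are illustrative and not tied to any particular platform):

```python
# Generic agent-based modeling loop: agents on a discrete grid, stepped in
# sequence each tick. This is the single-level, discrete-environment pattern
# the text says most models use.
import random

class Agent:
    def __init__(self, x, y):
        self.x, self.y = x, y

    def step(self, width, height, rng):
        # One level of interaction with the environment: a bounded random walk.
        self.x = (self.x + rng.choice([-1, 0, 1])) % width
        self.y = (self.y + rng.choice([-1, 0, 1])) % height

def run_model(n_agents=10, width=8, height=8, steps=100, seed=1):
    rng = random.Random(seed)
    agents = [Agent(rng.randrange(width), rng.randrange(height))
              for _ in range(n_agents)]
    for _ in range(steps):
        for agent in agents:
            agent.step(width, height, rng)
    return agents
```

Supporting multiple interaction levels would mean layering further environments and agent-to-agent protocols on top of this loop, which is exactly where platform support is said to be lacking.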
The research aimed at designing a teaching aid for learning the back swing into handstand and at identifying its effect on learning skill performance. The researchers hypothesized statistically significant differences between pre- and post-tests in favor of the research group. They used the experimental method on six 13-16-year-old Baghdad club gymnasts, with a one-group design in which all players performed pre-tests, then trained with special exercises on the teaching aid, and were then given post-tests. The researchers concluded that the teaching aid positively affected learning of the skill and that it was well constructed, withstanding the performance of all gymnasts. They recommended making similar teaching aids.
Orthogonal Frequency Division Multiplexing (OFDM) is a promising technology for next-generation networks. It was selected for its flexible parameters, high spectral efficiency, and immunity to inter-symbol interference (ISI). However, OFDM requires significant digital signal processing, especially in the Inverse/Fast Fourier Transform (IFFT/FFT) block, which performs the orthogonalization/de-orthogonalization of the subcarriers and is the most important part of the OFDM system. It is therefore important to understand how its parameters increase or decrease the FPGA power consumption of the IFFT/FFT. This thesis focuses on the FPGA power consumption of the IFFT/FFT used in the OFDM system.
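The role the IFFT/FFT plays in OFDM can be sketched as a back-to-back transmitter/receiver pair: the transmitter's IDFT maps complex symbols onto orthogonal subcarriers, and the receiver's DFT recovers them. A naive O(N^2) DFT is used here purely for clarity; real implementations use the FFT, and on FPGA its transform size, word length, and architecture are the parameters that drive power consumption.

```python
# Back-to-back OFDM modulation/demodulation, no channel: DFT(IDFT(X)) = X,
# demonstrating the subcarrier orthogonality the IFFT/FFT block provides.
import cmath

def idft(symbols):
    n = len(symbols)
    return [sum(s * cmath.exp(2j * cmath.pi * k * t / n)
                for k, s in enumerate(symbols)) / n
            for t in range(n)]

def dft(samples):
    n = len(samples)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * t / n)
                for t, x in enumerate(samples))
            for k in range(n)]

tx_symbols = [1 + 0j, -1 + 0j, 0 + 1j, 0 - 1j]   # e.g. QPSK-like symbols
rx_symbols = dft(idft(tx_symbols))                # perfect recovery expected
assert all(abs(r - s) < 1e-9 for r, s in zip(rx_symbols, tx_symbols))
```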