In this paper, we used four classification methods to classify objects and compared among these methods: K-Nearest Neighbors (KNN), Stochastic Gradient Descent learning (SGD), Logistic Regression (LR), and Multi-Layer Perceptron (MLP). We used the MCOCO dataset for classifying and detecting the objects; the dataset images were randomly divided into training and testing sets at a ratio of 7:3, respectively. The randomly selected training and testing images were converted from color to gray level, then enhanced using the histogram equalization method and resized to 20 x 20. Principal component analysis (PCA) was used for feature extraction, and finally the four classification methods were applied. The results indicate that MLP was better than the others, with a precision of 81%, although it took the maximum execution time for processing the datasets.
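As a sketch of the preprocessing step mentioned in this abstract, below is a minimal pure-Python histogram equalization for an 8-bit grayscale image; the function name and test image are illustrative, not taken from the paper:

```python
def hist_equalize(img, levels=256):
    """Histogram equalization for a 2D grayscale image (lists of ints)."""
    flat = [p for row in img for p in row]
    n = len(flat)
    hist = [0] * levels
    for p in flat:
        hist[p] += 1
    # cumulative distribution function of the gray levels
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    if n == cdf_min:  # flat image: nothing to stretch
        return [row[:] for row in img]
    scale = (levels - 1) / (n - cdf_min)
    return [[round((cdf[p] - cdf_min) * scale) for p in row] for row in img]
```

A low-contrast image (gray levels clustered around 100) is remapped so its levels span the full 0–255 range, which is the enhancement effect the abstract relies on before PCA.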
Honeywords are fake passwords that serve as an accompaniment to the real password, which is called a “sugarword.” The honeyword system is an effective password cracking detection system designed to easily detect password cracking in order to improve the security of hashed passwords. For every user, the password file of the honeyword system will have one real hashed password accompanied by numerous fake hashed passwords. If an intruder steals the password file from the system and successfully cracks the passwords while attempting to log in to users’ accounts, the honeyword system will detect this attempt through the honeychecker. A honeychecker is an auxiliary server that distinguishes the real password from the fake passwords and t
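The detection flow described above can be sketched as follows; the class names, hash choice, and return values are assumptions for illustration, not the paper's implementation:

```python
import hashlib

def h(pw):
    return hashlib.sha256(pw.encode()).hexdigest()

class Honeychecker:
    """Auxiliary server: stores only (user -> index of the real sweetword)."""
    def __init__(self):
        self.index = {}
    def set(self, user, i):
        self.index[user] = i
    def check(self, user, i):
        return self.index.get(user) == i

class LoginServer:
    """Stores, per user, k hashed sweetwords (1 sugarword + k-1 honeywords)."""
    def __init__(self, checker):
        self.sweetwords = {}
        self.checker = checker
    def register(self, user, real_pw, honeywords, real_index):
        words = list(honeywords)
        words.insert(real_index, real_pw)  # hide the real password among fakes
        self.sweetwords[user] = [h(w) for w in words]
        self.checker.set(user, real_index)
    def login(self, user, pw):
        hashes = self.sweetwords.get(user, [])
        hp = h(pw)
        if hp not in hashes:
            return "reject"  # not a sweetword at all
        i = hashes.index(hp)
        if self.checker.check(user, i):
            return "accept"  # the real sugarword
        return "alarm"       # a honeyword was used -> cracking detected
```

Only the honeychecker knows which index is real, so an attacker who steals and cracks the password file cannot tell the sugarword from the honeywords, and using any honeyword raises the alarm.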
This paper adapts a neural network for estimating the direction of arrival (DOA). It uses an unsupervised adaptive neural network with the GHA algorithm to extract the principal components, which in turn are used by the Capon method to estimate the DOA; through the PCA neural network, only the signal subspace is taken and used in Capon (i.e., the noise subspace is ignored and only the signal subspace is used).
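As an illustration of Hebbian PCA, here is the single-component special case of GHA (Oja's rule); the data, learning rate, and function name are invented for this sketch, and the full GHA extracts several components rather than one:

```python
import random

def gha_first_component(data, lr=0.01, epochs=500, seed=1):
    """Oja's rule, the single-neuron case of GHA: w converges to the
    first principal component (unit norm) of zero-mean 2D data."""
    rng = random.Random(seed)
    w = [rng.random(), rng.random()]
    for _ in range(epochs):
        for x in data:
            y = w[0] * x[0] + w[1] * x[1]  # neuron output
            # Hebbian term y*x with Oja's normalizing term -y^2*w
            w = [w[0] + lr * y * (x[0] - y * w[0]),
                 w[1] + lr * y * (x[1] - y * w[1])]
    return w
```

Trained on zero-mean data lying mostly along the direction (1, 1), the weight vector converges to a unit vector along that principal direction, which is the signal-subspace basis the abstract feeds into Capon.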
The crude enzyme Nattokinase produced by Bacillus subtilis was used in ripening cheddar cheese by adding three concentrations of enzyme (80, 160, and 320 mg/kg) alongside a control treatment without enzyme. The product was checked for three months to determine humidity, protein, fat, non-protein nitrogen, soluble nitrogen, and pH, and a sensory evaluation was conducted. It was noticed that the variation in protein percentages and the soluble nitrogen percentage during the second month of ripening for the T2, T3, and T4 treatments was 11.2%, 15.54%, and 18.48%, respectively, compared with the control, which was 7.6%, while in the third month it was 17.37%, 20.67%, and 22.26%, respectively, compared with the control, which was only 10%. On the other hand, non-protein
Abstract
This study aims to identify the degree to which the first cycle teachers use different feedback patterns in the e-learning system in addition to the differences in the degree of use according to specialization, teaching experience, and in-service training in the field of classroom assessment, as well as the interaction between them. The study sample consisted of (350) female teachers of the first cycle in government schools in Muscat Governorate for the academic year 2020/2021. The study used a questionnaire that contained four different patterns of feedback, which are reinforcement, informative, corrective, and interpretive feedback. The psychometric properties of the que
Human posture estimation is a crucial topic in the computer vision field and has become a research hotspot for much work related to human behavior. Human pose estimation can be understood as the problem of recognizing and connecting human key points. The paper presents an optimized symmetric spatial transformation network designed to connect with a single-person pose estimation network to propose high-quality human target frames from inaccurate human bounding boxes, introduces parametric pose non-maximal suppression to eliminate redundant pose estimations, and applies an elimination rule to remove similar poses and obtain unique human pose estimation results. The experimental results demonstrate that the proposed technique can pre
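A greedy pose non-maximal suppression of the general kind described can be sketched as below; the pose format, distance measure, and threshold are simplifications assumed for illustration, not the paper's parametric criterion:

```python
def pose_distance(p, q):
    """Mean Euclidean distance between corresponding keypoints of two poses."""
    return sum(((px - qx) ** 2 + (py - qy) ** 2) ** 0.5
               for (px, py), (qx, qy) in zip(p, q)) / len(p)

def pose_nms(poses, scores, thresh=1.0):
    """Greedy NMS: keep the highest-scoring pose, drop near-duplicates."""
    order = sorted(range(len(poses)), key=lambda i: -scores[i])
    keep = []
    for i in order:
        if all(pose_distance(poses[i], poses[j]) > thresh for j in keep):
            keep.append(i)
    return keep
```

Two detections of the same person produce nearly identical keypoint sets, so the lower-scoring duplicate is suppressed while a genuinely different person survives.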
Digital image manipulation has become increasingly prevalent due to the widespread availability of sophisticated image editing tools. In copy-move forgery, a portion of an image is copied and pasted into another area within the same image. The proposed methodology begins with extracting the image's Local Binary Pattern (LBP) features. Two main statistical functions, Standard Deviation (STD) and Angular Second Moment (ASM), are computed for each LBP feature, capturing additional statistical information about the local textures. Next, a multi-level LBP feature selection is applied to select the most relevant features. This process involves performing LBP computation at multiple scales or levels, capturing textures at different
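The LBP-plus-statistics step can be sketched as follows; the basic 3x3 LBP and the histogram-based STD/ASM shown here are a common formulation assumed for illustration, not necessarily the paper's exact variant:

```python
def lbp(img):
    """Basic 3x3 LBP codes for the interior pixels of a 2D grayscale image."""
    H, W = len(img), len(img[0])
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = []
    for y in range(1, H - 1):
        for x in range(1, W - 1):
            c = img[y][x]
            code = 0
            for bit, (dy, dx) in enumerate(offs):
                if img[y + dy][x + dx] >= c:  # threshold neighbor vs center
                    code |= 1 << bit
            codes.append(code)
    return codes

def lbp_stats(codes, bins=256):
    """STD and ASM of the normalized LBP code histogram."""
    hist = [0] * bins
    for c in codes:
        hist[c] += 1
    n = len(codes)
    p = [hc / n for hc in hist]
    mean = sum(i * pi for i, pi in enumerate(p))
    std = sum(pi * (i - mean) ** 2 for i, pi in enumerate(p)) ** 0.5
    asm = sum(pi * pi for pi in p)  # angular second moment (uniformity)
    return std, asm
```

A perfectly uniform texture concentrates the whole histogram in one bin, giving zero spread (STD = 0) and maximal uniformity (ASM = 1), which is why these two statistics summarize local texture.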
In the present study, twenty samples of human urine were taken from healthy males and females with different ages, occupations, and places of residence. These samples were collected from the hospital to measure the concentration of radon gas in human urine using one of the solid-state nuclear track detectors, LR-115. The concentrations of radon obtained in healthy human urine vary from 2.12×10⁻³ Bq·l⁻¹ to 4.42×10⁻³ Bq·l⁻¹, and these values are below the allowed limit of 12.3×10⁻³ Bq·l⁻¹.
The most popular medium used by people on the internet nowadays is video streaming. Nevertheless, streaming video consumes much of the internet's traffic: video streaming accounts for nearly 70% of internet usage. Some constraints of interactive media, such as increased bandwidth usage and latency, might be removed. The need for real-time transmission of live video streaming leads to employing fog computing technologies, an intermediary layer between the cloud and the end user. The latter technology has been introduced to alleviate those problems by providing high real-time responsiveness and computational resources near to the
Association rule mining (ARM) is a fundamental and widely used data mining technique for obtaining useful information about data. Traditional ARM algorithms degrade computational efficiency by mining too many association rules that are not appropriate for a given user. Recent research in ARM investigates the use of metaheuristic algorithms, which look for only a subset of high-quality rules. In this paper, a modified discrete cuckoo search algorithm for association rule mining (DCS-ARM) is proposed for this purpose. The effectiveness of our algorithm is tested against a set of well-known transactional databases. Results indicate that the proposed algorithm outperforms the existing metaheuristic methods.
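The rule-quality objective that metaheuristics such as DCS-ARM typically maximize can be sketched with the standard support and confidence measures; the fitness weighting below is an assumption for illustration, not the paper's exact function:

```python
def support(db, items):
    """Fraction of transactions (sets) containing all the given items."""
    items = set(items)
    return sum(1 for t in db if items <= t) / len(db)

def confidence(db, ante, cons):
    """Confidence of the rule ante -> cons."""
    s_a = support(db, ante)
    return support(db, set(ante) | set(cons)) / s_a if s_a else 0.0

def fitness(db, ante, cons, alpha=0.5):
    """Weighted mix of rule support and confidence -- the kind of
    objective a metaheuristic searches over (weights assumed)."""
    rule_sup = support(db, set(ante) | set(cons))
    return alpha * rule_sup + (1 - alpha) * confidence(db, ante, cons)
```

A metaheuristic such as discrete cuckoo search encodes candidate rules as solutions and keeps only those with high fitness, instead of enumerating every frequent itemset as classical ARM algorithms do.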