A hand gesture recognition system provides a robust and innovative solution for nonverbal communication through human-computer interaction. Deep learning models have excellent potential for use in recognition applications. To overcome related issues, most previous studies have proposed new model architectures or fine-tuned pre-trained models. Furthermore, these studies relied on a single standard dataset for both training and testing; thus, the accuracies reported in these studies are reasonable. Unlike those works, the current study investigates two deep learning models with intermediate layers for recognizing static hand gesture images. Both models were tested on different datasets, adjusted to suit each dataset, and trained with different strategies. First, the models were initialized with random weights and trained from scratch. Next, the pre-trained models were examined as feature extractors. Finally, the pre-trained models were fine-tuned with intermediate layers. Fine-tuning was conducted at three levels: the fifth, fourth, and third blocks. The models were evaluated through recognition experiments using Arabic Sign Language hand gesture images acquired under different conditions. This study also contributes a new hand gesture image dataset, which was used in these experiments together with two other datasets. The experimental results indicate that the proposed models can be used with intermediate layers to recognize hand gesture images. Furthermore, the analysis shows that fine-tuning the fifth and fourth blocks of the two models achieved the best accuracy. In particular, the testing accuracies on the three datasets were 96.51%, 72.65%, and 55.62% when fine-tuning the fourth block and 96.50%, 67.03%, and 61.09% when fine-tuning the fifth block of the first model. The testing accuracies of the second model were approximately similar.
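The abstract does not name its two backbone networks. Purely as an illustration, the sketch below assumes a VGG16-style backbone in Keras and shows how "fine-tuning the fifth block" can be expressed by unfreezing only the block5 layers of a pre-trained network; the input shape, classifier head, and number of classes are assumptions, not details from the paper.

```python
# Hedged sketch: fine-tuning only the fifth convolutional block of a
# VGG-style backbone for static hand-gesture classification.
# The paper does not name its models; VGG16 is assumed here purely for
# illustration, and num_classes / input_shape are placeholders.
import tensorflow as tf

num_classes = 32  # e.g. Arabic sign-language letters (assumed)

base = tf.keras.applications.VGG16(weights="imagenet", include_top=False,
                                   input_shape=(224, 224, 3))

# Freeze everything, then unfreeze the fifth block only
# (fine-tuning the fourth block would also unfreeze "block4_*" layers).
for layer in base.layers:
    layer.trainable = layer.name.startswith("block5")

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```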
The grey system model GM(1,1) is a time-series prediction model and the basis of grey theory. This research presents four methods for estimating the parameters of the grey model GM(1,1): the accumulative method (ACC), the exponential method (EXP), the modified exponential method (Mod EXP), and the particle swarm optimization method (PSO). These methods were compared using the mean square error (MSE) and the mean absolute percentage error (MAPE) as comparison criteria, and simulation was used to identify the best of the four methods, which was then applied to real data. These data represent the consumption rates of two types of oils …
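For context on the parameter-estimation task being compared, the following is a minimal sketch of the standard least-squares GM(1,1) baseline (accumulated generating operation, background values, whitening equation) together with the MAPE criterion; the ACC, EXP, Mod EXP, and PSO estimators from the paper are not reproduced, and the sample series is hypothetical.

```python
# Minimal sketch: standard least-squares GM(1,1) fit and MAPE, on a
# hypothetical consumption series; the ACC/EXP/Mod EXP/PSO estimators
# from the paper would replace the least-squares step below.
import numpy as np

def gm11_fit_predict(x0, horizon=0):
    """Fit GM(1,1) to series x0 and return fitted plus forecast values."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                               # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])                    # background values
    B = np.column_stack([-z1, np.ones(len(z1))])
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]      # develop coefficient a, grey input b
    k = np.arange(len(x0) + horizon)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    return np.concatenate([[x0[0]], np.diff(x1_hat)])  # inverse AGO

def mape(actual, predicted):
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return 100.0 * np.mean(np.abs((actual - predicted) / actual))

series = [56.1, 58.4, 61.0, 63.8, 66.9, 70.3]        # hypothetical oil-consumption data
fitted = gm11_fit_predict(series, horizon=2)
print("MAPE (%):", round(mape(series, fitted[:len(series)]), 2))
print("Two-step forecast:", np.round(fitted[-2:], 2))
```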
The objectives of this research are to determine the actual crop structure of the greenhouses of the Al-Watan association in order to establish the optimal use of the available economic resources and to reach an optimal farm crop structure that maximizes profit and gross and net farm incomes. Linear programming is used to select the optimal farm plan with the highest net income. Efficient farm production plans with an optimal income-deviation trade-off (E-A) are also identified for the association, taking into account the risk margin derived from each plan using the MOTAD model, one of the alternative linear programming models …
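As a rough illustration of the linear-programming step only (not the MOTAD risk model), the sketch below solves a small profit-maximizing crop-mix problem with scipy.optimize.linprog; all crops, gross margins, and resource coefficients are hypothetical and are not taken from the Al-Watan association data.

```python
# Hedged sketch: profit-maximizing crop mix via linear programming.
# All gross margins and resource coefficients are hypothetical; the MOTAD
# (income-deviation) risk model from the paper is not reproduced here.
from scipy.optimize import linprog

# Decision variables: greenhouse area (dunums) of three assumed crops
gross_margin = [1200.0, 950.0, 1100.0]       # net return per dunum (hypothetical)
c = [-g for g in gross_margin]               # linprog minimizes, so negate to maximize

A_ub = [
    [1.0, 1.0, 1.0],                          # total greenhouse area available
    [35.0, 28.0, 35.0],                       # labour hours per dunum
    [4.2, 3.8, 4.0],                          # irrigation water units per dunum
]
b_ub = [20.0, 640.0, 80.0]                    # resource endowments (hypothetical)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3, method="highs")
print("Optimal plan (dunums):", [round(x, 2) for x in res.x])
print("Maximum net income:", round(-res.fun, 2))
```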
Objective(s): To determine the impact of psychological distress on coping with breast cancer among women.
Methodology: A descriptive design was adopted throughout the present study. A convenience sample of (60) women with breast cancer was recruited from the community. Two instruments, a psychological distress scale and a coping scale, were developed for the study. Internal consistency reliability and content validity were established for the study instruments. Data were collected through the application of the study instruments and analyzed using descriptive and inferential statistical approaches.
Results: The study findings depict that women with breast cancer have experienced …
This research presents an experimental investigation of the rehabilitation efficiency of damaged hybrid reinforced concrete beams with openings in the shear region. The study examines how the retrofitting ability of hybrid beams differs from that of traditional beams, and the effect of two openings compared with one opening of equal total area. Five RC beams, classified into two groups, A and B, were first tested to full failure under two-point loading. The first group (A) contained beams with normal-weight concrete. The second group (hybrid) included beams with lightweight concrete for the web and bottom flange, whereas the top flange was made from normal-weight concrete. Two types of openings were considered …
On the Internet, nothing is secure, and since we need means of protecting our data, the use of passwords has become important in the electronic world. To prevent hacking and to protect databases that contain important information such as ID cards and banking information, the proposed system stores the username after hashing it with the SHA-256 algorithm, and strong passwords are saved in a form that repels attackers using one of two methods. The first method adds a random salt to the password using a CSPRNG, then hashes it with SHA-256 and stores it on the website. The second method uses the PBKDF2 algorithm, which salts and stretches the passwords (deriving a key from the password) before they are hashed …
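A minimal sketch of the two storage schemes described above, using only Python's standard library; the salt length, iteration count, and record format are assumptions rather than details from the paper.

```python
# Minimal sketch of the two password-storage schemes described above.
# Salt length, iteration count, and record format are assumptions.
import hashlib
import secrets

def store_with_salted_sha256(password: str) -> dict:
    """Method 1: random salt from a CSPRNG, then a single SHA-256 hash."""
    salt = secrets.token_bytes(16)                       # cryptographically secure random salt
    digest = hashlib.sha256(salt + password.encode("utf-8")).hexdigest()
    return {"salt": salt.hex(), "hash": digest}

def store_with_pbkdf2(password: str, iterations: int = 200_000) -> dict:
    """Method 2: PBKDF2 key derivation (salted and stretched)."""
    salt = secrets.token_bytes(16)
    derived = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, iterations)
    return {"salt": salt.hex(), "hash": derived.hex(), "iterations": iterations}

def verify_pbkdf2(password: str, record: dict) -> bool:
    """Re-derive the key from the candidate password and compare in constant time."""
    derived = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"),
                                  bytes.fromhex(record["salt"]), record["iterations"])
    return secrets.compare_digest(derived.hex(), record["hash"])

print(store_with_salted_sha256("S3cure!pass"))
print(verify_pbkdf2("S3cure!pass", store_with_pbkdf2("S3cure!pass")))
```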
In many scientific fields, Bayesian models have been commonly used in recent research. This research presents a new Bayesian model for parameter estimation and forecasting using the Gibbs sampling algorithm. Posterior distributions are generated using the inverse gamma distribution and the multivariate normal distribution as prior distributions. The new method was used to investigate and summarize the posterior distribution of the Bayesian model. The theory and derivation of the posterior distribution are explained in detail in this paper. The proposed approach is applied to three simulated datasets with sample sizes of 100, 300, and 500. The procedure was also extended to a real dataset, the rock intensity dataset. The actual dataset was collected …
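A minimal sketch of a Gibbs sampler for a Bayesian linear model with the priors named above (a multivariate normal prior on the coefficients and an inverse-gamma prior on the error variance); the hyperparameters and simulated data are illustrative assumptions, and this is not the paper's exact derivation.

```python
# Hedged sketch: Gibbs sampler for Bayesian linear regression with a
# multivariate normal prior on beta and an inverse-gamma prior on sigma^2.
# Hyperparameters and the simulated data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Simulated data (hypothetical), analogous to the paper's simulation study
n, p = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([2.0, -1.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.8, size=n)

# Priors: beta ~ N(mu0, V0), sigma^2 ~ Inv-Gamma(a0, b0)
mu0, V0_inv = np.zeros(p), np.eye(p) / 100.0
a0, b0 = 2.0, 1.0

beta, sigma2 = np.zeros(p), 1.0
draws = []
for it in range(5000):
    # beta | sigma^2, y  ~  multivariate normal (conjugate update)
    Vn = np.linalg.inv(X.T @ X / sigma2 + V0_inv)
    mn = Vn @ (X.T @ y / sigma2 + V0_inv @ mu0)
    beta = rng.multivariate_normal(mn, Vn)
    # sigma^2 | beta, y  ~  inverse gamma (conjugate update)
    resid = y - X @ beta
    sigma2 = 1.0 / rng.gamma(a0 + n / 2.0, 1.0 / (b0 + 0.5 * resid @ resid))
    if it >= 1000:                            # discard burn-in
        draws.append(np.append(beta, sigma2))

posterior = np.array(draws)
print("Posterior means [beta0, beta1, beta2, sigma^2]:", posterior.mean(axis=0).round(3))
```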
Ex-situ bioremediation of 2,4-D herbicide-contaminated soil was studied using a slurry bioreactor operated under aerobic conditions. The performance of the slurry bioreactor was tested for three types of soil (sand, sandy loam, and clay) contaminated with different concentrations of 2,4-D: 200, 300, and 500 mg/kg soil. Sewage sludge was used as an inexpensive source of microorganisms, as it is available in large quantities in wastewater treatment plants. The results show that all biodegradation experiments demonstrated significant decreases in 2,4-D concentration in the tested soils. The degradation efficiency in the slurry bioreactor decreases as the initial concentration of 2,4-D in the soil increases. A 100% removal was achieved at an initial concentration …
Audio Compression Using Transform Coding with LZW and Double Shift Coding, Zainab J. Ahmed & Loay E. George (Communications in Computer and Information Science, volume 1511, 2022). The need for audio compression is still a vital issue because of its significance in reducing the size of one of the most common digital media exchanged between distant parties. In this paper, the efficiencies of two audio compression modules were investigated; the first module is based on the discrete cosine transform and the second module is based on the discrete wavelet transform …
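A rough sketch of the first module's pipeline as described (block DCT followed by quantization and LZW dictionary coding); the frame length, quantization step, and test signal are assumptions, and the paper's double shift coding stage is not reproduced.

```python
# Hedged sketch: block DCT -> uniform quantization -> LZW coding of an
# audio signal. Frame size, quantization step, and the test tone are
# illustrative assumptions; double shift coding is not implemented here.
import numpy as np
from scipy.fftpack import dct

def lzw_encode(symbols):
    """Classic LZW over a sequence of integer symbols (treated as strings)."""
    dictionary = {str(s): i for i, s in enumerate(sorted(set(symbols)))}
    w, codes = "", []
    for s in map(str, symbols):
        ws = w + "," + s if w else s
        if ws in dictionary:
            w = ws
        else:
            codes.append(dictionary[w])
            dictionary[ws] = len(dictionary)
            w = s
    if w:
        codes.append(dictionary[w])
    return codes

frame, step = 512, 0.02                               # assumed frame length and quant step
t = np.arange(frame * 8) / 8000.0
signal = 0.6 * np.sin(2 * np.pi * 440 * t)            # hypothetical test tone
blocks = signal.reshape(-1, frame)
coeffs = dct(blocks, type=2, norm="ortho", axis=1)    # transform coding stage
quantized = np.round(coeffs / step).astype(int).ravel()
codes = lzw_encode(quantized.tolist())
print(f"{quantized.size} symbols -> {len(codes)} LZW codes")
```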