A hand gesture recognition system provides a robust and innovative solution for nonverbal communication through human–computer interaction. Deep learning models have excellent potential for use in recognition applications. To overcome related issues, most previous studies have proposed new model architectures or have fine-tuned pre-trained models. Furthermore, these studies relied on one standard dataset for both training and testing; thus, the accuracy of these studies is reasonable. Unlike those works, the current study investigates two deep learning models with intermediate layers to recognize static hand gesture images. Both models were tested on different datasets, adjusted to suit each dataset, and then trained under different methods. First, the models were initialized with random weights and trained from scratch. Afterward, the pre-trained models were examined as feature extractors. Finally, the pre-trained models were fine-tuned with intermediate layers. Fine-tuning was conducted on three levels: the fifth, fourth, and third blocks, respectively. The models were evaluated through recognition experiments using hand gesture images in Arabic sign language acquired under different conditions. This study also provides a new hand gesture image dataset used in these experiments, in addition to two other datasets. The experimental results indicated that the proposed models can be used with intermediate layers to recognize hand gesture images. Furthermore, the analysis of the results showed that fine-tuning the fifth and fourth blocks of these two models achieved the best accuracy. In particular, the testing accuracies on the three datasets were 96.51%, 72.65%, and 55.62% when fine-tuning the fourth block and 96.50%, 67.03%, and 61.09% when fine-tuning the fifth block for the first model. The testing accuracy for the second model showed approximately similar results.
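The block-level fine-tuning described above can be sketched framework-agnostically: given VGG-style layer names, everything before the chosen block is frozen (kept as a fixed feature extractor) and everything from that block onward is trained. The layer naming below follows the standard VGG-16 convention and is illustrative, not the authors' exact code.

```python
# Block-level fine-tuning selector: mark layers from the chosen block
# onward as trainable and freeze the rest.
# Layer names follow the VGG-16 convention (block1_conv1 ... block5_conv3).

VGG16_LAYERS = [
    f"block{b}_conv{c}"
    for b, convs in [(1, 2), (2, 2), (3, 3), (4, 3), (5, 3)]
    for c in range(1, convs + 1)
]

def trainable_flags(layers, fine_tune_from_block):
    """Return {layer_name: bool}: True if the layer is updated during training."""
    flags = {}
    for name in layers:
        block = int(name.split("_")[0].removeprefix("block"))
        flags[name] = block >= fine_tune_from_block
    return flags

# Fine-tuning from the fourth block leaves blocks 1-3 frozen (feature
# extractor) and updates blocks 4-5 plus the new classifier head.
flags = trainable_flags(VGG16_LAYERS, fine_tune_from_block=4)
```

The same selector with `fine_tune_from_block=5` or `3` reproduces the other two fine-tuning levels mentioned in the abstract.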
The aim of the study is to assess the risk factors that lead to myocardial infarction and their relation to some variables. The field study was carried out from the 1st of April to the end of September 2005. The sample of the study consisted of (100) patients in Ibn-Albeetar and Baghdad Teaching Hospital. The results of the study indicated the following: 45% of patients in the age group (41-50) were the most exposed to the disease, and no significant difference was seen in the level of education, marital status, weight, and height. The results show that there are significant differences in risk factors such as hypertension, blood cholesterol level, and diabetes when analyzed by t-test at the level of P < 0.01, and there is a significant difference in smoking…
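The group comparisons above rest on the two-sample t statistic. A minimal stdlib sketch of the pooled-variance form is below; the sample values are hypothetical, not the study's data.

```python
import math

def two_sample_t(a, b):
    """Pooled-variance two-sample t statistic, as used to compare a risk
    factor (e.g. cholesterol level) between two patient groups."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    return (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))
```

The statistic is then compared against the t distribution's critical value at the chosen significance level (P < 0.01 in the study).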
A comparison of double informative and non-informative priors assumed for the parameter of the Rayleigh distribution is considered. Three different sets of double priors are included for a single unknown parameter of the Rayleigh distribution: the square root inverted gamma (SRIG) and the natural conjugate family of priors, the square root inverted gamma and the non-informative distribution, and the natural conjugate family of priors and the non-informative distribution. The data are generated from three cases of the Rayleigh distribution for different sample sizes (small, medium, and large), and Bayes estimators for the parameter are derived under a squared error…
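As a minimal sketch of the squared-error Bayes estimation step: if the Rayleigh scale-squared λ = σ² is given an inverse-gamma (α, β) prior (the conjugate choice corresponding to a square-root-inverted-gamma prior on σ), the posterior is inverse-gamma(α + n, β + Σxᵢ²/2), and the Bayes estimator under squared error loss is the posterior mean. The parametrization here is an assumption for illustration, not necessarily the paper's exact one.

```python
def bayes_rayleigh_scale2(data, alpha, beta):
    """Posterior mean of sigma^2 for Rayleigh data under an inverse-gamma
    (alpha, beta) prior -- the squared-error-loss Bayes estimator.
    Posterior: IG(alpha + n, beta + sum(x_i^2)/2); mean = (beta + T)/(alpha + n - 1).
    """
    n = len(data)
    t = sum(x * x for x in data) / 2.0  # sufficient statistic T
    return (beta + t) / (alpha + n - 1)
```

Comparing such estimators across the three prior pairs and sample sizes is the core of the simulation study the abstract describes.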
Cancer is in general not the result of an abnormality in a single gene but a consequence of changes in many genes; it is therefore of great importance to understand the roles of different oncogenic and tumor-suppressor pathways in tumorigenesis. In recent years, many computational models have been developed to study the genetic alterations of different pathways in the evolutionary process of cancer. However, most of the methods are knowledge-based enrichment analyses and are inflexible for analyzing user-defined pathways or gene sets. In this paper, we develop a nonparametric and data-driven approach to testing for the dynamic changes of pathways over cancer progression. Our method is based on an expansion and refinement of the pathway bei…
Image classification is the process of finding common features in images from various classes and applying them to categorize and label the images. The main problems of image classification are the abundance of images, the high complexity of the data, and the shortage of labeled data, which present the key obstacles. The cornerstone of image classification is evaluating the convolutional features retrieved from deep learning models and training machine learning classifiers on them. This study proposes a new "hybrid learning" approach that combines deep learning with machine learning for image classification, based on convolutional feature extraction using the VGG-16 deep learning model and seven class…
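The hybrid pipeline has two stages: a frozen deep feature extractor, then a conventional machine-learning classifier fitted on the extracted features. The toy sketch below substitutes a fixed random projection for VGG-16's convolutional base and a nearest-centroid rule for the classifier, purely to show the two-stage structure; all data and dimensions are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "convolutional" feature extractor: a fixed (frozen) random
# projection plays the role of VGG-16's frozen convolutional base.
W = rng.normal(size=(64, 16))

def extract_features(images):           # images: (n, 64) flattened inputs
    return np.maximum(images @ W, 0.0)  # ReLU, as in a conv feature map

# Simple machine-learning classifier on top: nearest class centroid.
def fit_centroids(feats, labels):
    return {c: feats[labels == c].mean(axis=0) for c in np.unique(labels)}

def predict(centroids, feats):
    classes = sorted(centroids)
    d = np.stack([np.linalg.norm(feats - centroids[c], axis=1) for c in classes])
    return np.array(classes)[d.argmin(axis=0)]

# Two toy classes that are separable after projection.
x0 = rng.normal(0.0, 0.1, size=(20, 64))
x1 = rng.normal(2.0, 0.1, size=(20, 64))
X = np.vstack([x0, x1])
y = np.array([0] * 20 + [1] * 20)

F = extract_features(X)          # stage 1: deep features (frozen)
model = fit_centroids(F, y)      # stage 2: ML classifier on features
acc = (predict(model, F) == y).mean()
```

In the study itself the second stage is one of several standard classifiers (e.g. SVM, k-NN) rather than the nearest-centroid rule used here for brevity.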
Objectives: To determine the quality of life (QoL) of patients with a permanent pacemaker and to find out the relationship between these patients' QoL and their sociodemographic characteristics such as age, gender, level of education, and occupation.
Methodology: A purposive non-probability sample of (62) patients with a permanent pacemaker was involved in this study. The developed questionnaire consists of (4) parts: 1. a demographic data form, 2. a disease-related information form, 3. a socioeconomic data form, and 4. a permanent pacemaker patient's quality of life questionnaire data form. The validity and reliability of the questionnaire were determined through the application of a pilot study. A descriptive statistical a…
The analysis of survival and reliability is considered among the topics and methods of vital statistics at the present time because of its importance in various demographic, medical, industrial, and engineering fields. This research focuses on generating random sample data from the Generalized Gamma (GG) probability distribution using the Inverse Transformation Method (ITM). Because the distribution's cumulative function involves an incomplete Gamma integral, classical estimation becomes more difficult, so a numerical approximation method is illustrated and the survival function is then estimated. The survival function was estimated by Monte Carlo simulation. The Entropy method was used for the…
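A minimal sketch of the Monte Carlo survival estimate is below. It assumes one common GG parametrization, f(x) ∝ x^(d−1) exp(−(x/a)^p), under which X = a·G^(1/p) with G ~ Gamma(d/p, 1); this transformation shortcut sidesteps the incomplete-Gamma inversion that the paper's ITM handles numerically.

```python
import random

def gg_sample(a, d, p, rng):
    """Generalized Gamma draw via X = a * G**(1/p), G ~ Gamma(d/p, 1).
    (One common GG parametrization; the paper's ITM instead inverts the
    incomplete-Gamma CDF numerically.)"""
    return a * rng.gammavariate(d / p, 1.0) ** (1.0 / p)

def mc_survival(t, a, d, p, n=100_000, seed=0):
    """Monte Carlo estimate of the survival function S(t) = P(X > t):
    the fraction of simulated lifetimes exceeding t."""
    rng = random.Random(seed)
    return sum(gg_sample(a, d, p, rng) > t for _ in range(n)) / n
```

For a = d = p = 1 the GG reduces to the exponential distribution, so the estimate can be checked against S(t) = e^(−t).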
Evolutionary algorithms (EAs), as global search methods, have proved more robust than their local-heuristic counterparts for detecting protein complexes in protein-protein interaction (PPI) networks. Typically, the robustness of these EAs comes from their components and parameters. These components are solution representation, selection, crossover, and mutation. Unfortunately, almost all EA-based complex detection methods suggested in the literature were designed with only canonical or traditional components. Further, the topological structure of the protein network is the main information used in the design of almost all such components. The main contribution of this paper is to formulate a more robust E…
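For reference, the canonical EA components the abstract critiques can be sketched as a generic loop: binary membership vectors, tournament selection, one-point crossover, and bit-flip mutation. The fitness function here is a placeholder, not a real PPI-complex objective.

```python
import random

def evolve(fitness, n_genes=20, pop_size=30, gens=50, seed=1):
    """Canonical EA skeleton: binary encoding, tournament selection,
    one-point crossover, single bit-flip mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            # Tournament selection: best of 2 random individuals, twice.
            a, b = (max(rng.sample(pop, 2), key=fitness) for _ in range(2))
            cut = rng.randrange(1, n_genes)
            child = a[:cut] + b[cut:]        # one-point crossover
            i = rng.randrange(n_genes)
            child[i] ^= 1                    # bit-flip mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)
```

In complex detection, each bit would mark a protein's membership in a candidate complex, and the fitness would score the induced subgraph's topology; those problem-specific choices are exactly where the paper departs from this canonical design.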
This paper presents research on the magnetohydrodynamic (MHD) flow of an incompressible generalized Burgers' fluid induced by an accelerating plate and flowing under the action of a pressure gradient, where the no-slip assumption between the wall and the fluid is no longer valid. The fractional calculus approach is introduced to establish the constitutive relationship of the generalized Burgers' fluid. Using the discrete Laplace transform of the sequential fractional derivatives, closed-form solutions for the velocity and shear stress are obtained in terms of the Fox H-function for the following two problems: (i) flow due to a constant pressure gradient, and (ii) flow due to a sinusoidal pressure gradient. The solutions for…
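For orientation, the fractional constitutive relation of a generalized Burgers' fluid in unidirectional shear commonly takes the following form (the exact relation used in the paper may differ in notation):

```latex
\left(1 + \lambda_1 D_t^{\alpha} + \lambda_2 D_t^{2\alpha}\right) \tau
  = \mu \left(1 + \lambda_3 D_t^{\beta} + \lambda_4 D_t^{2\beta}\right)
    \frac{\partial u}{\partial y},
\qquad 0 < \alpha, \beta \le 1,
```

where $\tau$ is the shear stress, $u(y,t)$ the velocity, $\mu$ the dynamic viscosity, $\lambda_1,\dots,\lambda_4$ material constants, and $D_t^{\alpha}$ the fractional time derivative; setting $\alpha = \beta = 1$ recovers the ordinary generalized Burgers' model.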
MPEG-DASH is an adaptive bitrate streaming technology that divides video content into small HTTP-object file segments with different bitrates. Latency is the most important problem in live UHD video streaming. This paper creates a low-delay streaming system using HTTP/2. Based on the network condition, the proposed system adaptively determines the bitrate of segments. The video is coded using the layered H.265/HEVC compression standard and is then tested to investigate the relationship between video quality and bitrate for various HEVC parameters and video motion at each layer/resolution. The system architecture includes the encoder/decoder configurations and how to embed the adaptive video streaming. The encoder includes compression besi…
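The core of any throughput-based adaptation logic like the one described is simple: pick the highest encoded layer whose bitrate fits under the measured throughput, scaled by a safety margin. The ladder values and the 0.8 margin below are illustrative assumptions, not the paper's configuration.

```python
# Per-segment bitrate selection for adaptive streaming.
BITRATE_LADDER_KBPS = [1500, 4000, 8000, 16000]  # e.g. HEVC layers up to UHD

def select_bitrate(throughput_kbps, safety=0.8):
    """Return the highest ladder bitrate <= safety * measured throughput,
    falling back to the lowest layer when none fits."""
    budget = throughput_kbps * safety
    feasible = [b for b in BITRATE_LADDER_KBPS if b <= budget]
    return max(feasible) if feasible else BITRATE_LADDER_KBPS[0]
```

The safety margin absorbs throughput-estimate noise so the buffer does not drain mid-segment, which matters most in the low-latency live case the abstract targets.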