Recent years have seen an explosion of graph data from a variety of scientific, social, and technological fields. Among these fields, emotion recognition is an interesting research area because it finds many real-life applications, such as effective social robotics that increases a robot's interactivity with humans, driver-safety monitoring, and pain monitoring during surgery. This paper proposes a novel facial emotion recognition method based on graph mining that shifts the way the face region is represented: the face region is modeled as a graph of nodes and edges, and the gSpan frequent sub-graph mining algorithm is used to find the frequent sub-structures in the graph database of each emotion. An overlap-ratio metric is then used to reduce the number of generated sub-graphs. After encoding the final selected sub-graphs, binary classification is applied to classify the emotion of the queried input facial image using six levels of classification. Binary cat swarm optimization is applied within each level of classification to select the sub-graphs that give the highest accuracy at that level. Experiments were conducted on the Surrey Audio-Visual Expressed Emotion (SAVEE) database, where the final system accuracy was 90.00%. The results show a significant accuracy improvement (about 2%) by the proposed system in comparison to currently published works on the SAVEE database.
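This abstract does not include code; as a rough sketch of the overlap-ratio pruning step it describes, the snippet below models each mined sub-graph as a set of edges and greedily discards candidates that share too many edges with already-selected ones. The 0.5 threshold, the edge-set representation, and the greedy order are illustrative assumptions, not the authors' choices.

```python
# Illustrative sketch (not the authors' code): prune frequent sub-graphs
# by pairwise overlap ratio, with each sub-graph modeled as a set of edges.

def overlap_ratio(g1: frozenset, g2: frozenset) -> float:
    """Shared edges as a fraction of the smaller sub-graph."""
    return len(g1 & g2) / min(len(g1), len(g2))

def prune_subgraphs(subgraphs, max_overlap=0.5):
    """Greedily keep a sub-graph only if its overlap with every
    already-selected sub-graph stays at or below max_overlap."""
    selected = []
    for g in subgraphs:
        if all(overlap_ratio(g, s) <= max_overlap for s in selected):
            selected.append(g)
    return selected

# Hypothetical example: edges as (node_a, node_b) tuples.
candidates = [
    frozenset({(1, 2), (2, 3), (3, 4)}),
    frozenset({(1, 2), (2, 3), (3, 5)}),  # overlaps 2/3 with the first
    frozenset({(6, 7), (7, 8)}),
]
print(prune_subgraphs(candidates))  # keeps the first and third
```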
Experimental measurements of the viscosity and thermal conductivity of a single-layer graphene-based DI-water nanofluid were performed as a function of concentration (0.1–1 wt.%) and temperature (5–35 °C). The results reveal that the thermal conductivity of the GNP nanofluids increased with increasing nanoparticle weight fraction and temperature; the maximum enhancement was about 22% for a concentration of 1 wt.% at 35 °C. These experimental results were compared with several theoretical models, and good agreement between Nan's model and the experimental results was observed. The viscosity of the graphene nanofluid displays Newtonian and non-Newtonian behavior with respect to nanoparticle concentration.
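For reference, one widely cited form of Nan's effective-medium model for composites with randomly oriented inclusions, often applied to platelet fillers such as graphene, is (the notation below is ours, not necessarily the authors'):

\[
\frac{k_{\mathrm{eff}}}{k_f} = \frac{3 + \phi\left[\,2\beta_x(1 - L_x) + \beta_z(1 - L_z)\,\right]}{3 - \phi\left[\,2\beta_x L_x + \beta_z L_z\,\right]},
\qquad
\beta_i = \frac{k_p - k_f}{k_f + L_i\,(k_p - k_f)},
\]

where \(k_f\) and \(k_p\) are the base-fluid and particle thermal conductivities, \(\phi\) is the particle volume fraction, and \(L_x, L_z\) are geometry factors (for very thin platelets, \(L_x \to 0\) and \(L_z \to 1\)).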
Background: Polymeric composites have been widely used as dental restorative materials. A fundamental understanding of the behavior of these materials in the oral cavity is essential to improving their properties and performance. The goal of this study was to measure the water sorption of four composite resins containing different filler and resin-matrix contents. Materials and method: Resin composite specimens of a giomer (Beautifil II), Filtek™ P90, Filtek™ Z350 XT, and Tetric N Ceram were prepared in a cylindrical mould of 3 mm thickness and 6 mm diameter (n = 10) and light-cured. All specimens were placed in silica-gel desiccators at 37 °C for seven days until a constant weight was obtained. All samples were then immersed in deionized distilled water.
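The abstract is truncated before the analysis is described; for context, water sorption and solubility in such studies are commonly computed following the ISO 4049 convention (the symbols below follow that standard, not this abstract):

\[
W_{sp} = \frac{m_2 - m_3}{V}, \qquad W_{sl} = \frac{m_1 - m_3}{V},
\]

where \(m_1\) is the initial conditioned (dry) mass, \(m_2\) the mass after water immersion, \(m_3\) the reconditioned (re-dried) mass, and \(V\) the specimen volume; \(W_{sp}\) is water sorption and \(W_{sl}\) solubility, typically in µg/mm³.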
The first aim of this paper was to evaluate the push-out bond strength of the gutta-percha coating of Thermafil and GuttaCore and compare it with that of the gutta-percha used to coat an experimental hydroxyapatite/polyethylene (HA/PE) obturator. The second aim was to assess the thickness of gutta-percha around the carriers of GuttaCore and HA/PE obturators using micro-computed tomography (µCT).
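For context, push-out bond strength in such tests is typically computed as the maximum debonding force divided by the bonded interface area; for a cylindrical specimen slice this is commonly taken as (our notation, not necessarily the authors'):

\[
\sigma = \frac{F_{\max}}{\pi\, d\, h},
\]

where \(F_{\max}\) is the maximum load at debonding, \(d\) the diameter of the filling, and \(h\) the slice thickness.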
In this study, optical fibers were designed and implemented as a chemical sensor based on surface plasmon resonance (SPR) to estimate the age of the oil used in electrical transformers. The measurement relies on the refractive index of the oil, which changes as the oil ages. The sensor was created by embedding the center portion of the optical fiber in a resin block, then polishing and tapering it; the tapering time was 50 min. The multi-mode optical fiber was coated with a 60 nm-thick gold layer over a deposition length of 4 cm. The sensor's resonance wavelength was 415 nm. The primary sensor parameters were calculated, including sensitivity (6.25), signal-to-noise ratio (2.38), figure of merit (4.88), and accuracy (3.2).
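The abstract reports these figures without units; for reference, the definitions most commonly used in the SPR-sensor literature (they vary slightly between papers, so treat these as typical forms rather than the authors' exact ones) are:

\[
S = \frac{\Delta\lambda_{res}}{\Delta n}, \qquad
\mathrm{SNR} = \frac{\Delta\lambda_{res}}{\delta\lambda_{0.5}}, \qquad
\mathrm{FOM} = \frac{S}{\delta\lambda_{0.5}}, \qquad
\mathrm{DA} = \frac{1}{\delta\lambda_{0.5}},
\]

where \(\Delta\lambda_{res}\) is the resonance-wavelength shift produced by a refractive-index change \(\Delta n\) and \(\delta\lambda_{0.5}\) is the full width at half maximum of the resonance dip; \(S\) is then expressed in nm/RIU.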
Background: Measurement of hemoglobin A1c (A1C) is a well-established approach to gauging long-term glycemic control and contributes substantially to the quality of care in diabetic patients. The concept of targets is open to criticism; they may be unattainable, may limit what could be attained, and may be economically difficult to reach. However, without some form of targeted control of an asymptomatic condition, it becomes difficult to promote care at all. Objectives: The present article aims to address the most recent evidence-based global guidelines on A1C targets for glycemic control in Type 2 Diabetes Mellitus (T2D). Key messages: The rationale for A1C treatment targets includes evidence for microvascular and macrovascular complications
With the increasing integration of computers and smartphones into our daily lives, and the numerous benefits they offer over traditional paper-based ways of conducting affairs, it has become necessary to bring one of our most essential institutions into this integration: colleges. The traditional approach to conducting college affairs is mostly paper-based, which increases time and workload and is relatively decentralized. This project provides educational and management services for the university environment, targeting the staff, the student body, and the lecturers, on two of the most used platforms: smartphones and reliable web applications by clo
Confocal microscope imaging has become popular in biotechnology labs. Confocal imaging technology utilizes fluorescence optics, where laser light is focused onto a specific spot at a defined depth in the sample. A considerable number of images are produced regularly during research. These images require unbiased quantification methods to support meaningful analysis. Increasing efforts to tie reimbursement to outcomes will likely increase the need for objective data when analyzing confocal microscope images in the coming years. Visual quantification of confocal images with the naked human eye is an essential but often underreported outcome measure because of the time required for manual counting and e
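As a minimal illustration of the automated, unbiased quantification this abstract advocates (not the authors' pipeline), the sketch below thresholds a single-channel confocal image with Otsu's method and counts connected objects using scikit-image; the file name and minimum-object size are hypothetical example values.

```python
# Minimal sketch of automated object counting in a confocal image.
# Assumes a single-channel grayscale image; "cells.tif" and the
# 50-pixel minimum area are hypothetical example values.
from skimage import io, filters, measure, morphology

img = io.imread("cells.tif")                      # 2-D grayscale array
mask = img > filters.threshold_otsu(img)          # global Otsu threshold
mask = morphology.remove_small_objects(mask, 50)  # drop specks < 50 px
labels = measure.label(mask)                      # connected components
print(f"objects counted: {labels.max()}")

# Per-object measurements (area, mean intensity) for downstream stats
for region in measure.regionprops(labels, intensity_image=img):
    print(region.label, region.area, region.mean_intensity)
```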
Computer models are used in the study of electrocardiography to provide insight into physiological phenomena that are difficult to measure in the lab or in a clinical environment.
The electrocardiogram is an important tool for the clinician in that it changes characteristically in a number of pathological conditions, so many illnesses can be detected from this measurement. By simulating the electrical activity of the heart, one obtains a quantitative relationship between the electrocardiogram and different anomalies.
Because of the inhomogeneous fibrous structure of the heart and the irregular geometry of the body, the finite element method is used to study the electrical properties of the heart.
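A common mathematical starting point for such simulations is the standard bidomain formulation (the abstract does not state which model this work uses), a pair of coupled equations that the finite element method discretizes over the cardiac and torso geometry:

\[
\nabla\cdot(\sigma_i \nabla v) + \nabla\cdot(\sigma_i \nabla u_e) = \chi\left(C_m \frac{\partial v}{\partial t} + I_{ion}\right),
\qquad
\nabla\cdot\big((\sigma_i + \sigma_e)\nabla u_e\big) = -\,\nabla\cdot(\sigma_i \nabla v),
\]

where \(v\) is the transmembrane potential, \(u_e\) the extracellular potential, \(\sigma_i\) and \(\sigma_e\) the intra- and extracellular conductivity tensors, \(\chi\) the membrane surface-to-volume ratio, \(C_m\) the membrane capacitance per unit area, and \(I_{ion}\) the ionic current density.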
This work describes t