Title: Saliency Maps of Images of Facial Disfigurements from Head and Neck Cancer
Introduction: Head and neck cancer (HNC) and its treatment can result in facial disfigurement and in functional defects in speech, swallowing, and vision that persist after reconstructive surgery. Body image concerns are pervasive among HNC patients, and a large portion of these concerns stem from worries about social interaction. Our overarching goal is to develop normative interventions that inform HNC patients about how others will respond to the changes in their facial appearance. In this study, we investigated saliency map algorithms for highlighting regions of interest on a clinically disfigured face that are expected to draw an observer's eye based on low-level visual features such as color and intensity.
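The abstract does not name the specific saliency algorithm used, so the sketch below uses the classic spectral-residual method (Hou & Zhang, 2007) as one plausible bottom-up choice driven by intensity structure; the function name and parameters are ours, not the authors'.

```python
import numpy as np

def spectral_residual_saliency(gray, blur=3):
    """Bottom-up saliency via the spectral-residual method.

    gray: 2-D float array (grayscale image).
    Returns a saliency map normalized to [0, 1]."""
    f = np.fft.fft2(gray)
    log_amp = np.log1p(np.abs(f))   # log amplitude spectrum
    phase = np.angle(f)             # phase spectrum, kept as-is
    # Local average of the log amplitude via a blur x blur box filter.
    pad = blur // 2
    padded = np.pad(log_amp, pad, mode="edge")
    h, w = gray.shape
    smooth = sum(padded[i:i + h, j:j + w]
                 for i in range(blur) for j in range(blur)) / (blur * blur)
    residual = log_amp - smooth     # the "spectral residual"
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    lo, hi = sal.min(), sal.max()
    return (sal - lo) / (hi - lo + 1e-12)
```

In the original method the map is additionally Gaussian-smoothed before thresholding into salient regions; that post-processing step is omitted here for brevity.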
Award ID(s):
1757885
PAR ID:
10138552
Author(s) / Creator(s):
Date Published:
Journal Name:
2019 BMES Conference Proceedings - REU Abstract Accepted Poster
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    The rapid growth of facial recognition technology across ever more diverse contexts calls for a better understanding of how people feel about these deployments — whether they see value in them or are concerned about their privacy, and to what extent they have generally grown accustomed to them. We present a qualitative analysis of data gathered as part of a 10-day experience sampling study with 123 participants who were presented with realistic deployment scenarios of facial recognition as they went about their daily lives. Responses capturing their attitudes towards these deployments were collected both in situ and through daily evening surveys, in which participants were asked to reflect on their experiences and reactions. Ten follow-up interviews were conducted to further triangulate the data from the study. Our results highlight both the perceived benefits and concerns people express when faced with different facial recognition deployment scenarios. Participants reported concerns about the accuracy of the technology, including possible bias in its analysis, privacy concerns about the type of information being collected or inferred, and more generally, the dragnet effect resulting from the widespread deployment. Based on our findings, we discuss strategies and guidelines for informing the deployment of facial recognition, particularly focusing on ensuring that people are given adequate levels of transparency and control. 
  2. Currently, many critical care indices are repetitively assessed and recorded by overburdened nurses, e.g., physical function or the facial pain expressions of nonverbal patients. In addition, much essential information about patients and their environment is not captured at all, or is captured in a non-granular manner, e.g., sleep disturbance factors such as bright light, loud background noise, or excessive visitation. In this pilot study, we examined the feasibility of using pervasive sensing technology and artificial intelligence for autonomous and granular monitoring of critically ill patients and their environment in the Intensive Care Unit (ICU). As an exemplar prevalent condition, we also characterized delirious and non-delirious patients and their environments. We used wearable sensors, light and sound sensors, and a high-resolution camera to collect data on patients and their environment. We analyzed the collected data using deep learning and statistical analysis. Our system performed face detection, face recognition, facial action unit detection, head pose detection, facial expression recognition, posture recognition, actigraphy analysis, sound pressure and light level detection, and visitation frequency detection. We were able to detect patients' faces (mean average precision (mAP) = 0.94), recognize patients' faces (mAP = 0.80), and recognize their postures (F1 = 0.94). We also found that all facial expressions, 11 activity features, visitation frequency during the day, visitation frequency during the night, light levels, and sound pressure levels during the night differed significantly between delirious and non-delirious patients (p < 0.05). In summary, we showed that granular and autonomous monitoring of critically ill patients and their environment is feasible and can be used to characterize critical care conditions and related environmental factors.
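The abstract above reports detection quality as mAP and F1; for readers less familiar with F1, it is the harmonic mean of precision and recall for a given class. A minimal single-class computation (toy labels, function name our own):

```python
def f1_score(y_true, y_pred, positive=1):
    """Precision, recall, and F1 for one class.

    Counts true positives, false positives, and false negatives
    against the chosen positive label, then combines them."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1
```

For example, with 2 true positives, 1 false positive, and 1 false negative, precision and recall are both 2/3 and F1 is also 2/3; a reported F1 of 0.94 therefore implies both precision and recall are high.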
  3.
    Facial micro-expressions are spontaneous, subtle, involuntary muscle movements that occur briefly on the face. Spotting and recognizing these expressions is difficult because the behavior is subtle and each expression lasts only about half a second, which makes them hard for humans to identify. Micro-expressions have many applications in daily life, such as online learning, game playing, lie detection, and therapy sessions. Traditionally, researchers use RGB images/videos to spot and classify micro-expressions, which poses challenges such as illumination variation, privacy concerns, and pose variation. The use of depth videos solves these issues to some extent, as depth videos are not susceptible to variation in illumination. This paper describes the collection of a first RGB-D dataset for the classification of facial micro-expressions into the 6 universal expressions: Anger, Happy, Sad, Fear, Disgust, and Surprise. The paper compares RGB and depth videos for the classification of facial micro-expressions. Further, the comparison shows that depth videos alone can be used to classify facial micro-expressions correctly in a decision tree structure, using both traditional and deep learning approaches, with good classification accuracy. The dataset will be released to the public in the near future.
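The abstract does not specify the paper's actual decision tree or its depth features; purely as an illustration of "classification in a decision tree structure," a toy two-level tree over invented depth-motion features might look like this (feature names, thresholds, and the four-class subset are all hypothetical):

```python
def classify_micro_expression(brow_motion, mouth_motion):
    """Toy two-level decision tree over depth-motion features.

    brow_motion, mouth_motion: invented per-clip features, e.g. mean
    absolute depth change (mm) in the brow and mouth regions.
    Thresholds are illustrative, not from the paper."""
    if brow_motion > 0.8:          # strong brow movement
        return "Surprise" if mouth_motion > 0.5 else "Anger"
    else:                          # brow mostly still
        return "Happy" if mouth_motion > 0.5 else "Sad"
```

A real tree would be learned from labeled depth clips (e.g. by recursively choosing the feature/threshold split that best separates the six classes) rather than hand-written.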
  4. Abstract Concerns about severe acute respiratory syndrome coronavirus 2 exposure in health care settings may cause patients to delay care. Among 2992 patients testing negative on admission to an academic, 3-hospital system, 8 tested positive during hospitalization or within 14 days postdischarge. Following adjudication of each instance, health care–associated infection incidence ranged from 0.8 to 5.0 cases per 10 000 patient-days. 
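The incidence range above (0.8 to 5.0 cases per 10 000 patient-days) is a simple rate; a one-line helper (name and example numbers our own) shows the arithmetic:

```python
def incidence_per_10k_patient_days(cases, patient_days):
    """Infection incidence expressed per 10,000 patient-days."""
    return cases / patient_days * 10_000
```

For example, 8 adjudicated cases over 100,000 patient-days would give 0.8 (the denominator here is illustrative); the reported range reflects how many of the 8 positives were judged truly health care-associated after adjudication.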
  5. Ruthenium has emerged as a promising substitute for platinum toward the hydrogen evolution/oxidation reaction (HER/HOR). Herein, ruthenium/carbon composites are prepared by magnetic induction heating (300 A, 10 s) of RuCl3, RuBr3, or RuI3 loaded on hollow N‐doped carbon cages (HNC). The HNC‐RuCl3‐300A sample consists of Ru nanoparticles (dia. 1.96 nm) and abundant Cl residues. HNC‐RuBr3‐300A possesses a larger nanoparticle size (≈19.36 nm) and a lower content of Br residues. HNC‐RuI3‐300A contains only bulk‐like Ru agglomerates with a minimal amount of I residues, due to reduced Ru‐halide bonding interactions. Among these, HNC‐RuCl3‐300A exhibits the best HER activity in alkaline media, with a low overpotential of only −26 mV to reach 10 mA cm−2, even outperforming Pt/C, and can be used as the cathode catalyst in an anion exchange membrane water electrolyzer (with commercial RuO2 as the anode catalyst), producing 0.5 A cm−2 at 1.88 V for up to 100 h, a performance markedly better than that with Pt/C. HNC‐RuCl3‐300A also exhibits the best HOR activity, with a half‐wave potential (+18 mV) even lower than that of Pt/C (+35 mV). These activities are ascribed to the combined contributions of small Ru nanoparticles and Ru‐to‐halide charge transfer that weaken H adsorption.