

Title: Influence of Environmental Context on Recognition Rates of Stylized Walking Sequences
Affective movement is likely to be an important component of robotic interaction as more robots move into human-facing scenarios, where humans (consciously or unconsciously) constantly monitor the motion profiles of their counterparts in order to make judgments about those counterparts' states. Many current studies in affective movement recognition and generation seek either to increase a machine's ability to correctly identify human affect or to identify and create components of robotic movement that enhance human perception. However, very few of these studies investigate the influence of environmental context on a machine's ability to correctly identify human affect or on a human's ability to correctly identify the affective intent of a robot. This paper presents the results of a user study that investigated how human perception of stylized walking sequences (created in [1]) varied based on the environment where they were portrayed. The results show that environmental context can impact a person's ability to correctly perceive the intended style of a movement.
Award ID(s): 1701295
PAR ID: 10080429
Journal Name: Social Robotics. ICSR 2017. Lecture Notes in Computer Science
Volume: 10652
Sponsoring Org: National Science Foundation
More Like This
  1. “Sickness behavior” is an orchestrated suite of symptoms that commonly occur in the context of inflammation, and is characterized by changes in affect, social experience, and behavior. However, recent evidence suggests that inflammation may not always produce the same set of sickness behaviors (e.g., fatigue, anhedonia, and social withdrawal). Rather, inflammation may be linked with different behavior across contexts and/or across individuals, though research in this area is underdeveloped to date. In the present study (n = 30), we evaluated the influence of affective context and individual differences in difficulty detecting bodily sensations (i.e., interoceptive difficulty) on social perception following an inflammatory challenge. Inflammation was induced using the influenza vaccine and inflammatory reactivity was operationalized as changes in circulating levels of interleukin-6 (IL-6) before the vaccine and approximately 24 h later. Twenty-four hours after administration of the influenza vaccine, we manipulated affective context using a well-validated affect misattribution task in which participants made trustworthiness judgments of individuals with neutral facial expressions following the rapid presentation of “prime” images that were positive or negative in affective content. Interoceptive difficulty was measured at baseline using a validated self-report measure. Results revealed significant interactions between inflammatory reactivity to the influenza vaccine and affective context on social perception. Specifically, individuals with greater inflammatory reactivity were more biased by affective context when judging the trustworthiness of neutral faces. In addition, interoceptive difficulty and affective context interacted to predict social perception such that individuals with greater interoceptive difficulty were more biased by affective context in these judgments.
In sum, we provide some of the first evidence that inflammation may amplify the saliency of affective cues during social decision making. Our findings also replicate prior work linking interoceptive ability to the use of affect-as-information during social perception, but in the novel context of inflammation. 
  2.
    In the field of affective computing, emotional annotations are highly important for both the recognition and synthesis of human emotions. Researchers must ensure that these emotional labels are adequate for modeling general human perception. An unavoidable part of obtaining such labels is that human annotators are exposed to known and unknown stimuli before and during the annotation process that can affect their perception. Emotional stimuli cause an affective priming effect, which is a pre-conscious phenomenon in which previous emotional stimuli affect the emotional perception of a current target stimulus. In this paper, we use sequences of emotional annotations during a perceptual evaluation to study the effect of affective priming on emotional ratings of speech. We observe that previous sentences with extreme emotional content push annotations of current samples toward the same extreme. We create a sentence-level bias metric to study the effect of affective priming on speech emotion recognition (SER) modeling. The metric is used to identify subsets in the database with more affective priming bias, intentionally creating biased datasets. We train and test SER models using the full and biased datasets. Our results show that although the biased datasets have low inter-evaluator agreements, SER models for arousal and dominance trained with those datasets perform the best. For valence, the models trained with the less-biased datasets perform the best.
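The abstract above does not give the formula for its sentence-level bias metric, so the following is only an illustrative sketch. It assumes one plausible definition: a sentence is "priming-biased" when its mean rating deviates from the corpus mean in the same direction as the extremity of the sentence that preceded it in the annotation sequence. The function name `priming_bias` and the data layout are hypothetical, not from the paper.

```python
# Hypothetical sketch of a sentence-level priming-bias score (not the
# authors' published metric). Assumes ratings on a single numeric scale
# (e.g., 1-7 arousal) and a known presentation order for annotators.
from statistics import mean

def priming_bias(sequence, ratings_by_sentence):
    """sequence: sentence ids in the order annotators heard them.
    ratings_by_sentence: id -> list of ratings from all annotators.
    Returns id -> bias score; positive means the sentence's rating was
    pushed toward the preceding sentence's extreme."""
    global_mean = mean(r for rs in ratings_by_sentence.values() for r in rs)
    biases = {}
    for prev_id, cur_id in zip(sequence, sequence[1:]):
        # How far the prime sits from the corpus mean...
        prime_extremity = mean(ratings_by_sentence[prev_id]) - global_mean
        # ...and how far the current sentence was rated from that mean.
        cur_deviation = mean(ratings_by_sentence[cur_id]) - global_mean
        # Same sign => deviation agrees with the prime's direction.
        biases[cur_id] = prime_extremity * cur_deviation
    return biases
```

Under this reading, the sentences with the largest positive scores would form the deliberately "biased" subsets the abstract mentions training on.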
  3. Although early concepts of risk perception measures distinguished cognitive from affective items, until recently multi-dimensional taxonomies were absent from risk perception studies, and even more from tests of their association with behavior or policy support. Six longitudinal panel surveys on U.S. COVID-19 views (n = 2004; fielded from February 2020 through April 2021) allowed testing of these relationships among ≤ 10 risk perception items measured in each wave. Confirmatory factor analyses revealed consistent distinctions between personal (conditioning perceived risk on taking further or no further protective action), collective (U.S., global), affective (concern, dread), and severity (estimates of eventual total U.S. infections and deaths) measures, while affect (good-bad feelings) and duration (how long people expect the outbreak to last) did not fit with their assumed affective and severity (respectively) parallels. Collective and affective/affect risk perceptions most strongly predicted both behavioral intentions and policy support for mask wearing, avoidance of large public gatherings, and vaccination, controlling for personal risk perception (which might be partly reflected in the affective/affect effects) and other measures. These findings underline the importance of multi-dimensionality (e.g. not just asking about personal risk perceptions) in designing risk perception research, even when trying to explain personal protective actions.
  4. The Spatial Audio Data Immersive Experience (SADIE) project aims to identify new foundational relationships pertaining to human spatial aural perception, and to validate existing relationships. Our infrastructure consists of an intuitive interaction interface, an immersive exocentric sonification environment, and a layer-based amplitude-panning algorithm. Here we highlight the system's unique capabilities and provide findings from an initial externally funded study that focuses on the assessment of human aural spatial perception capacity. When compared to the existing body of literature focusing on egocentric spatial perception, our data show that an immersive exocentric environment enhances spatial perception, and that the physical implementation using high density loudspeaker arrays enables significantly improved spatial perception accuracy relative to the egocentric and virtual binaural approaches. The preliminary observations suggest that human spatial aural perception capacity in real-world-like immersive exocentric environments that allow for head and body movement is significantly greater than in egocentric scenarios where head and body movement is restricted. Therefore, in the design of immersive auditory displays, the use of immersive exocentric environments is advised. Further, our data identify a significant gap between physical and virtual human spatial aural perception accuracy, which suggests that further development of virtual aural immersion may be necessary before such an approach may be seen as a viable alternative.
  5. Agents must monitor their partners' affective states continuously in order to understand and engage in social interactions. However, methods for evaluating affect recognition do not account for changes in classification performance that may occur during occlusions or transitions between affective states. This paper addresses temporal patterns in affect classification performance in the context of an infant-robot interaction, where infants’ affective states contribute to their ability to participate in a therapeutic leg movement activity. To support robustness to facial occlusions in video recordings, we trained infant affect recognition classifiers using both facial and body features. Next, we conducted an in-depth analysis of our best-performing models to evaluate how performance changed over time as the models encountered missing data and changing infant affect. During time windows when features were extracted with high confidence, a unimodal model trained on facial features achieved the same optimal performance as multimodal models trained on both facial and body features. However, multimodal models outperformed unimodal models when evaluated on the entire dataset. Additionally, model performance was weakest when predicting an affective state transition and improved after multiple predictions of the same affective state. These findings emphasize the benefits of incorporating body features in continuous affect recognition for infants. Our work highlights the importance of evaluating variability in model performance both over time and in the presence of missing data when applying affect recognition to social interactions. 
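The early fusion of facial and body features described in the abstract above can be sketched minimally. This is not the authors' model: the abstract does not specify the classifier, so a nearest-centroid classifier stands in here, and the helper names (`fuse`, `train_centroids`, `predict`) are hypothetical.

```python
# Illustrative sketch of multimodal early fusion for affect recognition:
# facial and body feature vectors are concatenated per sample, then a
# simple nearest-centroid classifier (a stand-in for the study's actual
# model) assigns the affect label of the closest class centroid.
import math
from collections import defaultdict

def fuse(face_feats, body_feats):
    """Concatenate per-sample facial and body feature vectors."""
    return list(face_feats) + list(body_feats)

def train_centroids(samples):
    """samples: list of (fused_feature_vector, affect_label).
    Returns label -> mean feature vector for that label."""
    sums, counts = defaultdict(lambda: None), defaultdict(int)
    for vec, label in samples:
        sums[label] = (list(vec) if sums[label] is None
                       else [s + v for s, v in zip(sums[label], vec)])
        counts[label] += 1
    return {lab: [v / counts[lab] for v in vec_sum]
            for lab, vec_sum in sums.items()}

def predict(centroids, vec):
    """Label of the centroid nearest (Euclidean) to the fused vector."""
    return min(centroids, key=lambda lab: math.dist(centroids[lab], vec))
```

When the body features of an occluded frame are missing, a fallback of this kind could fill them with the class-agnostic mean before fusion, which mirrors the abstract's point that body features matter most when facial features are extracted with low confidence.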