Abstract

The enhancement hypothesis suggests that deaf individuals are more vigilant to visual emotional cues than hearing individuals. The present eye-tracking study examined ambient–focal visual attention when encoding affect from dynamically changing emotional facial expressions. Deaf (n = 17) and hearing (n = 17) individuals watched emotional facial expressions that in 10-s animations morphed from a neutral expression to one of happiness, sadness, or anger. The task was to recognize emotion as quickly as possible. Deaf participants tended to be faster than hearing participants in affect recognition, but the groups did not differ in accuracy. In general, happy faces were more accurately and more quickly recognized than faces expressing anger or sadness. Both groups demonstrated longer average fixation duration when recognizing happiness in comparison to anger and sadness. Deaf individuals directed their first fixations less often to the mouth region than the hearing group. During the last stages of emotion recognition, deaf participants exhibited more focal viewing of happy faces than negative faces. This pattern was not observed among hearing individuals. The analysis of visual gaze dynamics, switching between ambient and focal attention, was useful in studying the depth of cognitive processing of emotional information among deaf and hearing individuals.
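The ambient–focal distinction described above is commonly operationalized as a per-fixation index that contrasts standardized fixation duration against the standardized amplitude of the outgoing saccade (positive values indicate focal viewing, negative values ambient scanning). A minimal NumPy sketch follows; the function name and the sample scanpath data are illustrative, not taken from this study.

```python
import numpy as np

def coefficient_k_series(fix_durations, saccade_amplitudes):
    """Per-fixation ambient-focal index.

    Each fixation i is paired with its outgoing saccade. Positive values
    (long fixation, short saccade) suggest focal processing; negative
    values (short fixation, long saccade) suggest ambient scanning.
    """
    d = np.asarray(fix_durations, dtype=float)
    a = np.asarray(saccade_amplitudes, dtype=float)
    zd = (d - d.mean()) / d.std()   # z-scored fixation durations
    za = (a - a.mean()) / a.std()   # z-scored saccade amplitudes
    return zd - za

# Illustrative scanpath: ambient scanning first (short fixations, large
# saccades), then focal inspection (long fixations, small saccades).
durations = [100, 120, 110, 400, 420, 410]    # ms
amplitudes = [5.0, 4.5, 5.5, 0.8, 1.0, 0.9]   # degrees of visual angle
k = coefficient_k_series(durations, amplitudes)
# Early window is ambient (mean K < 0); late window is focal (mean K > 0).
```

Because the index is a difference of z-scores, its mean over the whole standardization window is zero by construction; the informative quantity is its time course, e.g. early versus late windows of a trial.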
Attention Drives Emotion: Voluntary Visual Attention Increases Perceived Emotional Intensity
Attention and emotion are fundamental psychological systems. It is well established that emotion intensifies attention. Three experiments reported here (N = 235) demonstrated the reverse causal direction: voluntary visual attention intensifies perceived emotion. In Experiment 1, participants repeatedly directed attention toward a target object during sequential search. Participants subsequently perceived their emotional reactions to target objects as more intense than their reactions to control objects. Experiments 2 and 3 used a spatial-cuing procedure to manipulate voluntary visual attention. Spatially cued attention increased perceived emotional intensity. Participants perceived spatially cued objects as more emotionally intense than noncued objects even when participants were asked to mentally rehearse the name of noncued objects. This suggests that the intensifying effect of attention is independent of more extensive mental rehearsal. Across experiments, attended objects were perceived as more visually distinctive, which statistically mediated the effects of attention on emotional intensity.
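The final claim is a statistical mediation: attention raises perceived visual distinctiveness, which in turn raises perceived emotional intensity. A standard way to express this is the product-of-coefficients model (indirect effect = a × b). The NumPy sketch below illustrates the computation on synthetic, fully mediated data; it is not the authors' analysis pipeline, and the variable names are assumptions.

```python
import numpy as np

def simple_mediation(x, m, y):
    """Product-of-coefficients mediation: x -> m -> y.

    a        : slope of the mediator m regressed on x
    b        : slope of y on m, controlling for x
    direct   : effect of x on y, controlling for m (c')
    indirect : a * b
    """
    x, m, y = (np.asarray(v, dtype=float) for v in (x, m, y))
    ones = np.ones_like(x)
    # Regress m on x to get the a path.
    a = np.linalg.lstsq(np.column_stack([ones, x]), m, rcond=None)[0][1]
    # Regress y on x and m jointly to get c' and the b path.
    coef = np.linalg.lstsq(np.column_stack([ones, x, m]), y, rcond=None)[0]
    return {"a": a, "b": coef[2], "direct": coef[1], "indirect": a * coef[2]}

# Synthetic, fully mediated data: x (attention) raises m (distinctiveness),
# which raises y (rated intensity); there is no direct x -> y path.
x = [1.0, 2.0, 3.0, 4.0]
m = [3.0, 3.0, 5.0, 9.0]    # 2*x plus variation orthogonal to x
y = [9.0, 9.0, 15.0, 27.0]  # exactly 3*m
result = simple_mediation(x, m, y)
```

With these data the indirect effect is a × b = 2 × 3 = 6 and the direct effect is zero, i.e. complete mediation; in practice the indirect effect would also need an inferential test (e.g. bootstrapping).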
- Award ID(s): 1730611
- PAR ID: 10549242
- Publisher / Repository: SAGE Publications
- Date Published:
- Journal Name: Psychological Science
- Volume: 30
- Issue: 6
- ISSN: 0956-7976
- Format(s): Medium: X
- Size(s): p. 942-954
- Sponsoring Org: National Science Foundation
More Like this
- 
Attention is the ability to focus one's awareness on relevant events and objects while ignoring distracting ones. Laboratory studies of top-down voluntary attention commonly use predictive or instructional cues to direct attention. However, in real world scenarios, voluntary attention is not necessarily externally cued, but may be focused by internal, self-generated processes. The voluntary focusing of attention in the absence of external guidance has been referred to as "willed attention," a term borrowed from the literature on willed motor actions. In a fashion similar to studies of willed (self-initiated) actions, during willed attention, participants are given the freedom to deploy attention based on their own free choices. Electrophysiological studies have shown that during willed attention, ongoing neural activity biases willed attention decisions on a moment-to-moment basis as reflected in transient patterns of brain electrical activity that predict where participants will later choose to focus their attention. Brain imaging studies have revealed that compared to cued attention, willed attention involves additional frontal cortical structures, which interact with the classic attentional control networks of the human brain to produce a modified network organization for willed attention control. In this introduction to willed attention, we briefly review the fields of voluntary attention and self-initiated motor actions, in order to describe willed attention and its neural correlates as they relate to the broader concepts of attention and volition.
- 
Despite significant vision loss, humans can still recognize various emotional stimuli through hearing and express diverse emotional responses, which can be sorted into two dimensions, arousal and valence. Yet most research has focused on sighted people, leading to a lack of knowledge about the emotion perception mechanisms of people with visual impairment. This study aims to advance knowledge of the degree to which people with visual impairment perceive various emotions, i.e., high/low arousal and positive/negative emotions. A total of 30 individuals with visual impairment participated in interviews in which they listened to stories of people who became visually impaired and who encountered and overcame various challenges, and they were instructed to share their emotions. Participants perceived different kinds and intensities of emotions depending on demographic variables such as living alone, loneliness, onset of visual impairment, visual acuity, race/ethnicity, and employment status. This advanced knowledge of emotion perception in people with visual impairment is anticipated to contribute toward better designing social supports that can adequately accommodate those with visual impairment.
- 
This paper demonstrates the utility of ambient-focal attention and pupil dilation dynamics to describe visual processing of emotional facial expressions. Pupil dilation and focal eye movements reflect deeper cognitive processing and thus shed more light on the dynamics of emotional expression recognition. Socially anxious individuals (N = 24) and non-anxious controls (N = 24) were asked to recognize emotional facial expressions that gradually morphed from a neutral expression to one of happiness, sadness, or anger in 10-sec animations. Anxious cohorts exhibited more ambient face scanning than their non-anxious counterparts. We observed a positive relationship between focal fixations and pupil dilation, indicating deeper processing of viewed faces, but only by non-anxious participants, and only during the last phase of emotion recognition. Group differences in the dynamics of ambient-focal attention support the hypothesis of vigilance to emotional expression processing by socially anxious individuals. We discuss the results by referring to current literature on cognitive psychopathology.
- 
In models of visual spatial attention control, it is commonly held that top–down control signals originate in the dorsal attention network, propagating to the visual cortex to modulate baseline neural activity and bias sensory processing. However, the precise distribution of these top–down influences across different levels of the visual hierarchy is debated. In addition, it is unclear whether these baseline neural activity changes translate into improved performance. We analyzed attention-related baseline activity during the anticipatory period of a voluntary spatial attention task, using two independent functional magnetic resonance imaging datasets and two analytic approaches. First, as in prior studies, univariate analysis showed that covert attention significantly enhanced baseline neural activity in higher-order visual areas contralateral to the attended visual hemifield, while effects in lower-order visual areas (e.g., V1) were weaker and more variable. Second, in contrast, multivariate pattern analysis (MVPA) revealed significant decoding of attention conditions across all visual cortical areas, with lower-order visual areas exhibiting higher decoding accuracies than higher-order areas. Third, decoding accuracy, rather than the magnitude of univariate activation, was a better predictor of a subject's stimulus discrimination performance. Finally, the MVPA results were replicated across two experimental conditions, where the direction of spatial attention was either externally instructed by a cue or based on the participants' free choice decision about where to attend. Together, these findings offer new insights into the extent of attentional biases in the visual hierarchy under top–down control and how these biases influence both sensory processing and behavioral performance.
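The MVPA decoding described above amounts to cross-validated classification of the attention condition from multi-voxel activity patterns. One common simple decoder in fMRI work is a nearest-centroid classifier with leave-one-out cross-validation; the sketch below illustrates that scheme on synthetic two-voxel patterns and is not the study's actual pipeline.

```python
import numpy as np

def loo_nearest_centroid_accuracy(patterns, labels):
    """Leave-one-out decoding accuracy with a nearest-centroid classifier.

    patterns : (n_trials, n_voxels) array of activity patterns
    labels   : (n_trials,) condition labels (e.g., attend-left vs. attend-right)
    """
    patterns = np.asarray(patterns, dtype=float)
    labels = np.asarray(labels)
    correct = 0
    for i in range(len(labels)):
        train = np.ones(len(labels), dtype=bool)
        train[i] = False  # hold out trial i
        # Condition centroids computed without the held-out trial.
        centroids = {c: patterns[train & (labels == c)].mean(axis=0)
                     for c in np.unique(labels[train])}
        # Predict the condition whose centroid is nearest in Euclidean distance.
        pred = min(centroids, key=lambda c: np.linalg.norm(patterns[i] - centroids[c]))
        correct += int(pred == labels[i])
    return correct / len(labels)

# Synthetic, well-separated patterns: two conditions, three trials each.
patterns = [[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
            [1.0, 1.0], [0.9, 1.0], [1.0, 0.9]]
labels = [0, 0, 0, 1, 1, 1]   # 0 = attend-left, 1 = attend-right (illustrative)
acc = loo_nearest_centroid_accuracy(patterns, labels)
```

Above-chance cross-validated accuracy in a region is then taken as evidence that its activity patterns carry information about the attended location, independent of any univariate gain in mean activation.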