

Title: The real‐time effects of parent speech on infants' multimodal attention and dyadic coordination
Abstract

Parental responsiveness to infant behaviors is a strong predictor of infants' language and cognitive outcomes. The mechanisms underlying this effect, however, are relatively unknown. We examined the effects of parent speech on infants' visual attention, manual actions, hand‐eye coordination, and dyadic joint attention during parent‐infant free play. We report on two studies that used head‐mounted eye trackers in increasingly naturalistic laboratory environments. In Study 1, 12‐to‐24‐month‐old infants and their parents played on the floor of a seminaturalistic environment with 24 toys. In Study 2, a different sample of dyads played in a home‐like laboratory with 10 toys and no restrictions on their movement. In both studies, we present evidence that responsive parent speech extends the duration of infants' multimodal attention. This social “boost” of parent speech impacts multiple behaviors that have been linked to later outcomes—visual attention, manual actions, hand‐eye coordination, and joint attention. Further, the amount that parents talked during the interaction was negatively related to the effects of parent speech on infant attention. Together, these results provide evidence of a trade‐off between quantity of speech and its effects, suggesting multiple pathways through which parents impact infants' multimodal attention to shape the moment‐by‐moment dynamics of an interaction.

 
NSF-PAR ID: 10443945
Publisher / Repository: Wiley-Blackwell
Journal Name: Infancy
Volume: 27
Issue: 6
ISSN: 1525-0008
Page Range / eLocation ID: p. 1154-1178
Sponsoring Org: National Science Foundation
More Like this
  1. Abstract

Most research on early language learning focuses on the objects that infants see and the words they hear in their daily lives, although growing evidence suggests that motor development is also closely tied to language development. To study the real‐time behaviors required for learning new words during free‐flowing toy play, we measured infants’ visual attention and manual actions on to‐be‐learned toys. Parents and 12‐to‐26‐month‐old infants wore wireless head‐mounted eye trackers, allowing them to move freely around a home‐like lab environment. After the play session, infants were tested on their knowledge of object‐label mappings. We found that how often parents named objects during play did not predict learning; instead, it was infants’ attention during and around a labeling utterance that predicted whether an object‐label mapping was learned. More specifically, we found that infant visual attention alone did not predict word learning. Instead, coordinated, multimodal attention (when infants’ hands and eyes were attending to the same object) predicted word learning. Our results implicate a causal pathway through which infants’ bodily actions play a critical role in early word learning.

     
  2. Abstract

    The present article investigated the composition of different joint gaze components used to operationalize various types of coordinated attention between parents and infants and which types of coordinated attention were associated with future vocabulary size. Twenty‐five 9‐month‐old infants and their parents wore head‐mounted eye trackers as they played with objects together. With high‐density gaze data, a variety of coordinated attention bout types were quantitatively measured by combining different gaze components, such as mutual gaze, joint object looks, face looks, and triadic gaze patterns. The key components of coordinated attention that were associated with vocabulary size at 12 and 15 months included the simultaneous combination of parent triadic gaze and infant object looking. The results from this article are discussed in terms of the importance of parent attentional monitoring and infant sustained attention for language development.

     
  3. Abstract Highlights

    We describe a novel, theoretically based model of infant social motivation wherein multiple parent‐reported indicators contribute to a unitary latent social‐motivation factor.

    Analyses revealed social‐motivation factor scores exhibited measurement invariance for a longitudinal sample of infants at high and low familial ASD likelihood.

Social‐motivation growth from ages 6–12 months is associated with better 12–15‐month joint attention abilities, which in turn are associated with greater 24‐month language skills.

    Findings inform timing and targets of potential interventions to support healthy social communication in the first year of life.

     
  4. Abstract Research Highlights

    In parent‐infant interaction, parents’ referential intentions are sometimes clear and sometimes unclear; likewise, parents’ pronunciation is sometimes clear and sometimes quite difficult to understand.

    We find that clearer referential instances go along with clearer phonetic instances, more so than expected by chance.

    Thus, there are globally valuable instances (“gems”) from which children could learn about words’ pronunciations and words’ meanings at the same time.

    Homing in on clear phonetic instances and filtering out less‐clear ones would help children identify these multimodal “gems” during word learning.

     
  5. Abstract

    Object names are a major component of early vocabularies and learning object names depends on being able to visually recognize objects in the world. However, the fundamental visual challenge of the moment‐to‐moment variations in object appearances that learners must resolve has received little attention in word learning research. Here we provide the first evidence that image‐level object variability matters and may be the link that connects infant object manipulation to vocabulary development. Using head‐mounted eye tracking, the present study objectively measured individual differences in the moment‐to‐moment variability of visual instances of the same object, from infants’ first‐person views. Infants who generated more variable visual object images through manual object manipulation at 15 months of age experienced greater vocabulary growth over the next six months. Elucidating infants’ everyday visual experiences with objects may constitute a crucial missing link in our understanding of the developmental trajectory of object name learning.

     