The Music of Silence: Part II: Music Listening Induces Imagery Responses
During music listening, humans routinely acquire the regularities of acoustic sequences and use them to anticipate and interpret the ongoing melody. Specifically, in line with this predictive framework, brain responses during such listening are thought to reflect a comparison between bottom-up sensory responses and top-down prediction signals generated by an internal model that embodies the listener's musical exposure and expectations. To attain a clear view of these predictive responses, previous work eliminated the sensory input by inserting artificial silences (or sound omissions) that leave behind only the corresponding predictions of the thwarted expectations. Here, we demonstrate a new alternative approach in which we decode the predictive electroencephalography (EEG) responses to the silent intervals that are naturally interspersed within the music. We did this as participants (experiment 1, 20 participants, 10 female; experiment 2, 21 participants, 6 female) listened to or imagined Bach piano melodies. Prediction signals were quantified and assessed via a computational model of the melodic structure of the music and were shown to exhibit the same response characteristics whether measured during listening or imagining. These include an inverted polarity for both silence and imagined responses relative to listening, as well as response-magnitude modulations that precisely reflect the expectations of notes …
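The abstract describes quantifying prediction signals via a computational model of melodic structure, where unexpected notes yield larger prediction responses. As a minimal illustrative sketch only — not the authors' actual model, which is a richer statistical model of melodic expectation — note-level surprisal can be estimated from a first-order transition model over MIDI pitches:

```python
import math
from collections import defaultdict

def train_transitions(melodies):
    """Count pitch-to-pitch transitions across a corpus of melodies."""
    counts = defaultdict(lambda: defaultdict(int))
    for melody in melodies:
        for prev, nxt in zip(melody, melody[1:]):
            counts[prev][nxt] += 1
    return counts

def surprisal(counts, melody, alpha=1.0, vocab_size=128):
    """Per-note surprisal -log2 P(note | previous note), with add-alpha
    smoothing over the 128 MIDI pitches. Higher surprisal = less expected
    note; such values are the kind of regressor related to the magnitude
    of neural prediction responses."""
    values = []
    for prev, nxt in zip(melody, melody[1:]):
        total = sum(counts[prev].values()) + alpha * vocab_size
        p = (counts[prev][nxt] + alpha) / total
        values.append(-math.log2(p))
    return values

# Toy corpus of MIDI pitch sequences (hypothetical data, not from the study)
corpus = [[60, 62, 64, 62, 60], [60, 62, 64, 65, 64]]
model = train_transitions(corpus)
print(surprisal(model, [60, 62, 64, 62]))
```

A transition seen often in the corpus (e.g., 60 to 62 above) yields lower surprisal than an unseen one; in the actual study, such expectation values are computed per note and regressed against the EEG.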
- Award ID(s): 1824198
- Publication Date:
- NSF-PAR ID: 10309543
- Journal Name: The Journal of Neuroscience
- Volume: 41
- Issue: 35
- ISSN: 0270-6474
- Sponsoring Org: National Science Foundation