

Title: Integration of event experiences to build relational knowledge in the human brain
Abstract

We investigated how the human brain integrates experiences of specific events to build general knowledge about typical event structure. We examined an episodic memory area important for temporal relations, anterior-lateral entorhinal cortex, and a semantic memory area important for action concepts, middle temporal gyrus, to understand how and when these areas contribute to these processes. Participants underwent functional magnetic resonance imaging while learning and recalling temporal relations among novel events over two sessions 1 week apart. Across distinct contexts, individual temporal relations among events could either be consistent or inconsistent with each other. Within each context, during the recall phase, we measured associative coding as the difference of multivoxel correlations among related vs unrelated pairs of events. Neural regions that form integrative representations should exhibit stronger associative coding in the consistent than the inconsistent contexts. We found evidence of integrative representations that emerged quickly in anterior-lateral entorhinal cortex (at session 1), and only subsequently in middle temporal gyrus, which showed a significant change across sessions. A complementary pattern of findings was seen with signatures during learning. This suggests that integrative representations are established early in anterior-lateral entorhinal cortex and may be a pathway to the later emergence of semantic knowledge in middle temporal gyrus.

 
Award ID(s):
2022685
NSF-PAR ID:
10434898
Author(s) / Creator(s):
; ;
Publisher / Repository:
Oxford University Press
Date Published:
Journal Name:
Cerebral Cortex
Volume:
33
Issue:
18
ISSN:
1047-3211
Format(s):
Medium: X
Size(s):
p. 9997-10012
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract

    Encoding an event that overlaps with a previous experience may involve reactivating an existing memory and integrating it with new information or suppressing the existing memory to promote formation of a distinct, new representation. We used fMRI during overlapping event encoding to track reactivation and suppression of individual, related memories. We further used a model of semantic knowledge based on Wikipedia to quantify both reactivation of semantic knowledge related to a previous event and formation of integrated memories containing semantic features of both events. Representational similarity analysis revealed that reactivation of semantic knowledge related to a prior event in posterior medial prefrontal cortex (pmPFC) supported memory integration during new learning. Moreover, anterior hippocampus (aHPC) formed integrated representations combining the semantic features of overlapping events. We further found evidence that aHPC integration may be modulated on a trial-by-trial basis by interactions between ventrolateral PFC and anterior mPFC, with suppression of item-specific memory representations in anterior mPFC inhibiting hippocampal integration. These results suggest that PFC-mediated control processes determine the availability of specific relevant memories during new learning, thus impacting hippocampal memory integration.

  2. Abstract

    Converging, cross-species evidence indicates that memory for time is supported by hippocampal area CA1 and entorhinal cortex. However, limited evidence characterizes how these regions preserve temporal memories over long timescales (e.g., months). At long timescales, memoranda may be encountered in multiple temporal contexts, potentially creating interference. Here, using 7T fMRI, we measured CA1 and entorhinal activity patterns as human participants viewed thousands of natural scene images distributed, and repeated, across many months. We show that memory for an image’s original temporal context was predicted by the degree to which CA1/entorhinal activity patterns from the first encounter with an image were re-expressed during re-encounters occurring minutes to months later. Critically, temporal memory signals were dissociable from predictors of recognition confidence, which were carried by distinct medial temporal lobe expressions. These findings suggest that CA1 and entorhinal cortex preserve temporal memories across long timescales by coding for and reinstating temporal context information.

  3. Episodic memories are records of personally experienced events, coded neurally via the hippocampus and surrounding medial temporal lobe cortex. Information about the neural signal corresponding to a memory representation can be measured in fMRI data when the pattern across voxels is examined. Prior studies have found that similarity in the voxel patterns across repetition of a to-be-remembered stimulus predicts later memory retrieval, but the results are inconsistent across studies. The current study investigates the possibility that cognitive goals (defined here via the task instructions given to participants) during encoding affect the voxel pattern that will later support memory retrieval, and therefore that neural representations cannot be interpreted based on the stimulus alone. The behavioral results showed that exposure to variable cognitive tasks across repetition of events benefited subsequent memory retrieval. Voxel patterns in the hippocampus indicated a significant interaction between cognitive tasks (variable vs. consistent) and memory (remembered vs. forgotten) such that reduced voxel pattern similarity for repeated events with variable cognitive tasks, but not consistent cognitive tasks, supported later memory success. There was no significant interaction in neural pattern similarity between cognitive tasks and memory success in medial temporal cortices or lateral occipital cortex. Instead, higher similarity in voxel patterns in right medial temporal cortices was associated with later memory retrieval, regardless of cognitive task. In conclusion, we found that the relationship between pattern similarity across repeated encoding and memory success in the hippocampus (but not medial temporal lobe cortex) changes when the cognitive task during encoding does or does not vary across repetitions of the event.
  4. Neuroimaging studies of human memory have consistently found that univariate responses in parietal cortex track episodic experience with stimuli (whether stimuli are 'old' or 'new'). More recently, pattern-based fMRI studies have shown that parietal cortex also carries information about the semantic content of remembered experiences. However, it is not well understood how memory-based and content-based signals are integrated within parietal cortex. Here, in humans (males and females), we used voxel-wise encoding models and a recognition memory task to predict the fMRI activity patterns evoked by complex natural scene images based on (1) the episodic history and (2) the semantic content of each image. Models were generated and compared across distinct subregions of parietal cortex and for occipitotemporal cortex. We show that parietal and occipitotemporal regions each encode memory and content information, but they differ in how they combine this information. Among parietal subregions, angular gyrus was characterized by robust and overlapping effects of memory and content. Moreover, subject-specific semantic tuning functions revealed that successful recognition shifted the amplitude of tuning functions in angular gyrus but did not change the selectivity of tuning. In other words, effects of memory and content were additive in angular gyrus. This pattern of data contrasted with occipitotemporal cortex where memory and content effects were interactive: memory effects were preferentially expressed by voxels tuned to the content of a remembered image. Collectively, these findings provide unique insight into how parietal cortex combines information about episodic memory and semantic content.

    Significance Statement: Neuroimaging studies of human memory have identified multiple brain regions that not only carry information about “whether” a visual stimulus is successfully recognized but also “what” the content of that stimulus includes. However, a fundamental and open question concerns how the brain integrates these two types of information (memory and content). Here, using a powerful combination of fMRI analysis methods, we show that parietal cortex, particularly the angular gyrus, robustly combines memory- and content-related information, but these two forms of information are represented via additive, independent signals. In contrast, memory effects in high-level visual cortex critically depend on (and interact with) content representations. Together, these findings reveal multiple and distinct ways in which the brain combines memory- and content-related information.

  5. The human medial temporal lobe (MTL) plays a crucial role in recognizing visual objects, a key cognitive function that relies on the formation of semantic representations. Nonetheless, it remains unknown how visual information of general objects is translated into semantic representations in the MTL. Furthermore, the debate about whether the human MTL is involved in perception has endured for a long time. To address these questions, we investigated three distinct models of neural object coding—semantic coding, axis-based feature coding, and region-based feature coding—in each subregion of the MTL, using high-resolution fMRI in two male and six female participants. Our findings revealed the presence of semantic coding throughout the MTL, with a higher prevalence observed in the parahippocampal cortex (PHC) and perirhinal cortex (PRC), while axis coding and region coding were primarily observed in the earlier regions of the MTL. Moreover, we demonstrated that voxels exhibiting axis coding supported the transition to region coding and contained information relevant to semantic coding. Together, by providing a detailed characterization of neural object coding schemes and offering a comprehensive summary of visual coding information for each MTL subregion, our results not only emphasize a clear role of the MTL in perceptual processing but also shed light on the translation of perception-driven representations of visual features into memory-driven representations of semantics along the MTL processing pathway.

    Significance Statement: In this study, we delved into the mechanisms underlying visual object recognition within the human medial temporal lobe (MTL), a pivotal region known for its role in the formation of semantic representations crucial for memory. In particular, the translation of visual information into semantic representations within the MTL has remained unclear, and the enduring debate regarding the involvement of the human MTL in perception has persisted. To address these questions, we comprehensively examined distinct neural object coding models across each subregion of the MTL, leveraging high-resolution fMRI. We also showed the transition of information between object coding models and across MTL subregions. Our findings significantly contribute to advancing our understanding of the intricate pathway involved in visual object coding.
