Episodic memories are records of personally experienced events, coded neurally via the hippocampus and surrounding medial temporal lobe cortex. Information about the neural signal corresponding to a memory representation can be measured in fMRI data when the pattern across voxels is examined. Prior studies have found that similarity in the voxel patterns across repetition of a to-be-remembered stimulus predicts later memory retrieval, but the results are inconsistent across studies. The current study investigates the possibility that cognitive goals (defined here via the task instructions given to participants) during encoding affect the voxel pattern that will later support memory retrieval, and therefore that neural representations cannot be interpreted based on the stimulus alone. The behavioral results showed that exposure to variable cognitive tasks across repetition of events benefited subsequent memory retrieval. Voxel patterns in the hippocampus indicated a significant interaction between cognitive tasks (variable vs. consistent) and memory (remembered vs. forgotten) such that reduced voxel pattern similarity for repeated events with variable cognitive tasks, but not consistent cognitive tasks, supported later memory success. There was no significant interaction in neural pattern similarity between cognitive tasks and memory success in medial temporal cortices or lateral occipital cortex. Instead, higher similarity in voxel patterns in right medial temporal cortices was associated with later memory retrieval, regardless of cognitive task. In conclusion, we found that the relationship between pattern similarity across repeated encoding and memory success in the hippocampus (but not medial temporal lobe cortex) changes when the cognitive task during encoding does or does not vary across repetitions of the event.
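The analysis described above rests on correlating voxel patterns across two repetitions of an event and relating that similarity to later memory. A minimal sketch of this logic on synthetic data (array sizes, the noise model, and the remembered/forgotten split are all hypothetical, not taken from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

def pattern_similarity(pattern_a, pattern_b):
    """Pearson correlation between two voxel patterns (1-D arrays)."""
    return np.corrcoef(pattern_a, pattern_b)[0, 1]

# Toy data: 50 events x 200 voxels, measured at two repetitions
n_events, n_voxels = 50, 200
rep1 = rng.standard_normal((n_events, n_voxels))
# Repetition 2 is a noisy copy of repetition 1, so similarity is positive
rep2 = 0.6 * rep1 + 0.4 * rng.standard_normal((n_events, n_voxels))
remembered = rng.random(n_events) < 0.5  # hypothetical behavioral outcome

sims = np.array([pattern_similarity(rep1[i], rep2[i])
                 for i in range(n_events)])

# Subsequent-memory contrast: mean similarity for remembered vs. forgotten
diff = sims[remembered].mean() - sims[~remembered].mean()
```

In the actual study, the sign of such a contrast in the hippocampus depended on whether the encoding task varied across repetitions; this sketch only illustrates how the per-event similarity scores feeding that contrast are computed.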
Effector-independent Representations Guide Sequential Target Selection Biases in Action
Abstract Previous work shows that automatic attention biases toward recently selected target features transfer across action and perception and even across different effectors such as the eyes and hands on a trial-by-trial basis. Although these findings suggest a common neural representation of selection history across effectors, the extent to which information about recently selected target features is encoded in overlapping versus distinct brain regions is unknown. Using fMRI and a priming of pop-out task where participants selected unpredictable, uniquely colored targets among homogeneous distractors via reach or saccade, we show that color priming is driven by shared, effector-independent underlying representations of recent selection history. Consistent with previous work, we found that the intraparietal sulcus (IPS) was commonly activated on trials where target colors were switched relative to those where the colors were repeated; however, the dorsal anterior insula exhibited effector-specific activation related to color priming. Via multivoxel cross-classification analyses, we further demonstrate that fine-grained patterns of activity in both IPS and the medial temporal lobe encode information about selection history in an effector-independent manner, such that ROI-specific models trained on activity patterns during reach selection could predict whether a color was repeated or switched on the current trial during saccade selection and vice versa. Remarkably, model generalization performance in IPS and medial temporal lobe also tracked individual differences in behavioral priming sensitivity across both types of action. These results represent a first step to clarify the neural substrates of experience-driven selection biases in contexts that require the coordination of multiple actions.
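The multivoxel cross-classification reported above trains a decoder on activity patterns from one effector and tests it on the other; above-chance transfer implies a shared code. A minimal sketch of that train-on-reach, test-on-saccade logic with a nearest-class-mean decoder on synthetic patterns (the decoder, trial counts, and noise levels are illustrative assumptions, not the study's pipeline):

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_mean_classifier(X, y):
    """Nearest-class-mean decoder: store the mean pattern per class."""
    return {label: X[y == label].mean(axis=0) for label in np.unique(y)}

def predict(means, X):
    labels = sorted(means)
    dists = np.stack([np.linalg.norm(X - means[l], axis=1) for l in labels])
    return np.array(labels)[dists.argmin(axis=0)]

# Toy patterns: 80 trials x 30 voxels per effector
# label 0 = target color repeated, 1 = target color switched
n, v = 80, 30
signal = rng.standard_normal(v)  # shared, effector-independent axis
y_reach = rng.integers(0, 2, n)
y_sacc = rng.integers(0, 2, n)
X_reach = np.outer(2 * y_reach - 1, signal) + 0.5 * rng.standard_normal((n, v))
X_sacc = np.outer(2 * y_sacc - 1, signal) + 0.5 * rng.standard_normal((n, v))

# Train on reach trials, test on saccade trials, and vice versa
acc_r2s = (predict(fit_mean_classifier(X_reach, y_reach), X_sacc) == y_sacc).mean()
acc_s2r = (predict(fit_mean_classifier(X_sacc, y_sacc), X_reach) == y_reach).mean()
```

Because the synthetic repeat/switch axis is shared across effectors here, both cross-directions decode well; in real data, generalization accuracy in a region (relative to chance) is the evidence for an effector-independent representation.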
- Award ID(s):
- 1849169
- PAR ID:
- 10485872
- Publisher / Repository:
- MIT Press
- Date Published:
- Journal Name:
- Journal of Cognitive Neuroscience
- ISSN:
- 0898-929X
- Page Range / eLocation ID:
- 1 to 16
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
For flexible goal-directed behavior, prioritizing and selecting a specific action among multiple candidates is often important. Working memory has long been assumed to play a role in prioritization and planning, while bridging cross-temporal contingencies during action selection. However, studies of working memory have mostly focused on memory for single components of an action plan, such as a rule or a stimulus, rather than management of all of these elements during planning. Therefore, it is not known how post-encoding prioritization and selection operate on the entire profile of representations for prospective actions. Here, we assessed how such control processes unfold over action representations, highlighting the role of conjunctive representations that nonlinearly integrate task-relevant features during maintenance and prioritization of action plans. For each trial, participants prepared two independent rule-based actions simultaneously, then they were retro-cued to select one as their response. Prior to the start of the trial, one rule-based action was randomly assigned to be high priority by cueing that it was more likely to be tested. We found that both full action plans were maintained as conjunctive representations during action preparation, regardless of priority. However, during output selection, the conjunctive representation of the high priority action plan was more enhanced and readily selected as an output. Further, the strength of the high priority conjunctive representation was associated with behavioral interference when the low priority action was tested. Thus, multiple alternate upcoming actions were maintained as integrated representations and served as the target of post-encoding attentional selection mechanisms to prioritize and select an action from within working memory.
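The case for conjunctive codes in the abstract above is that a nonlinear integration of features can keep two simultaneous action plans distinct where a linear sum of constituent feature codes cannot. A toy sketch of that distinction (the feature counts and one-hot coding scheme are hypothetical illustrations, not the authors' model):

```python
import numpy as np

def one_hot(index, size):
    vec = np.zeros(size)
    vec[index] = 1.0
    return vec

# Two hypothetical task features: rule (3 options) and stimulus (3 options)
n_rules, n_stims = 3, 3

def constituent_code(rule, stim):
    """Linear code: concatenation of independent feature vectors."""
    return np.concatenate([one_hot(rule, n_rules), one_hot(stim, n_stims)])

def conjunctive_code(rule, stim):
    """Nonlinear code: outer product uniquely tags each rule-stimulus pairing."""
    return np.outer(one_hot(rule, n_rules), one_hot(stim, n_stims)).ravel()

# Hold two plans at once by summing their codes. Under the linear code,
# the pair {(rule 0, stim 1), (rule 1, stim 0)} is indistinguishable from
# the pair {(rule 1, stim 1), (rule 0, stim 0)}; the conjunctive code
# keeps the bindings separate.
a = constituent_code(0, 1) + constituent_code(1, 0)
b = constituent_code(1, 1) + constituent_code(0, 0)
c = conjunctive_code(0, 1) + conjunctive_code(1, 0)
d = conjunctive_code(1, 1) + conjunctive_code(0, 0)
```

Here `a` and `b` are identical vectors while `c` and `d` differ, which is why maintaining two full action plans in working memory plausibly requires integrated, conjunctive representations.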
-
People can use abstract rules to flexibly configure and select actions for specific situations, yet how exactly rules shape actions toward specific sensory and/or motor requirements remains unclear. Both research from animal models and human-level theories of action control point to the role of highly integrated, conjunctive representations, sometimes referred to as event files. These representations are thought to combine rules with other, goal-relevant sensory and motor features in a nonlinear manner and represent a necessary condition for action selection. However, so far, no methods exist to track such representations in humans during action selection with adequate temporal resolution. Here, we applied time-resolved representational similarity analysis to the spectral-temporal profiles of electroencephalography signals while participants performed a cued, rule-based action selection task. In two experiments, we found that conjunctive representations were active throughout the entire selection period and were functionally dissociable from the representation of constituent features. Specifically, the strength of conjunctions was a highly robust predictor of trial-by-trial variability in response times and was selectively related to an important behavioral indicator of conjunctive representations, the so-called partial-overlap priming pattern. These results provide direct evidence for conjunctive representations as critical precursors of action selection in humans.
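Time-resolved representational similarity analysis, as used above, correlates a model dissimilarity matrix with neural dissimilarities computed separately at each time point, yielding a timecourse of representational strength. A minimal sketch on synthetic trial-by-time-by-channel data (the signal window, trial counts, and RDM construction are illustrative assumptions, not the authors' exact method):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy EEG features: 60 trials x 40 time points x 20 channels
n_trials, n_times, n_chans = 60, 40, 20
conditions = rng.integers(0, 4, n_trials)  # 4 hypothetical conjunctions

# Condition-specific signal appears only in a mid-trial window (t = 15..29)
templates = rng.standard_normal((4, n_chans))
eeg = rng.standard_normal((n_trials, n_times, n_chans))
eeg[:, 15:30, :] += templates[conditions][:, None, :]

# Model RDM: 1 where trial conditions differ, 0 where they match
model_rdm = (conditions[:, None] != conditions[None, :]).astype(float)
iu = np.triu_indices(n_trials, k=1)  # upper triangle (unique trial pairs)

def rsa_at_time(t):
    """Correlate neural dissimilarities with the model RDM at one time point."""
    patterns = eeg[:, t, :]
    neural_rdm = 1 - np.corrcoef(patterns)
    return np.corrcoef(neural_rdm[iu], model_rdm[iu])[0, 1]

rsa_timecourse = np.array([rsa_at_time(t) for t in range(n_times)])
```

The timecourse is near zero before the signal window and positive inside it, which is the sense in which this analysis can track when a representation is active; the study applies the same logic to spectral-temporal EEG features and separate model RDMs for conjunctions versus constituent features.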
-
Abstract Selective attention improves sensory processing of relevant information but can also impact the quality of perception. For example, attention increases visual discrimination performance and at the same time boosts apparent stimulus contrast of attended relative to unattended stimuli. Can attention also lead to perceptual distortions of visual representations? Optimal tuning accounts of attention suggest that processing is biased towards "off-tuned" features to maximize the signal-to-noise ratio in favor of the target, especially when targets and distractors are confusable. Here, we tested whether such tuning gives rise to phenomenological changes of visual features. We instructed participants to select a color among other colors in a visual search display and subsequently asked them to judge the appearance of the target color in a 2-alternative forced choice task. Participants consistently judged the target color to appear more dissimilar from the distractor color in feature space. Critically, the magnitude of these perceptual biases varied systematically with the similarity between target and distractor colors during search, indicating that attentional tuning quickly adapts to current task demands. In control experiments we rule out possible non-attentional explanations such as color contrast or memory effects. Overall, our results demonstrate that selective attention warps the representational geometry of color space, resulting in profound perceptual changes across large swaths of feature space. Broadly, these results indicate that efficient attentional selection can come at a perceptual cost by distorting our sensory experience.
-
Abstract Investigations into how individual neurons encode behavioral variables of interest have revealed specific representations in single neurons, such as place and object cells, as well as a wide range of cells with conjunctive encodings or mixed selectivity. However, as most experiments examine neural activity within individual tasks, it is currently unclear if and how neural representations change across different task contexts. Within this discussion, the medial temporal lobe is particularly salient, as it is known to be important for multiple behaviors including spatial navigation and memory; however, the relationship between these functions is currently unclear. Here, to investigate how representations in single neurons vary across different task contexts in the medial temporal lobe, we collected and analyzed single-neuron activity from human participants as they completed a paired-task session consisting of a passive-viewing visual working memory task and a spatial navigation and memory task. Five patients contributed 22 paired-task sessions, which were spike sorted together to allow for the same putative single neurons to be compared between the different tasks. Within each task, we replicated concept-related activations in the working memory task, as well as target-location and serial-position responsive cells in the navigation task. When comparing neuronal activity between tasks, we first established that a significant number of neurons maintained the same kind of representation, responding to stimuli presentations across tasks. Further, we found cells that changed the nature of their representation across tasks, including a significant number of cells that were stimulus responsive in the working memory task that responded to serial position in the spatial task. Overall, our results support a flexible encoding of multiple, distinct aspects of different tasks by single neurons in the human medial temporal lobe, whereby some individual neurons change the nature of their feature coding between task contexts.