

Title: The role of conjunctive representations in prioritizing and selecting planned actions
For flexible goal-directed behavior, prioritizing and selecting a specific action among multiple candidates is often important. Working memory has long been assumed to play a role in prioritization and planning by bridging cross-temporal contingencies during action selection. However, studies of working memory have mostly focused on memory for single components of an action plan, such as a rule or a stimulus, rather than on the management of all of these elements during planning. It is therefore not known how post-encoding prioritization and selection operate on the entire profile of representations for prospective actions. Here, we assessed how such control processes unfold over action representations, highlighting the role of conjunctive representations that nonlinearly integrate task-relevant features during the maintenance and prioritization of action plans. On each trial, participants prepared two independent rule-based actions simultaneously and were then retro-cued to select one as their response. Before the trial began, one rule-based action was randomly assigned high priority by a cue indicating that it was more likely to be tested. We found that both full action plans were maintained as conjunctive representations during action preparation, regardless of priority. During output selection, however, the conjunctive representation of the high-priority action plan was more strongly enhanced and more readily selected as an output. Further, the strength of the high-priority conjunctive representation was associated with behavioral interference when the low-priority action was tested. Thus, multiple alternative upcoming actions were maintained as integrated representations and served as the target of post-encoding attentional selection mechanisms that prioritize and select an action from within working memory.
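To make the notion of a conjunctive representation concrete, the sketch below shows, in Python, how an RSA-style conjunction model differs from models of its constituent features. This is a minimal illustration rather than the authors' analysis code; the 2 x 2 design, condition labels, and function names are assumptions made for the example.

```python
import numpy as np
from itertools import product

# Hypothetical task space: 2 rules x 2 stimuli (labels are placeholders).
conditions = list(product(["ruleA", "ruleB"], ["stim1", "stim2"]))

def model_rdm(labels):
    """Model similarity matrix: 1 where two conditions share a label, else 0."""
    labels = np.asarray(labels)
    return (labels[:, None] == labels[None, :]).astype(float)

rule_model = model_rdm([r for r, s in conditions])   # similar if same rule
stim_model = model_rdm([s for r, s in conditions])   # similar if same stimulus
# Conjunction model: similar only when the specific rule-stimulus pairing repeats.
# No weighted sum of rule_model and stim_model reproduces it, which is what makes
# the conjunction a nonlinear, integrated code rather than a sum of features.
conj_model = model_rdm([f"{r}-{s}" for r, s in conditions])
```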
Award ID(s): 2120712
NSF-PAR ID: 10378447
Author(s) / Creator(s):
Date Published:
Journal Name: eLife
Volume: 11
ISSN: 2050-084X
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. People can use abstract rules to flexibly configure and select actions for specific situations, yet how exactly rules shape actions toward specific sensory and/or motor requirements remains unclear. Both research from animal models and human-level theories of action control point to the role of highly integrated, conjunctive representations, sometimes referred to as event files. These representations are thought to combine rules with other, goal-relevant sensory and motor features in a nonlinear manner and represent a necessary condition for action selection. However, so far, no methods exist to track such representations in humans during action selection with adequate temporal resolution. Here, we applied time-resolved representational similarity analysis to the spectral-temporal profiles of electroencephalography signals while participants performed a cued, rule-based action selection task. In two experiments, we found that conjunctive representations were active throughout the entire selection period and were functionally dissociable from the representation of constituent features. Specifically, the strength of conjunctions was a highly robust predictor of trial-by-trial variability in response times and was selectively related to an important behavioral indicator of conjunctive representations, the so-called partial-overlap priming pattern. These results provide direct evidence for conjunctive representations as critical precursors of action selection in humans. 
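    As a rough illustration of the time-resolved representational similarity analysis described above, the following Python sketch regresses trial-by-trial EEG pattern similarity onto model similarity matrices (such as those sketched earlier) at each time point. The array shapes, variable names, and the use of plain correlation with ordinary least squares are simplifying assumptions, not the published pipeline.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def timecourse_rsa(eeg, model_rdms):
    """eeg: trials x features (e.g., channel-by-frequency power) x time points.
    model_rdms: dict mapping model name -> (trials x trials) similarity matrix."""
    n_trials, _, n_times = eeg.shape
    iu = np.triu_indices(n_trials, k=1)                  # unique trial pairs
    X = np.column_stack([m[iu] for m in model_rdms.values()])
    betas = np.zeros((len(model_rdms), n_times))
    for t in range(n_times):
        neural_sim = np.corrcoef(eeg[:, :, t])[iu]       # pattern similarity per pair
        betas[:, t] = LinearRegression().fit(X, neural_sim).coef_
    return betas                                          # model strength over time
```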
  2. Action selection appears to rely on conjunctive representations that nonlinearly integrate task-relevant features. Here, we tested a corollary of this hypothesis: that such representations are also intricately involved during attempts to stop an action—a key aspect of action regulation. We tracked both conjunctive representations and those of constituent rule, stimulus, or response features through trial-by-trial representational similarity analysis of the electroencephalogram signal in a combined rule-selection and stop-signal paradigm. Across two experiments with student participants (N = 57), we found (a) that the strength of decoded conjunctive representations prior to the stop signal uniquely predicted trial-by-trial stopping success (Experiment 1) and (b) that these representations were selectively suppressed following the onset of the stop signal (Experiments 1 and 2). We conclude that conjunctive representations are key to successful action execution and therefore need to be suppressed when an intended action is no longer appropriate.
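    One way to picture the trial-by-trial link between decoded conjunction strength and stopping reported here is a simple logistic regression, sketched below in Python. This is not the authors' analysis; the variable names, the optional covariate, and the single-level logistic model are assumptions for illustration.

```python
import numpy as np
import statsmodels.api as sm

def predict_stopping(conj_strength, stopped, covariates=None):
    """conj_strength: per-trial decoded conjunction score before the stop signal.
    stopped: 1 if the action was successfully cancelled on that trial, else 0."""
    X = np.asarray(conj_strength).reshape(-1, 1)
    if covariates is not None:
        X = np.hstack([X, covariates])        # e.g., stop-signal delay (hypothetical)
    X = sm.add_constant(X)
    fit = sm.Logit(np.asarray(stopped), X).fit(disp=0)
    return fit.params, fit.pvalues            # sign and reliability of the link
```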

     
    more » « less
  3. Action selection appears to rely on conjunctive representations that nonlinearly integrate task-relevant features (Kikumoto & Mayr, 2020). Here we test the flip side of this hypothesis: that such representations are also intricately involved during attempts to stop an action, a key aspect of action regulation. We tracked both conjunctive representations and those of constituent rule, stimulus, or response features through trial-by-trial representational similarity analysis of the EEG signal in a combined rule-selection and stop-signal paradigm. Across two experiments with student participants (N = 57), we found (a) that the strength of decoded conjunctive representations prior to the stop signal uniquely predicted trial-by-trial stopping success (Experiment 1) and (b) that these representations were selectively suppressed following the onset of the stop signal (Experiments 1 and 2). We conclude that conjunctive representations are key to successful action execution and therefore need to be suppressed when an intended action is no longer appropriate.
  4. Investigations into how individual neurons encode behavioral variables of interest have revealed specific representations in single neurons, such as place and object cells, as well as a wide range of cells with conjunctive encodings or mixed selectivity. However, as most experiments examine neural activity within individual tasks, it is currently unclear if and how neural representations change across different task contexts. The medial temporal lobe is particularly salient in this discussion, as it is known to be important for multiple behaviors, including spatial navigation and memory, yet the relationship between these functions is currently unclear. Here, to investigate how representations in single neurons vary across task contexts in the medial temporal lobe, we collected and analyzed single-neuron activity from human participants as they completed a paired-task session consisting of a passive-viewing visual working memory task and a spatial navigation and memory task. Five patients contributed 22 paired-task sessions, which were spike sorted together so that the same putative single neurons could be compared between the different tasks. Within each task, we replicated concept-related activations in the working memory task, as well as target-location and serial-position responsive cells in the navigation task. When comparing neuronal activity between tasks, we first established that a significant number of neurons maintained the same kind of representation, responding to stimulus presentations across tasks. Further, we found cells that changed the nature of their representation across tasks, including a significant number of cells that were stimulus responsive in the working memory task but responded to serial position in the spatial task. Overall, our results support a flexible encoding of multiple, distinct aspects of different tasks by single neurons in the human medial temporal lobe, whereby some individual neurons change the nature of their feature coding between task contexts.
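    To illustrate the cross-task comparison of single neurons described above, the Python sketch below tests whether a spike-sorted unit is condition-responsive in each task and then cross-tabulates the outcomes. The firing-rate arrays, labels, and the choice of a Kruskal-Wallis test are hypothetical simplifications, not the authors' pipeline.

```python
import numpy as np
from scipy.stats import kruskal

def responsive(rates, labels, alpha=0.05):
    """Does a unit's firing rate differ across condition labels? (Kruskal-Wallis)"""
    rates, labels = np.asarray(rates), np.asarray(labels)
    groups = [rates[labels == lab] for lab in np.unique(labels)]
    return kruskal(*groups).pvalue < alpha

# For each unit recorded in both tasks of a paired session (arrays are hypothetical):
#   wm_resp  = responsive(rates_wm,  stimulus_ids)       # working memory task
#   nav_resp = responsive(rates_nav, serial_positions)   # navigation task
# Cross-tabulating these flags across units shows which cells keep, change,
# or lose their coding between task contexts.
```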

     
  5. Previous work shows that automatic attention biases toward recently selected target features transfer across action and perception and even across different effectors such as the eyes and hands on a trial-by-trial basis. Although these findings suggest a common neural representation of selection history across effectors, the extent to which information about recently selected target features is encoded in overlapping versus distinct brain regions is unknown. Using fMRI and a priming of pop-out task where participants selected unpredictable, uniquely colored targets among homogeneous distractors via reach or saccade, we show that color priming is driven by shared, effector-independent underlying representations of recent selection history. Consistent with previous work, we found that the intraparietal sulcus (IPS) was commonly activated on trials where target colors were switched relative to those where the colors were repeated; however, the dorsal anterior insula exhibited effector-specific activation related to color priming. Via multivoxel cross-classification analyses, we further demonstrate that fine-grained patterns of activity in both IPS and the medial temporal lobe encode information about selection history in an effector-independent manner, such that ROI-specific models trained on activity patterns during reach selection could predict whether a color was repeated or switched on the current trial during saccade selection and vice versa. Remarkably, model generalization performance in IPS and medial temporal lobe also tracked individual differences in behavioral priming sensitivity across both types of action. These results represent a first step to clarify the neural substrates of experience-driven selection biases in contexts that require the coordination of multiple actions.
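    The cross-classification logic described above (train on one effector, test on the other) can be sketched in a few lines of scikit-learn. The variable names and the linear SVM decoder are assumptions for illustration, not the study's exact pipeline.

```python
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

def cross_classify(patterns_reach, labels_reach, patterns_sacc, labels_sacc):
    """Train a repeat-vs-switch decoder on one effector, test on the other."""
    clf_r = make_pipeline(StandardScaler(), LinearSVC())
    acc_reach_to_sacc = clf_r.fit(patterns_reach, labels_reach).score(patterns_sacc, labels_sacc)
    clf_s = make_pipeline(StandardScaler(), LinearSVC())
    acc_sacc_to_reach = clf_s.fit(patterns_sacc, labels_sacc).score(patterns_reach, labels_reach)
    # Above-chance accuracy in both directions implies an effector-independent code.
    return acc_reach_to_sacc, acc_sacc_to_reach
```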

     