

Title: Conscious awareness of a visuo-proprioceptive mismatch: Effect on cross-sensory recalibration
The brain estimates hand position using vision and position sense (proprioception). The relationship between visual and proprioceptive estimates is somewhat flexible: visual information about the index finger can be spatially displaced from proprioceptive information, resulting in cross-sensory recalibration of the visual and proprioceptive unimodal position estimates. According to the causal inference framework, recalibration occurs when the unimodal estimates are attributed to a common cause and integrated. If separate causes are perceived, then recalibration should be reduced. Here we assessed visuo-proprioceptive recalibration in response to a gradual visuo-proprioceptive mismatch at the left index fingertip. Experiment 1 asked how frequently a 70 mm mismatch is consciously perceived compared to when no mismatch is present, and whether awareness is linked to reduced visuo-proprioceptive recalibration, consistent with causal inference predictions. However, conscious offset awareness occurred rarely. Experiment 2 tested a larger displacement, 140 mm, and asked participants about their perception more frequently, including at 70 mm. Experiment 3 confirmed that participants were unbiased at estimating distances in the 2D virtual reality display. Results suggest that conscious awareness of the mismatch was indeed linked to reduced cross-sensory recalibration as predicted by the causal inference framework, but this was clear only at higher mismatch magnitudes (70–140 mm). At smaller offsets (up to 70 mm), conscious perception of an offset may not override unconscious belief in a common cause, perhaps because the perceived offset magnitude is in range of participants’ natural sensory biases. These findings highlight the interaction of conscious awareness with multisensory processes in hand perception.
Award ID(s): 1753915
NSF-PAR ID: 10429243
Journal Name: Frontiers in Neuroscience
Volume: 16
ISSN: 1662-453X
Format(s): Medium: X
Sponsoring Org: National Science Foundation
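The causal inference account described in the abstract above can be sketched numerically: the brain compares how likely the measured visuo-proprioceptive offset is under a common cause (offset due to sensory noise alone) versus separate causes (offset drawn from a broad distribution). A minimal sketch follows; all parameter values (cue noise, prior probability of a common cause, spread under separate causes) are hypothetical, chosen only to illustrate that the posterior belief in a common cause falls as the offset grows.

```python
import math

def common_cause_posterior(offset_mm, sigma_v=10.0, sigma_p=20.0,
                           prior_common=0.7, spread_mm=150.0):
    """Posterior probability that visual and proprioceptive cues share a cause.

    All parameters are hypothetical: sigma_v and sigma_p are cue noise (mm),
    prior_common is the prior belief in a common cause, and spread_mm is the
    width of the offset distribution under independent causes.
    """
    # Under a common cause, the offset reflects combined cue noise only.
    sigma_c = math.sqrt(sigma_v ** 2 + sigma_p ** 2)
    like_common = (math.exp(-0.5 * (offset_mm / sigma_c) ** 2)
                   / (sigma_c * math.sqrt(2 * math.pi)))
    # Under separate causes, offsets come from a much broader distribution.
    like_separate = (math.exp(-0.5 * (offset_mm / spread_mm) ** 2)
                     / (spread_mm * math.sqrt(2 * math.pi)))
    num = like_common * prior_common
    return num / (num + like_separate * (1 - prior_common))
```

With these illustrative numbers, a 0 mm offset yields a posterior near 1 (integrate and recalibrate), while a 140 mm offset drives it toward 0 (segregate), mirroring the reduced recalibration reported at higher mismatch magnitudes.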
More Like this
  1. Spatial perception of our hand is closely linked to our ability to move the hand accurately. We might therefore expect that reach planning would take into account any changes in perceived hand position; in other words, that perception and action relating to the hand should depend on a common sensorimotor map. However, there is evidence to suggest that changes in perceived hand position affect a body representation that functions separately from the body representation used to control movement. Here, we examined target-directed reaching before and after participants either did (Mismatch group) or did not (Veridical group) experience a cue conflict known to elicit recalibration in perceived hand position. For the reaching task, participants grasped a robotic manipulandum that positioned their unseen hand for each trial. Participants then briskly moved the handle straight ahead to a visual target, receiving no performance feedback. For the perceptual calibration task, participants estimated the locations of visual, proprioceptive, or combined cues about their unseen hand. The Mismatch group experienced a gradual 70-mm forward mismatch between visual and proprioceptive cues, resulting in forward proprioceptive recalibration. Participants made significantly shorter reaches after this manipulation, consistent with feeling their hand to be further forward than it was, but reaching performance returned to baseline levels after only 10 reaches. The Veridical group, after exposure to veridically aligned visual and proprioceptive cues about the hand, showed no change in reach distance. These results suggest that perceptual recalibration affects the same sensorimotor map that is used to plan target-directed reaches. NEW & NOTEWORTHY If perceived hand position changes, we might assume this affects the sensorimotor map and, in turn, reaches made with that hand. However, there is evidence for separate body representations involved in perception versus action. 
After a cross-sensory conflict that results in proprioceptive recalibration in the forward direction, participants made shorter reaches as predicted, but only briefly. This suggests perceptual recalibration does affect the sensorimotor map used to plan reaches, but the interaction may be short-lived. 
  2. Abstract When visual and proprioceptive estimates of hand position disagree (e.g., viewing the hand underwater), the brain realigns them to reduce mismatch. This perceptual change is reflected in primary motor cortex (M1) excitability, suggesting potential relevance for hand movement. Here, we asked whether fingertip visuo-proprioceptive misalignment affects only the brain’s representation of that finger (somatotopically focal), or extends to other parts of the limb that would be needed to move the misaligned finger (somatotopically broad). In Experiments 1 and 2, before and after misaligned or veridical visuo-proprioceptive training at the index finger, we used transcranial magnetic stimulation to assess M1 representation of five hand and arm muscles. The index finger representation showed an association between M1 excitability and visuo-proprioceptive realignment, as did the pinkie finger representation to a lesser extent. Forearm flexors, forearm extensors, and biceps did not show any such relationship. In Experiment 3, participants indicated their proprioceptive estimate of the fingertip, knuckle, wrist, and elbow, before and after misalignment at the fingertip. Proprioceptive realignment at the knuckle, but not the wrist or elbow, was correlated with realignment at the fingertip. These results suggest the effects of visuo-proprioceptive mismatch are somatotopically focal in both sensory and motor domains. 
  3. Identifying neural correlates of conscious perception is a fundamental endeavor of cognitive neuroscience. Most studies so far have focused on visual awareness along with trial-by-trial reports of task-relevant stimuli, which can confound neural measures of perceptual awareness with post-perceptual processing. Here, we used a three-phase sine-wave speech paradigm that dissociated between conscious speech perception and task relevance while recording EEG in humans of both sexes. Compared to tokens perceived as noise, physically identical sine-wave speech tokens that were perceived as speech elicited a left-lateralized, near-vertex negativity, which we interpret as a phonological version of a perceptual awareness negativity. This response appeared between 200 and 300 ms after token onset and was not present for frequency-flipped control tokens that were never perceived as speech. In contrast, the P3b elicited by task-irrelevant tokens did not significantly differ when the tokens were perceived as speech versus noise, and was only enhanced for tokens that were both perceived as speech and relevant to the task. Our results extend the findings from previous studies on visual awareness and speech perception, and suggest that correlates of conscious perception, across types of conscious content, are most likely to be found in mid-latency negative-going brain responses in content-specific sensory areas.

    Significance Statement: How patterns of brain activity give rise to conscious perception is a fundamental question of cognitive neuroscience. Here, we asked whether markers of conscious speech perception can be separated from task-related confounds. We combined sine-wave speech (a degraded speech signal that is heard as noise by naive individuals but can readily be heard as speech after minimal training) with a no-report paradigm that independently manipulated perception (speech versus non-speech) and task (relevant versus irrelevant). Using this paradigm, we were able to identify a marker of speech perception in mid-latency responses over left frontotemporal EEG channels that was independent of task. Our results demonstrate that the “perceptual awareness negativity” is present for a new type of perceptual content (speech).

     
  4. Abstract

    Prominent theories suggest that symptoms of schizophrenia stem from learning deficiencies resulting in distorted internal models of the world. To test these theories further, we used a visual statistical learning task known to induce rapid implicit learning of the stimulus statistics. In this task, participants are presented with a field of coherently moving dots and are asked to report the presented direction of the dots (estimation task), and whether they saw any dots or not (detection task). Two of the directions were more frequently presented than the others. In controls, the implicit acquisition of the stimuli statistics influences their perception in two ways: (i) motion directions are perceived as being more similar to the most frequently presented directions than they really are (estimation biases); and (ii) in the absence of stimuli, participants sometimes report perceiving the most frequently presented directions (a form of hallucinations). Such behaviour is consistent with probabilistic inference, i.e. combining learnt perceptual priors with sensory evidence. We investigated whether patients with chronic, stable, treated schizophrenia (n = 20) differ from controls (n = 23) in the acquisition of the perceptual priors and/or their influence on perception. We found that although patients were slower than controls, they showed comparable acquisition of perceptual priors, approximating the stimulus statistics. This suggests that patients have no statistical learning deficits in our task. This may reflect our patients’ relative wellbeing on antipsychotic medication. Intriguingly, however, patients experienced significantly fewer (P = 0.016) hallucinations of the most frequently presented directions than controls when the stimulus was absent or when it was very weak (prior-based lapse estimations). This suggests that prior expectations had less influence on patients’ perception than on controls when stimuli were absent or below perceptual threshold.
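The probabilistic inference described above (combining a learnt perceptual prior with sensory evidence) can be sketched as a precision-weighted Gaussian combination. This is a minimal illustration, not the authors' model; the noise values are hypothetical. It shows both reported effects: estimates are biased toward the frequently presented (prior) direction, and when sensory evidence is very weak the estimate collapses onto the prior, resembling the prior-based lapses.

```python
def posterior_direction(sensed_deg, sigma_sensory, prior_deg, sigma_prior):
    """Precision-weighted combination of a learnt prior and sensory evidence.

    sigma_sensory and sigma_prior (degrees) are hypothetical noise levels;
    smaller sigma means higher precision and therefore more weight.
    """
    w_sensory = (1 / sigma_sensory ** 2) / (1 / sigma_sensory ** 2
                                            + 1 / sigma_prior ** 2)
    return w_sensory * sensed_deg + (1 - w_sensory) * prior_deg
```

With a strong stimulus (low sensory noise) the estimate stays close to the sensed direction, merely biased toward the prior; with a very weak stimulus (high sensory noise) the estimate is dominated by the prior.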

     
  5. Abstract

    Hand position can be estimated by vision and proprioception (position sense). The brain is thought to weight and integrate these percepts to form a multisensory estimate of hand position with which to guide movement. Force field adaptation, a type of cerebellum-dependent motor learning, is associated with both motor and proprioceptive changes. The cerebellum has connections with multisensory parietal regions; however, it is unknown if force adaptation is associated with changes in multisensory perception. If force adaptation affects all relevant sensory modalities similarly, the brain’s weighting of vision vs. proprioception should be maintained. Alternatively, if force perturbation is interpreted as somatosensory unreliability, vision may be up-weighted relative to proprioception. We assessed visuo-proprioceptive weighting with a perceptual estimation task before and after subjects performed straight-ahead reaches grasping a robotic manipulandum. Each subject performed one session with a clockwise or counter-clockwise velocity-dependent force field, and one session in a null field. Subjects increased their weight of vision vs. proprioception in the force field session relative to the null session, regardless of force field direction, in the straight-ahead dimension (F(1,44) = 5.13, p = 0.029). This suggests that force field adaptation is associated with an increase in the brain’s weighting of vision vs. proprioception.
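The reliability-based weighting of vision vs. proprioception described above follows the standard minimum-variance cue-combination rule: each cue is weighted by its inverse variance. A minimal sketch, with hypothetical noise values, shows how treating proprioception as less reliable (larger variance) up-weights vision, as in the force field result:

```python
def integrate(x_visual, x_proprio, sigma_visual, sigma_proprio):
    """Minimum-variance integration of two position cues (same units).

    Returns the combined estimate and the weight given to vision.
    Noise values (sigma_visual, sigma_proprio) are hypothetical.
    """
    # Inverse-variance (precision) weighting
    w_visual = (1 / sigma_visual ** 2) / (1 / sigma_visual ** 2
                                          + 1 / sigma_proprio ** 2)
    estimate = w_visual * x_visual + (1 - w_visual) * x_proprio
    return estimate, w_visual
```

Equal noise gives equal weights; doubling proprioceptive noise shifts the weight toward vision, which is the pattern the abstract reports after force field exposure.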

     