The brain estimates hand position using vision and position sense (proprioception). The relationship between visual and proprioceptive estimates is somewhat flexible: visual information about the index finger can be spatially displaced from proprioceptive information, resulting in cross-sensory recalibration of the visual and proprioceptive unimodal position estimates. According to the causal inference framework, recalibration occurs when the unimodal estimates are attributed to a common cause and integrated. If separate causes are perceived, then recalibration should be reduced. Here we assessed visuo-proprioceptive recalibration in response to a gradual visuo-proprioceptive mismatch at the left index fingertip. Experiment 1 asked how frequently a 70 mm mismatch is consciously perceived compared to when no mismatch is present, and whether awareness is linked to reduced visuo-proprioceptive recalibration, consistent with causal inference predictions. However, conscious offset awareness occurred rarely. Experiment 2 tested a larger displacement, 140 mm, and asked participants about their perception more frequently, including at 70 mm. Experiment 3 confirmed that participants were unbiased at estimating distances in the 2D virtual reality display. Results suggest that conscious awareness of the mismatch was indeed linked to reduced cross-sensory recalibration as predicted by the causal inference framework, but this was clear only at higher mismatch magnitudes (70–140 mm). At smaller offsets (up to 70 mm), conscious perception of an offset may not override unconscious belief in a common cause, perhaps because the perceived offset magnitude is in range of participants’ natural sensory biases. These findings highlight the interaction of conscious awareness with multisensory processes in hand perception.
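The causal inference framework invoked here has a standard quantitative form in the multisensory literature: a Bayesian observer weighs a common-cause against a separate-cause interpretation of the two cues. The following is a minimal sketch of that generic model, not the paper's own analysis; it assumes Gaussian visual and proprioceptive likelihoods, a Gaussian spatial prior, and model averaging, with entirely hypothetical parameter values.

```python
import numpy as np

def causal_inference_estimate(x_v, x_p, sigma_v, sigma_p, sigma_s, p_common):
    """Generic Bayesian causal inference over a visual (x_v) and a proprioceptive
    (x_p) cue about 1-D hand position (mm). Returns the posterior probability of
    a common cause and the model-averaged proprioceptive estimate."""
    var_v, var_p, var_s = sigma_v**2, sigma_p**2, sigma_s**2
    mu_s = 0.0  # mean of the spatial prior (hypothetical)

    # Likelihood of both measurements under a common cause (C = 1)
    denom1 = var_v * var_p + var_v * var_s + var_p * var_s
    like_c1 = (np.exp(-0.5 * ((x_v - x_p) ** 2 * var_s
                              + (x_v - mu_s) ** 2 * var_p
                              + (x_p - mu_s) ** 2 * var_v) / denom1)
               / (2 * np.pi * np.sqrt(denom1)))

    # Likelihood under independent causes (C = 2)
    like_c2 = (np.exp(-0.5 * ((x_v - mu_s) ** 2 / (var_v + var_s)
                              + (x_p - mu_s) ** 2 / (var_p + var_s)))
               / (2 * np.pi * np.sqrt((var_v + var_s) * (var_p + var_s))))

    # Posterior probability that the two cues share a cause
    post_c1 = like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))

    # Optimal estimates of the proprioceptive source under each causal structure
    s_hat_c1 = ((x_v / var_v + x_p / var_p + mu_s / var_s)
                / (1 / var_v + 1 / var_p + 1 / var_s))
    s_hat_c2 = (x_p / var_p + mu_s / var_s) / (1 / var_p + 1 / var_s)

    # Model averaging: the more likely a common cause, the more the
    # proprioceptive estimate is pulled toward the visual cue (recalibration)
    s_hat_p = post_c1 * s_hat_c1 + (1 - post_c1) * s_hat_c2
    return post_c1, s_hat_p

# Hypothetical example: a 70 mm offset between the visual and proprioceptive cues
print(causal_inference_estimate(x_v=70.0, x_p=0.0, sigma_v=10.0, sigma_p=20.0,
                                sigma_s=150.0, p_common=0.8))
```

Under this kind of model, a larger (or consciously detected) offset lowers the common-cause posterior, which in turn reduces how strongly the proprioceptive estimate is pulled toward vision, i.e., the reduced recalibration the abstract predicts.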
Visuo-proprioceptive recalibration and the sensorimotor map
Spatial perception of our hand is closely linked to our ability to move the hand accurately. We might therefore expect that reach planning would take into account any changes in perceived hand position; in other words, that perception and action relating to the hand should depend on a common sensorimotor map. However, there is evidence to suggest that changes in perceived hand position affect a body representation that functions separately from the body representation used to control movement. Here, we examined target-directed reaching before and after participants either did (Mismatch group) or did not (Veridical group) experience a cue conflict known to elicit recalibration in perceived hand position. For the reaching task, participants grasped a robotic manipulandum that positioned their unseen hand for each trial. Participants then briskly moved the handle straight ahead to a visual target, receiving no performance feedback. For the perceptual calibration task, participants estimated the locations of visual, proprioceptive, or combined cues about their unseen hand. The Mismatch group experienced a gradual 70-mm forward mismatch between visual and proprioceptive cues, resulting in forward proprioceptive recalibration. Participants made significantly shorter reaches after this manipulation, consistent with feeling their hand to be further forward than it was, but reaching performance returned to baseline levels after only 10 reaches. The Veridical group, after exposure to veridically aligned visual and proprioceptive cues about the hand, showed no change in reach distance. These results suggest that perceptual recalibration affects the same sensorimotor map that is used to plan target-directed reaches.

NEW & NOTEWORTHY If perceived hand position changes, we might assume this affects the sensorimotor map and, in turn, reaches made with that hand. However, there is evidence for separate body representations involved in perception versus action. After a cross-sensory conflict that results in proprioceptive recalibration in the forward direction, participants made shorter reaches as predicted, but only briefly. This suggests perceptual recalibration does affect the sensorimotor map used to plan reaches, but the interaction may be short-lived.
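The reaching prediction rests on a simple piece of vector arithmetic: if a reach is planned as the difference between the target location and the perceived start position of the hand, then recalibrating the perceived hand forward should shorten the planned movement. The toy calculation below makes that step explicit with entirely hypothetical numbers; it is not the paper's analysis.

```python
# Minimal 1-D sketch of why forward proprioceptive recalibration predicts
# shorter reaches. All values are hypothetical; the study used a 70-mm cue
# conflict, and recalibration is typically only a fraction of that offset.
actual_hand_y = 0.0       # true start position (mm, forward axis)
recalibration = 20.0      # perceived hand drawn 20 mm forward of its true position
perceived_hand_y = actual_hand_y + recalibration
target_y = 150.0          # visual target, 150 mm ahead of the true start

# If the reach is planned as a vector from the *perceived* hand to the target,
# the planned movement is shorter than the true hand-to-target distance.
planned_extent = target_y - perceived_hand_y        # 130 mm
executed_endpoint = actual_hand_y + planned_extent  # lands at 130 mm
undershoot = target_y - executed_endpoint           # 20 mm short of the target

print(planned_extent, executed_endpoint, undershoot)
```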
- Award ID(s): 1753915
- PAR ID: 10429247
- Date Published:
- Journal Name: Journal of Neurophysiology
- Volume: 129
- Issue: 5
- ISSN: 0022-3077
- Page Range / eLocation ID: 1249 to 1258
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
When visual and proprioceptive estimates of hand position disagree (e.g., viewing the hand underwater), the brain realigns them to reduce mismatch. This perceptual change is reflected in primary motor cortex (M1) excitability, suggesting potential relevance for hand movement. Here, we asked whether fingertip visuo-proprioceptive misalignment affects only the brain’s representation of that finger (somatotopically focal), or extends to other parts of the limb that would be needed to move the misaligned finger (somatotopically broad). In Experiments 1 and 2, before and after misaligned or veridical visuo-proprioceptive training at the index finger, we used transcranial magnetic stimulation to assess M1 representation of five hand and arm muscles. The index finger representation showed an association between M1 excitability and visuo-proprioceptive realignment, as did the pinkie finger representation to a lesser extent. Forearm flexors, forearm extensors, and biceps did not show any such relationship. In Experiment 3, participants indicated their proprioceptive estimate of the fingertip, knuckle, wrist, and elbow, before and after misalignment at the fingertip. Proprioceptive realignment at the knuckle, but not the wrist or elbow, was correlated with realignment at the fingertip. These results suggest the effects of visuo-proprioceptive mismatch are somatotopically focal in both sensory and motor domains.
-
Motor learning in visuomotor adaptation tasks results from both explicit and implicit processes, each responding differently to an error signal. Although the motor output side of these processes has been extensively studied, the visual input side is relatively unknown. We investigated if and how depth perception affects the computation of error information by explicit and implicit motor learning. Two groups of participants made reaching movements to bring a virtual cursor to a target in the frontoparallel plane. The Delayed group was allowed to reaim, and its feedback was delayed to emphasize explicit learning, whereas the Clamped group received task-irrelevant clamped cursor feedback and continued to aim straight at the target to emphasize implicit adaptation. Both groups played this game in a highly detailed virtual environment (depth condition), leveraging a cover task of playing darts in a virtual tavern, and in an empty environment (no-depth condition). The Delayed group showed an increase in error sensitivity under depth relative to no-depth. In contrast, the Clamped group adapted to the same degree under both conditions. The movement kinematics of the Delayed participants also changed under the depth condition, consistent with the target appearing more distant, unlike the Clamped group. A comparison of the Delayed group's behavioral data with a perceptual task from the same individuals showed that the greater reaiming in the depth condition was consistent with an increase in the scaling of the error distance and size. These findings suggest that explicit and implicit learning processes may rely on different sources of perceptual information.

NEW & NOTEWORTHY We leveraged a classic sensorimotor adaptation task to perform a first systematic assessment of the role of perceptual cues in the estimation of an error signal in 3-D space during motor learning. We crossed two conditions presenting different amounts of depth information with two manipulations emphasizing explicit and implicit learning processes. Explicit learning responded to the visual conditions, consistent with perceptual reports, whereas implicit learning appeared to be independent of them.
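"Error sensitivity" in this literature is typically defined within a standard state-space model of trial-by-trial adaptation, in which the motor output carries over from trial to trial (retention) and is updated by a fraction of the most recent error (error sensitivity). The sketch below simulates that textbook model to show how a larger error-sensitivity parameter, as reported for the Delayed group under depth, speeds compensation; the parameter values are hypothetical and not taken from the study.

```python
import numpy as np

def simulate_adaptation(n_trials, perturbation, retention=0.98, error_sensitivity=0.15):
    """Single-state model: x[n+1] = retention * x[n] + error_sensitivity * e[n],
    where e[n] is the error experienced on trial n (perturbation minus the
    current compensation x[n])."""
    x = np.zeros(n_trials + 1)        # learned compensation across trials
    for n in range(n_trials):
        error = perturbation - x[n]   # residual error on trial n
        x[n + 1] = retention * x[n] + error_sensitivity * error
    return x

# Hypothetical comparison: a higher error sensitivity yields faster, larger compensation
low_b = simulate_adaptation(80, perturbation=30.0, error_sensitivity=0.10)
high_b = simulate_adaptation(80, perturbation=30.0, error_sensitivity=0.20)
print(low_b[-1], high_b[-1])  # compensation after 80 trials for each setting
```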
-
Hand position can be estimated by vision and proprioception (position sense). The brain is thought to weight and integrate these percepts to form a multisensory estimate of hand position with which to guide movement. Force field adaptation, a type of cerebellum-dependent motor learning, is associated with both motor and proprioceptive changes. The cerebellum has connections with multisensory parietal regions; however, it is unknown if force adaptation is associated with changes in multisensory perception. If force adaptation affects all relevant sensory modalities similarly, the brain’s weighting of vision vs. proprioception should be maintained. Alternatively, if force perturbation is interpreted as somatosensory unreliability, vision may be up-weighted relative to proprioception. We assessed visuo-proprioceptive weighting with a perceptual estimation task before and after subjects performed straight-ahead reaches grasping a robotic manipulandum. Each subject performed one session with a clockwise or counter-clockwise velocity-dependent force field, and one session in a null field. Subjects increased their weighting of vision vs. proprioception in the force field session relative to the null session, regardless of force field direction, in the straight-ahead dimension (F(1,44) = 5.13, p = 0.029). This suggests that force field adaptation is associated with an increase in the brain’s weighting of vision vs. proprioception.
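The weighting measured here is usually formalized as minimum-variance (reliability-weighted) cue combination, in which each cue's weight is proportional to its inverse variance. The snippet below is a generic sketch of that textbook rule, not the study's analysis code, and the noise values are hypothetical.

```python
def integrate_cues(x_vision, x_proprio, sigma_vision, sigma_proprio):
    """Minimum-variance (reliability-weighted) combination of two position cues."""
    w_vision = (1 / sigma_vision**2) / (1 / sigma_vision**2 + 1 / sigma_proprio**2)
    w_proprio = 1 - w_vision
    estimate = w_vision * x_vision + w_proprio * x_proprio
    return w_vision, estimate

# Hypothetical numbers: if force field exposure makes proprioception seem less
# reliable (larger sigma_proprio), the visual weight rises and the combined
# estimate shifts toward the visual cue.
print(integrate_cues(x_vision=10.0, x_proprio=0.0, sigma_vision=8.0, sigma_proprio=12.0))
print(integrate_cues(x_vision=10.0, x_proprio=0.0, sigma_vision=8.0, sigma_proprio=20.0))
```

Raising the assumed proprioceptive noise raises the visual weight, which is the direction of change the abstract reports after force field exposure.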
-
How veridical is perception? Rather than representing objects as they actually exist in the world, might perception instead represent objects only in terms of the utility they offer to an observer? Previous work employed evolutionary modeling to show that under certain assumptions, natural selection favors such “strict-interface” perceptual systems. This view has fueled considerable debate, but we think that discussions so far have failed to consider the implications of two critical aspects of perception. First, while existing models have explored single utility functions, perception will often serve multiple largely independent goals. (Sometimes when looking at a stick you want to know how appropriate it would be as kindling for a campfire, and other times you want to know how appropriate it would be as a weapon for self-defense.) Second, perception often operates in an inflexible, automatic manner, proving “impenetrable” to shifting higher-level goals. (When your goal shifts from “burning” to “fighting,” your visual experience does not dramatically transform.) These two points have important implications for the veridicality of perception. In particular, as the need for flexible goals increases, inflexible perceptual systems must become more veridical. We support this position by providing evidence from evolutionary simulations that as the number of independent utility functions increases, the distinction between “interface” and “veridical” perceptual systems dissolves. Although natural selection evaluates perceptual systems only on their fitness, the most fit perceptual systems may nevertheless represent the world as it is.