Spatial perception of our hand is closely linked to our ability to move the hand accurately. We might therefore expect that reach planning would take into account any changes in perceived hand position; in other words, that perception and action relating to the hand should depend on a common sensorimotor map. However, there is evidence to suggest that changes in perceived hand position affect a body representation that functions separately from the body representation used to control movement. Here, we examined target-directed reaching before and after participants either did (Mismatch group) or did not (Veridical group) experience a cue conflict known to elicit recalibration in perceived hand position. For the reaching task, participants grasped a robotic manipulandum that positioned their unseen hand for each trial. Participants then briskly moved the handle straight ahead to a visual target, receiving no performance feedback. For the perceptual calibration task, participants estimated the locations of visual, proprioceptive, or combined cues about their unseen hand. The Mismatch group experienced a gradual 70-mm forward mismatch between visual and proprioceptive cues, resulting in forward proprioceptive recalibration. Participants made significantly shorter reaches after this manipulation, consistent with feeling their hand to be further forward than it was, but reaching performance returned to baseline levels after only 10 reaches. The Veridical group, after exposure to veridically aligned visual and proprioceptive cues about the hand, showed no change in reach distance. These results suggest that perceptual recalibration affects the same sensorimotor map that is used to plan target-directed reaches. NEW & NOTEWORTHY If perceived hand position changes, we might assume this affects the sensorimotor map and, in turn, reaches made with that hand. However, there is evidence for separate body representations involved in perception versus action. 
After a cross-sensory conflict that results in proprioceptive recalibration in the forward direction, participants made shorter reaches as predicted, but only briefly. This suggests perceptual recalibration does affect the sensorimotor map used to plan reaches, but the interaction may be short-lived.
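The gradual visuo-proprioceptive conflict and the resulting partial recalibration described above can be sketched with a simple minimum-variance cue-combination model. Everything below (the 0.1 per-trial recalibration rate, the 1 mm/trial ramp, the cue variances) is an illustrative assumption, not the study's fitted model:

```python
def combined_estimate(x_vis, x_prop, var_vis, var_prop):
    """Minimum-variance (maximum-likelihood) fusion of two position cues."""
    w_vis = (1 / var_vis) / (1 / var_vis + 1 / var_prop)
    return w_vis * x_vis + (1 - w_vis) * x_prop

def recalibrate(x_prop, x_vis, rate=0.1):
    """Shift the felt hand position a fraction of the way toward the visual cue."""
    return x_prop + rate * (x_vis - x_prop)

# Gradual 70-mm forward conflict: the visual cue drifts 1 mm/trial for 70 trials
# while the hand stays at 0 mm; the proprioceptive estimate partially follows.
x_prop = 0.0
for trial in range(70):
    x_vis = trial + 1.0
    x_prop = recalibrate(x_prop, x_vis, rate=0.1)

# Combined percept at the end of exposure (vision assumed twice as reliable)
percept = combined_estimate(x_vis=70.0, x_prop=x_prop, var_vis=1.0, var_prop=2.0)
print(x_prop, percept)  # felt hand position recalibrates forward, short of 70 mm
```

A first-order update like this captures the key qualitative result: the felt hand position shifts forward but does not fully absorb the 70-mm conflict.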
Look before you reach: Fixation‐reach latencies predict reaching kinematics in toddlers
Abstract Research on infant and toddler reaching has shown evidence for motor planning after the initiation of the reaching action. However, the reach action sequence does not begin at reach initiation; it includes the initial visual fixations on the target object that occur before the reach. We developed a paradigm that synchronizes head‐mounted eye‐tracking and motion capture to determine whether the latency between the first visual fixation on a target object and the first reaching movement toward it predicts subsequent reaching behavior in toddlers. In a corpus of over one hundred reach sequences produced by 17 toddlers, we found that longer fixation‐reach latencies during the pre‐reach phase predicted slower reaches. If the slowness of an executed reach indexes reach difficulty, then the duration of pre‐reach planning should correlate with reach difficulty. However, no such relation was found when reach difficulty was measured by the usual factors, independent of reach duration. The findings raise important questions about the measurement of reach difficulty, models of motor control, and possible developmental changes in the relations between pre‐planning and continuously unfolding motor plans throughout an action sequence.
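The latency-kinematics relation reported above is an ordinary regression of reach duration on fixation-to-reach latency. A minimal sketch on synthetic data (the latency range, noise level, and slope are invented stand-ins for the toddler corpus):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the corpus: fixation-to-reach latency (s) and
# reach duration (s), generated so longer latencies predict slower reaches.
latency = rng.uniform(0.2, 2.0, size=120)
reach_duration = 0.8 + 0.3 * latency + rng.normal(0.0, 0.05, size=120)

# Ordinary least-squares fit; a positive slope means longer latency, slower reach
slope, intercept = np.polyfit(latency, reach_duration, 1)
print(slope > 0)
```

With real data one would additionally partial out candidate difficulty factors, as the study does when testing whether the relation survives independent measures of reach difficulty.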
- Award ID(s): 1842817
- PAR ID: 10481879
- Publisher / Repository: Wiley-Blackwell
- Date Published:
- Journal Name: Infancy
- Volume: 29
- Issue: 1
- ISSN: 1525-0008
- Format(s): Medium: X
- Size(s): p. 6-21
- Sponsoring Org: National Science Foundation
More Like this
Technological advancements and increased access have prompted the adoption of head-mounted display based virtual reality (VR) for neuroscientific research, manual skill training, and neurological rehabilitation. Applications that focus on manual interaction within the virtual environment (VE), especially haptic-free VR, critically depend on virtual hand-object collision detection. Knowledge about how multisensory integration related to hand-object collisions affects perception-action dynamics and reach-to-grasp coordination is needed to enhance the immersiveness of interactive VR. Here, we explored whether and to what extent sensory substitution for haptic feedback of hand-object collision (visual, audio, or audiovisual) and collider size (size of spherical pointers representing the fingertips) influences reach-to-grasp kinematics. In Study 1, visual, auditory, or combined feedback was compared as a sensory substitute to indicate the successful grasp of a virtual object during reach-to-grasp actions. In Study 2, participants reached to grasp virtual objects using spherical colliders of different diameters to test whether virtual collider size impacts reach-to-grasp. Our data indicate that collider size, but not sensory feedback modality, significantly affected the kinematics of grasping. Larger colliders led to a smaller size-normalized peak aperture. We discuss this finding in the context of a possible influence of spherical collider size on the perception of the virtual object's size and hence effects on motor planning of reach-to-grasp. Critically, reach-to-grasp spatiotemporal coordination patterns were robust to manipulations of sensory feedback modality and spherical collider size, suggesting that the nervous system adjusted the reach (transport) component commensurately with the changes in the grasp (aperture) component. These results have important implications for research, commercial, industrial, and clinical applications of VR.
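The size-normalized peak aperture reported above is computed from thumb and index fingertip trajectories. A minimal sketch, with a synthetic reach in place of motion-capture data (the function name and the aperture profile are illustrative, not the study's pipeline):

```python
import numpy as np

def size_normalized_peak_aperture(thumb_xyz, index_xyz, object_size):
    """Peak thumb-index Euclidean distance over the reach, divided by object size."""
    aperture = np.linalg.norm(thumb_xyz - index_xyz, axis=1)
    return aperture.max() / object_size

# Synthetic reach: aperture opens from 60 mm to a 90-mm peak, then closes
# back onto a 60-mm object; thumb fixed at the origin for simplicity.
t = np.linspace(0.0, 1.0, 101)
opening = 60.0 + 30.0 * np.sin(np.pi * t)   # thumb-index distance profile (mm)
thumb = np.zeros((101, 3))
index = np.column_stack([opening, np.zeros(101), np.zeros(101)])

print(size_normalized_peak_aperture(thumb, index, object_size=60.0))  # → 1.5
```

Normalizing by object size is what lets apertures be compared across virtual objects of different diameters, as in the collider-size manipulation.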
Many goal-directed actions that require rapid visuomotor planning and perceptual decision-making are affected in older adults, causing difficulties in execution of many functional activities of daily living. Visuomotor planning and perceptual identification are mediated by the dorsal and ventral visual streams, respectively, but it is unclear how age-induced changes in sensory processing in these streams contribute to declines in visuomotor decision-making performance. Previously, we showed that in young adults, task demands influenced movement strategies during visuomotor decision-making, reflecting differential integration of sensory information between the two streams. Here, we asked whether older adults would exhibit deficits in interactions between the two streams during demanding motor tasks. Older adults (n = 15) and young controls (n = 26) performed reaching or interception movements toward virtual objects. In some blocks of trials, participants also had to select an appropriate movement goal based on the shape of the object. Our results showed that older adults corrected fewer initial decision errors during both reaching and interception movements. During the interception decision task, older adults made more decision- and execution-related errors than young adults, which were related to early initiation of their movements. Together, these results suggest that older adults have a reduced ability to integrate new perceptual information to guide online action, which may reflect impaired ventral-dorsal stream interactions. NEW & NOTEWORTHY Older adults show declines in vision, decision-making, and motor control, which can lead to functional limitations. We used a rapid visuomotor decision task to examine how these deficits may interact to affect task performance. Compared with healthy young adults, older adults made more errors in both decision-making and motor execution, especially when the task required intercepting moving targets. This suggests that age-related declines in integrating perceptual and motor information may contribute to functional deficits.
Abstract Unconscious neural activity has been shown to precede both motor and cognitive acts. In the present study, we investigated the neural antecedents of overt attention during visual search, where subjects make voluntary saccadic eye movements to search a cluttered stimulus array for a target item. Building on studies of both overt self-generated motor actions (Lau et al., 2004; Soon et al., 2008) and self-generated cognitive actions (Bengson et al., 2014; Soon et al., 2013), we hypothesized that brain activity prior to the onset of a search array would predict the direction of the first saccade during unguided visual search. Because spatial attention and gaze are coordinated during visual search, cognitive and motor actions are coupled in this task. A well-established finding in fMRI studies of willed action is that neural antecedents of the intention to make a motor act (e.g., reaching) can be identified seconds before the action occurs. Studies of the volitional control of covert spatial attention in EEG have shown that predictive brain activity is limited to only a few hundred milliseconds before a voluntary shift of covert spatial attention. In the present study, the visual search task and stimuli were designed so that subjects could not predict the onset of the search array. Perceptual task difficulty was high, such that subjects could not locate the target using covert attention alone, thus requiring overt shifts of attention (saccades) to carry out the visual search. If the first saccade following array onset in unguided visual search shares mechanisms with willed shifts of covert attention, we expected predictive EEG alpha-band activity (8-12 Hz) immediately prior to the array onset (within 1 s) (Bengson et al., 2014; Nadra et al., 2023).
Alternatively, if they follow the principles of willed motor actions, predictive neural signals should be reflected in broadband EEG activity (Libet et al., 1983) and would likely emerge earlier (Soon et al., 2008). Applying support vector machine decoding, we found that the direction of the first saccade in an unguided visual search could be predicted up to two seconds preceding the search array’s onset in the broadband but not alpha-band EEG. These findings suggest that self-directed eye movements in visual search emerge from early preparatory neural activity more akin to willed motor actions than to covert willed attention. This highlights a distinct role for unconscious neural dynamics in shaping visual search behavior.
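The support-vector-machine decoding named above follows the standard cross-validated classification pattern. A minimal sketch on synthetic "EEG" (the trial counts, channel count, and embedded class signal are invented for illustration; the study's actual features and pipeline will differ):

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Synthetic stand-in: 200 trials x 64 channels of pre-array broadband activity,
# with a weak direction-dependent offset added to a subset of channels.
n_trials, n_channels = 200, 64
y = rng.integers(0, 2, n_trials)            # first-saccade direction: 0/1
X = rng.normal(0.0, 1.0, (n_trials, n_channels))
X[:, :8] += 0.6 * (y[:, None] * 2 - 1)      # class-dependent signal

# Linear SVM with standardization, scored by 5-fold cross-validation
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())  # above-chance accuracy indicates decodable direction
```

Sliding this decoder over pre-stimulus time windows is how "predicted up to two seconds preceding array onset" claims are typically established.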
Isometric force generation and kinematic reaching in the upper extremity have been found to be represented by a limited number of muscle synergies, even across task-specific variations. However, the extent of the generalizability of muscle synergies between these two motor tasks within the arm workspace remains unknown. In this study, we recorded electromyographic (EMG) signals from 13 different arm, shoulder, and back muscles of ten healthy individuals while they performed isometric and kinematic center-out target matches to one of 12 equidistant directional targets in the horizontal plane and at each of four starting arm positions. Non-negative matrix factorization was applied to the EMG data to identify the muscle synergies. Five and six muscle synergies were found to represent isometric force generation and point-to-point reaching, respectively. We also found that the number and composition of muscle synergies were conserved across the arm workspace for each motor task. Similar tuning directions of muscle synergy activation profiles were observed at different starting arm locations. Between the isometric and kinematic motor tasks, we found that two to four of the five muscle synergies were common in composition and activation profiles across the starting arm locations. The greater number of muscle synergies involved in achieving a target match in the reaching task compared with the isometric task may explain the complexity of neuromotor control in arm reaching movements. Overall, our results may provide further insight into the neuromotor compartmentalization of shared muscle synergies between two different arm motor tasks and can be utilized to assess motor disabilities in individuals with upper limb motor impairments.
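The non-negative matrix factorization step named above decomposes an EMG matrix into nonnegative synergy vectors and activation profiles. A minimal sketch using scikit-learn on synthetic EMG envelopes (the dimensions match the study's 13 muscles and 5 synergies, but the data, iteration count, and VAF criterion are illustrative):

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(2)

# Synthetic stand-in for rectified, low-pass-filtered EMG envelopes:
# 13 muscles x 500 time samples generated from 5 underlying synergies.
n_muscles, n_samples, n_syn = 13, 500, 5
W_true = rng.random((n_muscles, n_syn))      # muscle weightings
H_true = rng.random((n_syn, n_samples))      # activation profiles
emg = W_true @ H_true

# Factorize EMG into nonnegative synergies and activations
model = NMF(n_components=n_syn, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(emg)   # muscle synergy vectors (13 x 5)
H = model.components_          # synergy activation profiles (5 x 500)

# Variance accounted for (VAF): a common criterion for choosing n_components
vaf = 1.0 - np.sum((emg - W @ H) ** 2) / np.sum(emg ** 2)
print(vaf)
```

In practice the number of synergies is chosen by refitting with increasing `n_components` until VAF exceeds a threshold, which is how counts like "five and six synergies" are obtained.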
