Title: Inter-personal motor interaction is facilitated by hand pairing
Abstract: The extent to which hand dominance may influence how each agent contributes to inter-personal coordination remains unknown. In the present study, right-handed human participants performed object balancing tasks either in dyadic conditions with each agent using one hand (left or right), or in bimanual conditions where each agent performed the task individually with both hands. We found that object load was shared between two hands more asymmetrically in dyadic than single-agent conditions. However, hand dominance did not influence how two hands shared the object load. In contrast, hand dominance was a major factor in modulating hand vertical movement speed. Furthermore, the magnitude of internal force produced by two hands against each other correlated with the synchrony between the two hands’ movement in dyads. This finding supports the important role of internal force in haptic communication. Importantly, both internal force and movement synchrony were affected by hand dominance of the paired participants. Overall, these results demonstrate, for the first time, that pairing of one dominant and one non-dominant hand may promote asymmetrical roles within a dyad during joint physical interactions. This appears to enable the agent using the dominant hand to actively maintain effective haptic communication and task performance.
Award ID(s): 1827752, 1827725
NSF-PAR ID: 10359239
Author(s) / Creator(s):
Date Published:
Journal Name: Scientific Reports
Volume: 12
Issue: 1
ISSN: 2045-2322
Format(s): Medium: X
Sponsoring Org: National Science Foundation
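As a rough illustration of the quantities discussed in the abstract above, the sketch below computes a load-sharing asymmetry index, an internal (squeeze) force, and a simple movement-synchrony index from per-hand force and velocity recordings. The function names, the min-of-opposing-normal-forces definition of internal force, and the zero-lag correlation used as a synchrony index are illustrative assumptions, not the authors' exact measures.

```python
import numpy as np

def load_share_asymmetry(fz_left, fz_right):
    """Asymmetry of vertical load sharing between the two hands.

    fz_left, fz_right: arrays of vertical (load-supporting) force in N.
    Returns values in [0, 1]; 0 means perfectly equal sharing.
    """
    return np.abs(fz_left - fz_right) / (fz_left + fz_right)

def internal_force(fn_left, fn_right):
    """Internal (squeeze) force: the opposing normal-force component that
    cancels out and therefore does not accelerate the object."""
    return np.minimum(fn_left, fn_right)

def movement_synchrony(vz_left, vz_right):
    """Zero-lag correlation of the two hands' vertical velocities,
    used here as a simple synchrony index."""
    a = vz_left - vz_left.mean()
    b = vz_right - vz_right.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
```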
More Like this
  1. Principles from human-human physical interaction may be necessary to design more intuitive and seamless robotic devices to aid human movement. Previous studies have shown that light touch can aid balance and that haptic communication can improve performance of physical tasks, but the effects of touch between two humans on walking balance have not been previously characterized. This study examines physical interaction between two persons when one person aids another in performing a beam-walking task. Twelve pairs of healthy young adults held a force sensor with one hand while one person walked on a narrow balance beam (2 cm wide × 3.7 m long) and the other person walked overground by their side. We compare balance performance during partnered vs. solo beam-walking to examine the effects of haptic interaction, and we compare hand interaction mechanics during partnered beam-walking vs. overground walking to examine how the interaction aided balance. While holding the hand of a partner, participants were able to walk farther on the beam without falling, reduce lateral sway, and decrease angular momentum in the frontal plane. We measured small hand force magnitudes (mean of 2.2 N laterally and 3.4 N vertically) that created opposing torque components about the beam axis and calculated the interaction torque, the overlapping opposing torque that does not contribute to motion of the beam-walker’s body. We found higher interaction torque magnitudes during partnered beam-walking vs. partnered overground walking, and a correlation between interaction torque magnitude and reductions in lateral sway. To gain insight into feasible controller designs to emulate human-human physical interactions for aiding walking balance, we modeled the relationship between each torque component and motion of the beam-walker’s body as a mass-spring-damper system. Our model results show opposite types of mechanical elements (active vs. passive) for the two torque components. Our results demonstrate that hand interactions aid balance during partnered beam-walking by creating opposing torques that primarily serve haptic communication, and our model of the torques suggests control parameters for implementing human-human balance aid in human-robot interactions. (An illustrative sketch of the interaction-torque computation and the mass-spring-damper fit appears after this list.)
  2. Tactile sensing has been increasingly utilized in robot control of unknown objects to infer physical properties and optimize manipulation. However, there is limited understanding of how different sensory modalities contribute to interactive perception during complex interactions, both in robots and in humans. This study investigated the effect of visual and haptic information on humans’ exploratory interactions with a ‘cup of coffee’, an object with nonlinear internal dynamics. Subjects were instructed to rhythmically transport a virtual cup with a rolling ball inside between two targets at a specified frequency, using a robotic interface. The cup and targets were displayed on a screen, and force feedback from the cup-and-ball dynamics was provided via the robotic manipulandum. Subjects were encouraged to explore and prepare the dynamics by “shaking” the cup-and-ball system to find the best initial conditions prior to the task. Two groups of subjects received full haptic feedback about the cup-and-ball movement during the task; however, for one group the ball movement was visually occluded. Visual information about the ball movement had two distinctive effects on performance: it reduced the preparation time needed to understand the dynamics and, importantly, it led to simpler, more linear input-output interactions between hand and object. The results highlight how visual and haptic information regarding nonlinear internal dynamics plays distinct roles in the interactive perception of complex objects. (A simplified simulation of this kind of cup-and-ball dynamics appears after this list.)
  3. Many daily tasks involve the collaboration of both hands. Humans dexterously adjust hand poses and modulate the forces exerted by fingers in response to task demands. Hand pose selection has been intensively studied in unimanual tasks, but little work has investigated bimanual tasks. This work examines hand pose selection in a bimanual high-precision screwing task taken from watchmaking. Twenty right-handed subjects removed a screw from the watch face with a screwdriver in two conditions. Results showed that although subjects used similar hand poses across steps within the same experimental condition, the hand poses differed significantly between the two conditions. In the free-base condition, subjects needed to stabilize the watch face on the table. The role distribution across hands was strongly influenced by hand dominance: the dominant hand manipulated the tool, whereas the nondominant hand controlled the additional degrees of freedom that might impair performance. In contrast, in the fixed-base condition, the watch face was stationary. Subjects used both hands even though a single hand would have been sufficient. Importantly, hand poses decoupled the control of task-demanded force and torque across hands through virtual fingers that grouped multiple fingers into functional units. This preference for a bimanual over a unimanual control strategy could be an effort to reduce variability caused by mechanical couplings and to alleviate intrinsic sensorimotor processing burdens. To afford analysis of this variety of observations, a novel graphical matrix-based representation of the distribution of hand pose combinations was developed. Atypical hand poses that are not documented in extant hand taxonomies are also included. NEW & NOTEWORTHY We study hand pose selection in bimanual fine motor skills. To understand how roles and control variables are distributed across the hands and fingers, we compared two conditions when unscrewing a screw from a watch face. When the watch face needed positioning, role distribution was strongly influenced by hand dominance; when the watch face was stationary, a variety of hand pose combinations emerged. Control of independent task demands is distributed either across hands or across distinct groups of fingers. (A toy version of the matrix-based representation appears after this list.)
  4. Reaching movements performed from a crouched body posture require a shift of body weight from both arms to one arm. This situation has remained unexamined despite the analogous load requirements during step initiation and the many studies of reaching from a seated or standing posture. To determine whether the body weight shift involves anticipatory or exclusively reactive control, we obtained force plate records, hand kinematics, and arm muscle activity from 11 healthy right-handed participants. They performed reaching movements with their left and right arm in two speed contexts, “comfortable” and “as fast as possible,” and two postural contexts, a less stable knees-together posture and a more stable knees-apart posture. Weight shifts involved anticipatory postural adjustments (APAs) by the reaching and stance arms that were opposing in the vertical axis and aligned in the side-to-side axis, similar to APAs by the legs for step initiation. Weight-shift APAs were correlated in time and magnitude, present in both speed contexts, more vigorous with the knees placed together, and similar when reaching with the dominant and nondominant arm. The initial weight shift was preceded by bursts of muscle activity in the shoulder and elbow extensors (posterior deltoid and triceps lateral) of the reach arm and shoulder flexor (pectoralis major) of the stance arm, which indicates their causal role; leg muscles may have indirectly contributed but were not recorded. The strong functional similarity of weight-shift APAs during crouched reaching to human stepping and cat reaching suggests that they are a core feature of posture-movement coordination. NEW & NOTEWORTHY This work demonstrates that reaching from a crouched posture is preceded by bimanual anticipatory postural adjustments (APAs) that shift the body weight to the stance limb. Weight-shift APAs are more robust in an unstable body posture (knees together) and involve the shoulder and elbow extensors of the reach arm and shoulder flexor of the stance arm. This pattern mirrors the forelimb coordination of cats reaching and humans initiating a step. (A generic onset-detection sketch of the kind used to time APAs appears after this list.)
  5. An important problem in designing human-robot systems is the integration of human intent and performance into the robotic control loop, especially in complex tasks. Bimanual coordination is a complex human behavior that is critical in many fine motor tasks, including robot-assisted surgery. To fully leverage the capabilities of the robot as an intelligent and assistive agent, online recognition of bimanual coordination could be important. Robotic assistance for a suturing task, for example, will be fundamentally different during phases when the suture is wrapped around the instrument (i.e., making a C-loop) than when the ends of the suture are pulled apart. In this study, we develop an online recognition method of bimanual coordination modes (i.e., the directions and symmetries of right- and left-hand movements) using geometric descriptors of hand motion. We (1) develop this framework based on ideal trajectories obtained during virtual 2D bimanual path-following tasks performed by human subjects operating Geomagic Touch haptic devices, (2) test the offline recognition accuracy of bimanual direction and symmetry from human subject movement trials, and (3) evaluate how the framework can be used to characterize 3D trajectories of the da Vinci Surgical System’s surgeon-side manipulators during bimanual surgical training tasks. In the human subject trials, our geometric bimanual movement classification accuracy was 92.3% for movement direction (i.e., hands moving together, parallel, or away) and 86.0% for symmetry (e.g., mirror or point symmetry). We also show that this approach can be used for online classification of different bimanual coordination modes during needle transfer, C-loop making, and suture pulling gestures on the da Vinci system, with results matching the expected modes. Finally, we discuss how these online estimates are sensitive to task environment factors and surgeon expertise, and thus inspire future work that could leverage adaptive control strategies to enhance user skill during robot-assisted surgery. (A toy geometric classifier illustrating direction and symmetry appears after this list.)
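For the partnered beam-walking study (item 1 above), here is a minimal sketch of the two quantities it models: the interaction torque, taken here as the overlapping opposing portion of the two hand-force torque components about the beam axis, and a mass-spring-damper fit relating a torque component to frontal-plane sway. The sign-based overlap rule and the angle-based regressors are simplifying assumptions for illustration; the paper's exact definitions may differ.

```python
import numpy as np

def interaction_torque(tau_a, tau_b):
    """Overlapping, opposing portion of two torque components about the beam
    axis; this part cancels in the net torque and does not move the body."""
    opposing = np.sign(tau_a) != np.sign(tau_b)
    return np.where(opposing, np.minimum(np.abs(tau_a), np.abs(tau_b)), 0.0)

def fit_mass_spring_damper(torque, sway_angle, dt):
    """Least-squares fit of torque ~ inertia*acc + damping*vel + stiffness*angle."""
    vel = np.gradient(sway_angle, dt)
    acc = np.gradient(vel, dt)
    X = np.column_stack([acc, vel, sway_angle])
    coeffs, *_ = np.linalg.lstsq(X, torque, rcond=None)
    return dict(zip(("inertia", "damping", "stiffness"), coeffs))
```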
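For the cup-and-ball study (item 2 above), the sketch below simulates the kind of nonlinear internal dynamics involved, using the standard cart-and-pendulum approximation (cup as a cart, ball as a suspended pendulum) driven by a rhythmic hand force. The parameter values and the 1 Hz "shaking" input are made-up placeholders, not the experiment's settings.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical parameters for a simplified cart-and-pendulum model of the
# cup-and-ball object (ball swinging inside a moving cup).
M_CUP, M_BALL, L, G = 0.5, 0.2, 0.25, 9.81   # kg, kg, m, m/s^2

def cup_ball_dynamics(t, state, force_fn):
    """state = [x, x_dot, theta, theta_dot]; force_fn(t) is the hand force (N)."""
    x, xd, th, thd = state
    A = np.array([[M_CUP + M_BALL, M_BALL * L * np.cos(th)],
                  [np.cos(th),      L]])
    b = np.array([force_fn(t) + M_BALL * L * thd**2 * np.sin(th),
                  -G * np.sin(th)])
    xdd, thdd = np.linalg.solve(A, b)
    return [xd, xdd, thd, thdd]

# Example: rhythmic "shaking" input at 1 Hz during exploration.
force = lambda t: 2.0 * np.sin(2 * np.pi * 1.0 * t)
sol = solve_ivp(cup_ball_dynamics, (0.0, 5.0), [0.0, 0.0, 0.1, 0.0],
                args=(force,), t_eval=np.linspace(0.0, 5.0, 500))
```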
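For the watchmaking study (item 3 above), a toy version of the matrix-based representation it describes: counting how often each combination of dominant-hand and non-dominant-hand pose occurs. The pose labels here are hypothetical stand-ins; the paper uses its own taxonomy.

```python
import numpy as np

# Hypothetical pose labels; the paper's taxonomy is richer.
DOMINANT_POSES = ["precision pinch", "tripod", "lateral pinch"]
NONDOM_POSES = ["power grasp", "support press", "precision pinch"]

def pose_combination_matrix(observations):
    """Count how often each (dominant, non-dominant) hand-pose pair occurs.

    observations: iterable of (dominant_pose, nondominant_pose) tuples,
    one per trial or task step.
    """
    counts = np.zeros((len(DOMINANT_POSES), len(NONDOM_POSES)), dtype=int)
    for dom, nondom in observations:
        counts[DOMINANT_POSES.index(dom), NONDOM_POSES.index(nondom)] += 1
    return counts

# Example: three observed trials.
mat = pose_combination_matrix([
    ("precision pinch", "power grasp"),
    ("precision pinch", "support press"),
    ("tripod", "power grasp"),
])
```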
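For the crouched-reaching study (item 4 above), a generic threshold-based onset detector of the kind commonly used to time anticipatory postural adjustments in force-plate or EMG traces; the baseline window and 2-SD threshold are illustrative defaults, not the authors' criteria.

```python
import numpy as np

def apa_onset(signal, fs, baseline_s=0.5, n_sd=2.0):
    """Estimate APA onset as the first sample where a trace deviates from its
    baseline mean by more than n_sd standard deviations.

    signal: 1-D array; fs: sampling rate (Hz); baseline_s: baseline window (s).
    Returns onset time in seconds, or None if no deviation is found.
    """
    n_base = int(baseline_s * fs)
    mu, sd = signal[:n_base].mean(), signal[:n_base].std()
    above = np.flatnonzero(np.abs(signal[n_base:] - mu) > n_sd * sd)
    return (n_base + above[0]) / fs if above.size else None
```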
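For the surgical-training study (item 5 above), a toy geometric classifier of bimanual direction and symmetry from windows of 2D hand positions. The distance-trend rule, the 5% tolerance, and the sign-based symmetry test are simplifying assumptions; the paper's geometric descriptors are more elaborate.

```python
import numpy as np

def coordination_mode(left_xy, right_xy):
    """Classify bimanual direction and symmetry over a window of 2-D hand
    positions (arrays of shape (n, 2)).

    Direction: 'together', 'away', or 'parallel', based on whether the
    inter-hand distance shrinks, grows, or stays roughly constant.
    Symmetry: 'mirror' if the mean hand velocities are reflected about the
    midline (x-components opposite, y-components aligned), 'point' if both
    components are opposite, else 'asymmetric'.
    """
    v_l = np.diff(left_xy, axis=0).mean(axis=0)
    v_r = np.diff(right_xy, axis=0).mean(axis=0)
    d = np.linalg.norm(right_xy - left_xy, axis=1)
    trend = d[-1] - d[0]
    if abs(trend) < 0.05 * d.mean():          # tolerance is an assumption
        direction = "parallel"
    else:
        direction = "together" if trend < 0 else "away"

    mirror = np.sign(v_l[0]) != np.sign(v_r[0]) and np.sign(v_l[1]) == np.sign(v_r[1])
    point = np.sign(v_l[0]) != np.sign(v_r[0]) and np.sign(v_l[1]) != np.sign(v_r[1])
    symmetry = "mirror" if mirror else "point" if point else "asymmetric"
    return direction, symmetry
```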