Abstract: The extent to which hand dominance may influence how each agent contributes to interpersonal coordination remains unknown. In the present study, right-handed human participants performed object balancing tasks either in dyadic conditions, with each agent using one hand (left or right), or in bimanual conditions, where each agent performed the task individually with both hands. We found that object load was shared between the two hands more asymmetrically in dyadic than in single-agent conditions. However, hand dominance did not influence how the two hands shared the object load. In contrast, hand dominance was a major factor in modulating the vertical movement speed of the hands. Furthermore, the magnitude of the internal force produced by the two hands against each other correlated with the synchrony between the two hands' movements in dyads. This finding supports the important role of internal force in haptic communication. Importantly, both internal force and movement synchrony were affected by the hand dominance of the paired participants. Overall, these results demonstrate, for the first time, that pairing one dominant and one non-dominant hand may promote asymmetrical roles within a dyad during joint physical interactions. This appears to enable the agent using the dominant hand to actively maintain effective haptic communication and task performance.
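To make the reported internal-force/synchrony relationship concrete, the sketch below shows one way such an analysis could be set up. It is not the authors' analysis code: the force convention (internal force as the cancelling portion of the two opposing normal forces), the zero-lag velocity correlation used as a synchrony index, and all function and variable names are assumptions for illustration.

```python
# Hedged sketch of an internal-force vs. movement-synchrony analysis (not the study's code).
import numpy as np
from scipy.stats import pearsonr

def internal_force(f_left, f_right):
    # One common convention: the portion of the two opposing normal forces
    # that cancels out and does not accelerate the object.
    return np.minimum(np.abs(f_left), np.abs(f_right))

def movement_synchrony(v_left, v_right):
    # Zero-lag correlation of the two hands' vertical velocities.
    return np.corrcoef(v_left, v_right)[0, 1]

def force_synchrony_relation(trials):
    # trials: list of dicts with per-trial arrays 'f_left', 'f_right', 'v_left', 'v_right'.
    mean_internal = [internal_force(t["f_left"], t["f_right"]).mean() for t in trials]
    synchrony = [movement_synchrony(t["v_left"], t["v_right"]) for t in trials]
    return pearsonr(mean_internal, synchrony)  # across-trial r and p-value

# Example with synthetic trials (illustrative only):
rng = np.random.default_rng(0)
trials = [{"f_left": rng.uniform(2, 6, 500), "f_right": rng.uniform(2, 6, 500),
           "v_left": rng.normal(size=500), "v_right": rng.normal(size=500)}
          for _ in range(20)]
print(force_synchrony_relation(trials))
```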
Hand pose selection in a bimanual fine-manipulation task
Many daily tasks involve the collaboration of both hands. Humans dexterously adjust hand poses and modulate the forces exerted by the fingers in response to task demands. Hand pose selection has been intensively studied in unimanual tasks, but little work has investigated bimanual tasks. This work examines hand pose selection in a bimanual high-precision screwing task taken from watchmaking. Twenty right-handed subjects dismounted a screw from the watch face with a screwdriver under two conditions. Results showed that although subjects used similar hand poses across steps within the same experimental condition, the hand poses differed significantly between the two conditions. In the free-base condition, subjects needed to stabilize the watch face on the table. The role distribution across hands was strongly influenced by hand dominance: the dominant hand manipulated the tool, whereas the nondominant hand controlled the additional degrees of freedom that might impair performance. In contrast, in the fixed-base condition, the watch face was stationary. Subjects used both hands even though a single hand would have been sufficient. Importantly, hand poses decoupled the control of task-demanded force and torque across hands through virtual fingers that grouped multiple fingers into functional units. This preference for a bimanual over a unimanual control strategy could be an effort to reduce variability caused by mechanical couplings and to alleviate intrinsic sensorimotor processing burdens. To afford analysis of this variety of observations, a novel graphical matrix-based representation of the distribution of hand pose combinations was developed. Atypical hand poses that are not documented in extant hand taxonomies are also included. NEW & NOTEWORTHY We study hand pose selection in bimanual fine motor skills. To understand how roles and control variables are distributed across the hands and fingers, we compared two conditions when unscrewing a screw from a watch face. When the watch face needed positioning, role distribution was strongly influenced by hand dominance; when the watch face was stationary, a variety of hand pose combinations emerged. Control of independent task demands is distributed either across hands or across distinct groups of fingers.
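As a toy illustration of the matrix-based representation of hand pose combinations mentioned above, the sketch below tallies how often each left-hand pose co-occurs with each right-hand pose. It is not the paper's representation or code; the pose labels and data are hypothetical placeholders.

```python
# Hedged sketch: a minimal count matrix of left/right hand pose combinations.
from collections import Counter
import numpy as np

def pose_combination_matrix(observations, left_poses, right_poses):
    # observations: iterable of (left_pose, right_pose) labels per trial/step.
    # Returns a len(left_poses) x len(right_poses) count matrix.
    counts = Counter(observations)
    mat = np.zeros((len(left_poses), len(right_poses)), dtype=int)
    for (lp, rp), n in counts.items():
        mat[left_poses.index(lp), right_poses.index(rp)] = n
    return mat

# Example with made-up pose labels:
left = ["power grasp", "lateral pinch", "support"]
right = ["precision pinch", "tripod", "power grasp"]
obs = [("support", "precision pinch"), ("support", "tripod"), ("power grasp", "tripod")]
print(pose_combination_matrix(obs, left, right))
```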
- Award ID(s): 1825942
- PAR ID: 10293815
- Date Published:
- Journal Name: Journal of Neurophysiology
- Volume: 126
- Issue: 1
- ISSN: 0022-3077
- Page Range / eLocation ID: 195 to 212
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- An important problem in designing human-robot systems is the integration of human intent and performance into the robotic control loop, especially in complex tasks. Bimanual coordination is a complex human behavior that is critical in many fine motor tasks, including robot-assisted surgery. To fully leverage the capabilities of the robot as an intelligent and assistive agent, online recognition of bimanual coordination could be important. Robotic assistance for a suturing task, for example, will be fundamentally different during phases when the suture is wrapped around the instrument (i.e., making a C-loop) than when the ends of the suture are pulled apart. In this study, we develop an online recognition method for bimanual coordination modes (i.e., the directions and symmetries of right and left hand movements) using geometric descriptors of hand motion. We (1) develop this framework based on ideal trajectories obtained during virtual 2D bimanual path following tasks performed by human subjects operating Geomagic Touch haptic devices, (2) test the offline recognition accuracy of bimanual direction and symmetry from human subject movement trials, and (3) evaluate how the framework can be used to characterize 3D trajectories of the da Vinci Surgical System's surgeon-side manipulators during bimanual surgical training tasks. In the human subject trials, our geometric bimanual movement classification accuracy was 92.3% for movement direction (i.e., hands moving together, parallel, or away) and 86.0% for symmetry (e.g., mirror or point symmetry). We also show that this approach can be used for online classification of different bimanual coordination modes during needle transfer, making a C-loop, and suture pulling gestures on the da Vinci system, with results matching the expected modes. Finally, we discuss how these online estimates are sensitive to task environment factors and surgeon expertise, and thus inspire future work that could leverage adaptive control strategies to enhance user skill during robot-assisted surgery. (A toy direction-classification sketch appears after this list.)
- People use their hands for intricate tasks like playing musical instruments, employing myriad touch sensations to inform motor control. In contrast, current prosthetic hands lack comprehensive haptic feedback and exhibit rudimentary multitasking functionality. Limited research has explored the potential of upper limb amputees to feel, perceive, and respond to multiple channels of simultaneously activated haptic feedback to concurrently control the individual fingers of dexterous prosthetic hands. This study introduces a novel control architecture for three amputees and nine additional subjects to concurrently control individual fingers of an artificial hand using two channels of context-specific haptic feedback. Artificial neural networks (ANNs) recognize subjects' electromyogram (EMG) patterns governing the artificial hand controller. ANNs also classify the directions objects slip across tactile sensors on the robotic fingertips, which are encoded via the vibration frequency of wearable vibrotactile actuators. Subjects implement control strategies with each finger simultaneously to prevent or permit slip as desired, achieving a 94.49% ± 8.79% overall success rate. Although no statistically significant difference exists between amputees' and non-amputees' success rates, amputees require more time to respond to simultaneous haptic feedback signals, suggesting a higher cognitive load. Nevertheless, amputees can accurately interpret multiple channels of nuanced haptic feedback to concurrently control individual robotic fingers, addressing the challenge of multitasking with dexterous prosthetic hands. (A hedged slip-classification sketch appears after this list.)
- Neuromuscular injuries can impair hand function and profoundly impact quality of life. This has motivated the development of advanced assistive robotic hands. However, current neural decoder systems are limited in their ability to provide dexterous control of these robotic hands. In this study, we propose a novel method for predicting the extension and flexion forces of three individual fingers concurrently using high-density electromyogram (HD-EMG) signals. Our method employs two deep forest models, the flexor decoder and the extensor decoder, to extract relevant representations from the EMG amplitude features. The outputs of the two decoders are integrated through linear regression to predict the forces of the three fingers. The proposed method was evaluated on data from three subjects, and the results showed that it consistently outperforms the conventional EMG amplitude-based approach in terms of prediction error and robustness across both target and non-target fingers. This work presents a promising neural decoding approach for intuitive and dexterous control of the fingertip forces of assistive robotic hands. (A minimal two-decoder sketch appears after this list.)
- Many manipulation tasks, such as placement or within-hand manipulation, require the object's pose relative to a robot hand. The task is difficult when the hand significantly occludes the object. It is especially hard for adaptive hands, for which it is not easy to detect the fingers' configuration. In addition, RGB-only approaches face issues with texture-less objects or when the hand and the object look similar. This paper presents a depth-based framework, which aims for robust pose estimation and short response times. The approach detects the adaptive hand's state via efficient parallel search given the highest overlap between the hand's model and the point cloud. The hand's point cloud is pruned and robust global registration is performed to generate object pose hypotheses, which are clustered. False hypotheses are pruned via physical reasoning. The remaining poses' quality is evaluated given agreement with observed data. Extensive evaluation on synthetic and real data demonstrates the accuracy and computational efficiency of the framework when applied to challenging, highly occluded scenarios for different object types. An ablation study identifies how the framework's components help in performance. This work also provides a dataset for in-hand 6D object pose estimation. Code and dataset are available at: https://github.com/wenbowen123/icra20-hand-object-pose (A hypothesis-scoring sketch appears after this list.)
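For the bimanual coordination-mode recognition work listed first above, the following toy classifier shows the flavor of labeling movement direction (together, parallel, away) from two hands' positions and velocities. It is a simplified stand-in, not the study's geometric descriptors; the threshold, inputs, and names are assumptions.

```python
# Hedged toy classifier of coarse bimanual direction mode (not the paper's method).
import numpy as np

def direction_mode(p_left, p_right, v_left, v_right, tol=0.2):
    # p_*: 3D hand positions, v_*: 3D hand velocities (numpy arrays).
    axis = p_right - p_left
    axis = axis / (np.linalg.norm(axis) + 1e-9)      # unit vector from left to right hand
    closing_rate = np.dot(v_left - v_right, axis)    # > 0 means the hands are approaching
    if closing_rate > tol:
        return "together"
    if closing_rate < -tol:
        return "away"
    return "parallel"

# Example with made-up samples (hands moving toward each other):
print(direction_mode(np.array([0., 0, 0]), np.array([1., 0, 0]),
                     np.array([0.5, 0, 0]), np.array([-0.5, 0, 0])))  # -> 'together'
```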
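For the haptic-feedback prosthetic study above, this sketch shows one way an ANN could classify fingertip slip direction and map it to a vibrotactile frequency. It is illustrative only, not the study's controller; the feature layout, class labels, frequency mapping, and training data are assumptions.

```python
# Hedged sketch: ANN slip-direction classification driving a vibrotactile encoding.
import numpy as np
from sklearn.neural_network import MLPClassifier

SLIP_LABELS = ["no_slip", "distal", "proximal"]          # hypothetical classes
FREQ_HZ = {"no_slip": 0, "distal": 80, "proximal": 160}  # hypothetical frequency encoding

# X: windows of tactile-sensor features (n_windows x n_features), y: slip labels.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(300, 24)), rng.choice(SLIP_LABELS, size=300)

clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500).fit(X, y)

def actuator_frequency(tactile_window):
    # Map a new tactile window to a vibration frequency for the wearable actuator.
    return FREQ_HZ[clf.predict(tactile_window.reshape(1, -1))[0]]

print(actuator_frequency(rng.normal(size=24)))
```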
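For the HD-EMG finger-force decoding study above, the sketch below mirrors the described two-decoder architecture (flexor and extensor decoders combined by linear regression). Random forests stand in for the paper's deep forest models, and all data shapes and names are assumptions.

```python
# Hedged sketch of a two-decoder (flexor/extensor) force predictor with a linear combiner.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 500
emg_flex = rng.normal(size=(n, 64))   # HD-EMG amplitude features, flexor grid (synthetic)
emg_ext = rng.normal(size=(n, 64))    # HD-EMG amplitude features, extensor grid (synthetic)
forces = rng.normal(size=(n, 3))      # target forces for three fingers (synthetic)

flexor_decoder = RandomForestRegressor(n_estimators=50).fit(emg_flex, forces)
extensor_decoder = RandomForestRegressor(n_estimators=50).fit(emg_ext, forces)

# Integrate the two decoders' outputs with a linear regression stage.
stacked = np.hstack([flexor_decoder.predict(emg_flex), extensor_decoder.predict(emg_ext)])
combiner = LinearRegression().fit(stacked, forces)

def predict_finger_forces(x_flex, x_ext):
    z = np.hstack([flexor_decoder.predict(x_flex), extensor_decoder.predict(x_ext)])
    return combiner.predict(z)

print(predict_finger_forces(emg_flex[:1], emg_ext[:1]))
```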
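For the in-hand 6D pose estimation framework above, this last sketch illustrates only the final hypothesis-scoring step (agreement of a candidate pose with observed data). It is not the released code at the linked repository; the point clouds, poses, and inlier threshold are assumptions.

```python
# Hedged sketch: score candidate object poses by agreement with an observed point cloud.
import numpy as np
from scipy.spatial import cKDTree

def score_pose_hypothesis(model_points, observed_cloud, pose, inlier_dist=0.005):
    # Transform the object model by a candidate 4x4 pose and score agreement as the
    # fraction of model points with an observed point within inlier_dist (meters).
    R, t = pose[:3, :3], pose[:3, 3]
    transformed = model_points @ R.T + t
    dists, _ = cKDTree(observed_cloud).query(transformed)
    return np.mean(dists < inlier_dist)

def select_best_pose(model_points, observed_cloud, pose_hypotheses):
    scores = [score_pose_hypothesis(model_points, observed_cloud, p) for p in pose_hypotheses]
    return pose_hypotheses[int(np.argmax(scores))], max(scores)

# Toy example: the identity pose should score highest on a self-consistent cloud.
rng = np.random.default_rng(0)
model = rng.normal(size=(200, 3)) * 0.05
shifted = np.eye(4); shifted[:3, 3] = [0.02, 0.0, 0.0]
print(select_best_pose(model, model, [np.eye(4), shifted])[1])
```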