

Title: Stability and Predictability in Dynamically Complex Physical Interactions
This study examines human control of physical interaction with objects that exhibit complex (nonlinear, chaotic, underactuated) dynamics. We hypothesized that humans exploit stability properties of the human-object interaction. Using a simplified 2D model for carrying a “cup of coffee”, we developed a virtual implementation to identify human control strategies. Transporting a cup of coffee was modeled as a cart with a suspended pendulum, where humans moved the cart on a horizontal line via a robotic manipulandum. The specific task was to transport the cart-pendulum system to a target, as fast as possible, while accommodating assistive and resistive perturbations. To assess trajectory stability, we applied contraction analysis. We showed that when the perturbation was assistive, humans absorbed the perturbation by controlling cart trajectories into a contraction region prior to the perturbation. When the perturbation was resistive, subjects passed through a contraction region following the perturbation. Entering a contraction region stabilizes performance and makes the dynamics more predictable. This human control strategy could inspire more robust control strategies for physical interaction in robots.
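The contraction analysis referred to in the abstract can be illustrated with a minimal numerical sketch. The code below is not the authors' implementation: it assumes a damped pendulum driven by a prescribed cart acceleration, a constant metric M obtained from a Lyapunov equation at the hanging equilibrium, and made-up parameters. Under such a metric, a state x lies in a contraction region where M J(x) + J(x)^T M is negative definite, with J the Jacobian of the dynamics.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Minimal sketch of contraction analysis for a pendulum suspended from a
# moving cart. Parameters, metric, and cart motion are illustrative
# assumptions, not the values used in the study.
g, l, b = 9.81, 0.3, 0.8            # gravity, pendulum length, damping (assumed)

def cart_accel(t):
    return 2.0 * np.sin(2.0 * np.pi * t)   # hypothetical cart acceleration profile

def f(x, t):
    th, om = x
    return np.array([om, -(g / l) * np.sin(th) - (cart_accel(t) / l) * np.cos(th) - b * om])

def jacobian(x, t):
    th, _ = x
    return np.array([
        [0.0, 1.0],
        [-(g / l) * np.cos(th) + (cart_accel(t) / l) * np.sin(th), -b],
    ])

# Constant metric M from the linearization at the hanging equilibrium:
# A0^T M + M A0 = -I, so M J + J^T M < 0 holds near that equilibrium.
A0 = jacobian(np.zeros(2), 0.0)          # cart_accel(0) = 0 here
M = solve_continuous_lyapunov(A0.T, -np.eye(2))

# Simulate one trajectory and flag the samples inside the contraction region.
dt, T = 0.002, 3.0
x = np.array([0.5, 0.0])                 # initial swing angle (rad)
in_contraction = []
for k in range(int(T / dt)):
    t = k * dt
    J = jacobian(x, t)
    lam_max = np.max(np.linalg.eigvalsh(M @ J + J.T @ M))
    in_contraction.append(lam_max < 0.0)
    x = x + dt * f(x, t)                 # explicit Euler, adequate for a sketch

print(f"fraction of trajectory inside the contraction region: "
      f"{np.mean(in_contraction):.2f}")
```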
Award ID(s):
1637824
NSF-PAR ID:
10082086
Author(s) / Creator(s):
Date Published:
Journal Name:
2018 IEEE International Conference on Robotics and Automation (ICRA)
Page Range / eLocation ID:
1 to 5
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Tactile sensing has been increasingly utilized in robot control of unknown objects to infer physical properties and optimize manipulation. However, there is limited understanding of the contribution of different sensory modalities during interactive perception in complex interactions, both in robots and in humans. This study investigated the effect of visual and haptic information on humans’ exploratory interactions with a ‘cup of coffee’, an object with nonlinear internal dynamics. Subjects were instructed to rhythmically transport a virtual cup with a rolling ball inside between two targets at a specified frequency, using a robotic interface. The cup and targets were displayed on a screen, and force feedback from the cup-and-ball dynamics was provided via the robotic manipulandum. Subjects were encouraged to explore and prepare the dynamics by “shaking” the cup-and-ball system to find the best initial conditions prior to the task. Two groups of subjects received full haptic feedback about the cup-and-ball movement during the task; however, for one group the ball movement was visually occluded. Visual information about the ball movement had two distinctive effects on performance: it reduced the preparation time needed to understand the dynamics and, importantly, it led to simpler, more linear input-output interactions between hand and object. The results highlight how visual and haptic information about nonlinear internal dynamics have distinct roles in the interactive perception of complex objects.
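The "more linear input-output interactions" mentioned in the abstract above can be illustrated, though not with the authors' actual analysis (which the abstract does not specify), by magnitude-squared coherence between hand and object motion: coherence near 1 across the band of interest indicates an approximately linear relation. The signals and parameters below are synthetic assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, coherence

# Hypothetical illustration: coherence as a rough index of how linear the
# hand-to-ball relation is. All signals here are synthetic.
fs = 500.0                                   # sampling rate in Hz (assumed)
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)

# Broadband "hand" signal, low-pass filtered to a movement-like bandwidth.
b, a = butter(4, 3.0, btype="low", fs=fs)
hand = filtfilt(b, a, rng.standard_normal(t.size))
hand /= hand.std()

ball_linear = 0.8 * np.roll(hand, 25) + 0.05 * rng.standard_normal(t.size)  # ~linear response
ball_nonlinear = hand ** 3 + 0.05 * rng.standard_normal(t.size)             # distorted response

for label, ball in [("near-linear", ball_linear), ("nonlinear", ball_nonlinear)]:
    f, cxy = coherence(hand, ball, fs=fs, nperseg=4096)
    band = (f > 0.2) & (f < 3.0)             # movement-frequency band (assumed)
    print(f"{label}: mean coherence in 0.2-3 Hz band = {cxy[band].mean():.2f}")
```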
  2. Background

    Maintaining upright posture is an unstable task that requires sophisticated neuro-muscular control. Humans use foot–ground interaction forces, characterized by point of application, magnitude, and direction, to manage body accelerations. When analyzing the directions of the ground reaction forces of standing humans in the frequency domain, previous work found a consistent pattern in different frequency bands. To test whether this frequency-dependent behavior provided a distinctive signature of neural control or was a necessary consequence of biomechanics, this study simulated quiet standing and compared the results with human subject data.

    Methods

    Aiming to develop the simplest competent and neuromechanically justifiable dynamic model that could account for the pattern observed across multiple subjects, we first explored the minimum number of degrees of freedom required for the model. Then, we applied a well-established optimal control method that was parameterized to maximize physiologically relevant insight to stabilize the balancing model.

    Results

    If a standing human was modeled as a single inverted pendulum, no controller could reproduce the experimentally observed pattern. The simplest competent model that approximated a standing human was a double inverted pendulum with torque-actuated ankle and hip joints. A range of controller parameters could stabilize this model and reproduce the general trend observed in experimental data; this result seems to indicate a biomechanical constraint and not a consequence of control. However, details of the frequency-dependent pattern varied substantially across tested control parameter values. The set of parameters that best reproduced the human experimental results suggests that the control strategy employed by human subjects to maintain quiet standing was best described by minimal control effort with an emphasis on ankle torque.

    Conclusions

    The findings suggest that the frequency-dependent pattern of ground reaction forces observed in quiet standing conveys quantitative information about human control strategies. This study’s method might be extended to investigate human neural control strategies in different contexts of balance, such as with an assistive device or in neurologically impaired subjects.

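The balancing model described in the abstract above, a double inverted pendulum with torque-actuated ankle and hip joints, can be stabilized with a standard linear-quadratic regulator. The sketch below is only an illustration under assumed anthropometric parameters and cost weights, not the controller or values reported in the study; the relative weights in R are one place where an "emphasis on ankle torque" over hip torque could be expressed.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Minimal sketch: LQR stabilization of a double inverted pendulum (segment 1 =
# legs about the ankle, segment 2 = trunk about the hip), linearized about
# upright using absolute segment angles. Parameters and weights are assumed.
m1, m2 = 25.0, 45.0        # segment masses (kg)
l1 = 0.9                   # leg segment length (m)
c1, c2 = 0.5, 0.3          # distances from joints to segment centers of mass (m)
I1, I2 = 2.0, 2.5          # segment moments of inertia about their COMs (kg m^2)
g = 9.81

# Linearized dynamics  M qdd = G q + B_tau u,  q = [phi1, phi2] (absolute angles),
# u = [ankle torque, hip torque].
M = np.array([[I1 + m1 * c1**2 + m2 * l1**2, m2 * l1 * c2],
              [m2 * l1 * c2,                 I2 + m2 * c2**2]])
G = np.diag([(m1 * c1 + m2 * l1) * g, m2 * c2 * g])   # destabilizing gravity stiffness
B_tau = np.array([[1.0, -1.0],                        # ankle torque acts on segment 1,
                  [0.0,  1.0]])                       # hip torque acts between segments

Minv = np.linalg.inv(M)
A = np.block([[np.zeros((2, 2)), np.eye(2)],
              [Minv @ G,         np.zeros((2, 2))]])
B = np.vstack([np.zeros((2, 2)), Minv @ B_tau])

# Cheap ankle torque, expensive hip torque: one way to encode "minimal control
# effort with an emphasis on ankle torque" (an assumption for illustration).
Q = np.diag([100.0, 100.0, 10.0, 10.0])
R = np.diag([0.1, 1.0])

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)       # state-feedback gain, u = -K x
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```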
  3. We propose a novel criterion for evaluating user input to human-robot interfaces for known tasks. We use the mode insertion gradient (MIG), a tool from hybrid control theory, as a filtering criterion that instantaneously assesses the impact of user actions on a dynamic system over a time window into the future. As a result, the filter is permissive to many chosen strategies, minimally engaging, and skill-sensitive, qualities desired when evaluating human actions. Through a human study with 28 healthy volunteers, we show that the criterion exhibits a low but significant negative correlation between skill level, as estimated from task-specific measures in unassisted trials, and the rate of controller intervention during assistance. Moreover, a MIG-based filter can be utilized to create a shared control scheme for training or assistance. In the human study, we observe a substantial training effect when using a MIG-based filter to perform cart-pendulum inversion, particularly when comparing improvement via the RMS error measure. Using simulation of a controlled spring-loaded inverted pendulum (SLIP) as a test case, we observe that the MIG criterion could be used for assistance to guarantee either task completion or safety of a joint human-robot system, while maintaining the system’s flexibility with respect to user-chosen strategies.
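A rough sketch of how a mode-insertion-gradient filter can gate user commands is given below. It uses a double integrator and an assumed accept/reject rule rather than the paper's cart-pendulum or SLIP systems and its specific threshold, so the dynamics, cost function, and all parameters are hypothetical.

```python
import numpy as np

# Sketch of a mode-insertion-gradient (MIG) filter on a double integrator
# (x = [position, velocity], u = acceleration). The dynamics, cost, and
# accept/reject rule are assumptions for illustration, not the paper's setup.
dt, T = 0.01, 1.0
N = int(T / dt)
Q = np.diag([1.0, 0.1])                  # running state-cost weights (assumed)
x_goal = np.array([1.0, 0.0])            # reach position 1 m with zero velocity
A = np.array([[0.0, 1.0], [0.0, 0.0]])   # df/dx for the double integrator

def f(x, u):
    return np.array([x[1], u])

def simulate(x0, u_nom):
    xs = [x0]
    for k in range(N):
        xs.append(xs[-1] + dt * f(xs[-1], u_nom[k]))   # explicit Euler
    return np.array(xs)

def adjoint(xs):
    # Backward integration of  rho_dot = -dl/dx - A^T rho,  rho(T) = 0,
    # for the running cost l(x) = 0.5 (x - x_goal)^T Q (x - x_goal).
    rho = np.zeros(2)
    rhos = [rho]
    for k in range(N, 0, -1):
        dldx = Q @ (xs[k] - x_goal)
        rho = rho + dt * (dldx + A.T @ rho)
        rhos.append(rho)
    return np.array(rhos[::-1])

def mode_insertion_gradient(x, rho, u_nominal, u_user):
    # dJ/dlambda+ for briefly swapping the nominal control for the user's input.
    return rho @ (f(x, u_user) - f(x, u_nominal))

x0 = np.zeros(2)
u_nom = np.full(N, 0.5)                  # nominal assistance policy toward the goal
xs = simulate(x0, u_nom)
rhos = adjoint(xs)

# Assumed filter rule: pass the user's command only if inserting it would not
# increase the predicted cost (MIG <= 0); otherwise fall back to the nominal.
u_user = -1.0                            # a command that pushes away from the goal
mig = mode_insertion_gradient(xs[0], rhos[0], u_nom[0], u_user)
u_applied = u_user if mig <= 0.0 else u_nom[0]
print(f"MIG = {mig:.3f}, applying u = {u_applied}")
```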
  4. Effective physical human-robot interaction (pHRI) depends on how humans can communicate their intentions for movement with others. While it is speculated that small interaction forces carry significant information to convey the specific movement intention in physical human-human interaction (pHHI), the underlying mechanism by which humans infer intention from such small forces is largely unknown. The hypothesis in this work is that the sensitivity to a small interaction force applied at the hand is affected by the movement of the arm, which in turn is affected by arm stiffness. For this, a haptic robot was used to apply endpoint interaction forces to the arm of seated human participants. They were asked to determine one of four directions of the applied robot interaction force without visual feedback. Different levels of interaction force as well as arm muscle contraction were applied. The results imply that humans’ ability to identify and respond to the correct direction of small interaction forces was lower when the alignment of human arm movement with respect to the force direction was higher. In addition, sensitivity to the direction of the small interaction force was high when arm stiffness was low. It is also speculated that humans lower their arm stiffness to be more sensitive to smaller interaction forces. These results will help develop human-like pHRI systems for various applications.
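The link between low arm stiffness and higher force sensitivity reported above can be illustrated with a simple quasi-static endpoint model, dx = K^-1 * F: the same small force produces a larger, and therefore more perceptible, hand displacement when the endpoint stiffness K is low. The stiffness matrices and force magnitude below are made-up values for illustration only.

```python
import numpy as np

# Quasi-static illustration: hand displacement produced by a small interaction
# force, dx = K^-1 F, for a compliant vs. a stiff arm. Stiffness values and
# force magnitude are hypothetical, not measurements from the study.
F = np.array([1.0, 0.0])                      # 1 N force along +x at the hand

K_low = np.array([[80.0, 20.0],               # relaxed endpoint stiffness (N/m)
                  [20.0, 120.0]])
K_high = 5.0 * K_low                          # co-contracted, ~5x stiffer arm

for label, K in [("low stiffness", K_low), ("high stiffness", K_high)]:
    dx = np.linalg.solve(K, F)                # resulting hand displacement (m)
    angle = np.degrees(np.arctan2(dx[1], dx[0]))
    print(f"{label}: |dx| = {1e3 * np.linalg.norm(dx):.1f} mm, "
          f"direction = {angle:.1f} deg")
```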