


Search for: All records

Creators/Authors contains: "Bircher, Walter G."


  1. We consider the problem of in-hand dexterous manipulation, focusing on unknown or uncertain hand–object parameters such as hand configuration, object pose within the hand, and contact positions. In particular, we formulate a generic framework for hand–object configuration estimation, using underactuated hands as an example. Owing to their passive reconfigurability and the lack of encoders in their joints, underactuated manipulation is challenging to estimate, plan, and actively control. By modeling the grasp constraints, we present a particle filter-based framework to estimate the hand configuration. Specifically, given an arbitrary grasp, we start by sampling a set of hand configuration hypotheses and then randomly manipulate the object within the hand. While observing the object's movements as evidence with an external camera, which need not be calibrated with the hand frame, our estimator computes the likelihood of each hypothesis to iteratively estimate the hand configuration. Once converged, the estimator is used to track the hand configuration in real time for subsequent manipulations. We then develop an algorithm to precisely plan and control the underactuated manipulation so as to move the grasped object to desired poses. In contrast to most other dexterous manipulation approaches, our framework requires neither tactile sensing nor joint encoders and can operate directly on novel objects, without requiring an object model a priori. We implemented our framework on both the Yale Model O hand and the Yale T42 hand. The results show that the estimation is accurate for different objects and that the framework is easily adapted across different underactuated hand models. Finally, we evaluated our planning and control algorithm with handwriting tasks, demonstrating the effectiveness of the proposed framework.
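The estimation loop described in this abstract (sample hypotheses, apply random manipulation actions, weight each hypothesis by how well it explains the camera-observed object motion, then resample) follows the standard particle filter pattern. A minimal sketch of that loop, with a made-up scalar forward model standing in for the paper's grasp-constraint model (all names and numbers here are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def predict_motion(config, action):
    # Hypothetical forward model: predicted object displacement for a
    # hand-configuration hypothesis under a given manipulation action.
    return config * action

def particle_filter_step(particles, weights, action, observed_motion, noise=0.05):
    """One update: weight each hypothesis by the likelihood of the
    observed object movement, then resample and jitter."""
    predicted = np.array([predict_motion(p, action) for p in particles])
    # Gaussian likelihood of the camera observation under each hypothesis.
    likelihood = np.exp(-0.5 * ((predicted - observed_motion) / noise) ** 2)
    weights = weights * likelihood
    weights /= weights.sum()
    # Resampling concentrates the particle set on likely configurations.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    particles = particles[idx] + rng.normal(0.0, 0.01, len(particles))
    return particles, np.full(len(particles), 1.0 / len(particles))

# Toy run: recover a true hand parameter of 0.7 from noisy motion evidence.
true_param = 0.7
particles = rng.uniform(0.0, 2.0, 500)
weights = np.full(500, 1.0 / 500)
for _ in range(30):
    action = rng.uniform(0.5, 1.5)           # random exploratory action
    obs = true_param * action + rng.normal(0.0, 0.02)  # camera evidence
    particles, weights = particle_filter_step(particles, weights, action, obs)
estimate = particles.mean()
```

Once the particle set has converged, the same update can be run per action to track the configuration online, as the abstract describes.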
  2. Humans use all surfaces of the hand for contact-rich manipulation. Robot hands, in contrast, typically use only the fingertips, which can limit dexterity. In this work, we leveraged a potential energy–based whole-hand manipulation model, which does not depend on contact wrench modeling like traditional approaches, to design a robotic manipulator. Inspired by robotic caging grasps and the high levels of dexterity observed in human manipulation, a metric was developed and used in conjunction with the manipulation model to design a two-fingered dexterous hand, the Model W. This was accomplished by simulating all planar finger topologies composed of open kinematic chains of up to three serial revolute and prismatic joints, forming symmetric two-fingered hands, and evaluating their performance according to the metric. We present the best design, an unconventional robot hand capable of performing continuous object reorientation, as well as repeatedly alternating between power and pinch grasps—two contact-rich skills that have often eluded robotic hands—and we experimentally characterize the hand’s manipulation capability. This hand realizes manipulation motions reminiscent of thumb–index finger manipulative movement in humans, and its topology provides the foundation for a general-purpose dexterous robot hand.
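The design-space sweep described above is combinatorial: every planar open kinematic chain of up to three serial revolute (R) or prismatic (P) joints, mirrored into a symmetric two-fingered hand. A quick sketch of just the enumeration step (the potential-energy simulation and dexterity metric are the substantive part and are not reproduced here):

```python
from itertools import product

# All open-chain finger topologies with 1-3 serial joints, each joint
# either revolute (R) or prismatic (P).
topologies = ["".join(chain)
              for n in (1, 2, 3)
              for chain in product("RP", repeat=n)]
# 2 + 4 + 8 = 14 candidate fingers; a symmetric two-fingered hand pairs
# each finger with its mirror image, giving 14 hand topologies to score.
```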

  3. Modeling a series of hand–object parameters is crucial for precise and controllable robotic in-hand manipulation, because it yields the mapping from the hand's actuation input to the object's motion. Rather than assuming that most of these model parameters are known a priori or can be easily estimated by sensors, we focus on equipping robots with the ability to actively self-identify the necessary model parameters using minimal sensing. Here, we derive algorithms based on the concept of virtual linkage-based representations (VLRs) to self-identify the underlying mechanics of hand–object systems via exploratory manipulation actions and probabilistic reasoning and, in turn, show that the self-identified VLR can enable the control of precise in-hand manipulation. To validate our framework, we instantiated the proposed system on a Yale Model O hand without joint encoders or tactile sensors. The passive adaptability of the underactuated hand greatly facilitates the self-identification process, because it naturally secures stable hand–object interactions during random exploration. Relying solely on an in-hand camera, our system can effectively self-identify the VLRs, even when some fingers are replaced with novel designs. In addition, we show in-hand manipulation applications of handwriting, marble-maze playing, and cup stacking to demonstrate the effectiveness of the VLR in precise in-hand manipulation control.
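As a toy illustration of self-identifying a kinematic parameter purely from exploratory actions and camera observations, the sketch below recovers a single virtual link length by least squares. The paper's actual VLR identification is a probabilistic procedure over a full hand–object linkage model; every name and number here is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical toy: an observed object point is assumed to lie at unknown
# distance L along a virtual link commanded to angle theta.
true_L = 0.12
thetas = rng.uniform(0.0, np.pi, 40)                    # exploratory actions
obs = np.stack([true_L * np.cos(thetas),
                true_L * np.sin(thetas)], axis=1)
obs += rng.normal(0.0, 0.002, obs.shape)                # camera noise

# Closed-form least-squares estimate of L: project each observation onto
# the known unit direction of its commanded angle and average.
dirs = np.stack([np.cos(thetas), np.sin(thetas)], axis=1)
L_hat = (dirs * obs).sum() / (dirs * dirs).sum()
```

In the paper, the analogous step fits the parameters of the virtual linkage to the object motions seen by the in-hand camera during random exploration, with probabilistic reasoning handling observation noise.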
