Search results (all records) for Creators/Authors contains: "Pollard, N."


  1. We present a novel, low-cost framework for reconstructing surface contact movements during in-hand manipulations. Unlike many existing methods, which focus on hand pose tracking, ours models the behavior of contact patches and is thereby the first to obtain detailed contact-tracking estimates for multi-contact manipulations. Our framework is highly accessible, requiring only inexpensive, readily available paint materials, a single RGBD camera, and a simple, deterministic interpolation algorithm. Despite its simplicity, we demonstrate the framework's effectiveness over the course of several manipulations of three common household items. Finally, we demonstrate the use of a generated contact time series in manipulation learning for a simulated robot hand.
    Free, publicly-accessible full text available October 1, 2022
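The deterministic interpolation step described in the first abstract can be sketched as follows. This is an illustrative simplification, not the paper's algorithm: it assumes contact-patch centroids observed at sparse RGBD frames and fills in a dense contact time series by per-axis linear interpolation (all names and the keyframe data are hypothetical).

```python
import numpy as np

def interpolate_contacts(times, centroids, query_times):
    """Linearly interpolate 3-D contact-patch centroids observed at sparse
    timestamps onto a dense time grid, one coordinate axis at a time."""
    centroids = np.asarray(centroids, dtype=float)
    return np.stack(
        [np.interp(query_times, times, centroids[:, k])
         for k in range(centroids.shape[1])],
        axis=1,
    )

# Hypothetical example: one contact patch observed at two keyframes 1 s apart,
# reconstructed at five evenly spaced query times.
times = [0.0, 1.0]
centroids = [[0.0, 0.0, 0.0], [1.0, 2.0, 0.0]]
dense = interpolate_contacts(times, centroids, np.linspace(0.0, 1.0, 5))
```

A real pipeline would track many patches at once and handle contacts appearing and disappearing; the per-axis `np.interp` call is just the simplest deterministic interpolant.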
  2. There has been an explosion of ideas in soft robotics over the past decade, resulting in unprecedented opportunities for end effector design. Soft robot hands offer benefits of low cost, compliance, and customized design, with the promise of dexterity and robustness. The space of opportunities is vast and exciting. However, new tools are needed to understand the capabilities of such manipulators and to facilitate manipulation planning with soft manipulators that exhibit free-form deformations. To address this challenge, we introduce a sampling-based approach to discover and model continuous families of manipulations for soft robot hands. We give an overview of the soft foam robots in production in our lab and describe novel algorithms developed to characterize manipulation families for such robots. Our approach consists of sampling a space of manipulation actions, constructing Gaussian Mixture Model representations covering successful regions, and refining the results to create continuous successful regions representing the manipulation family. The space of manipulation actions is very high dimensional; we consider models with and without dimensionality reduction and provide a rigorous approach to compare models across different dimensions by comparing coverage of an unbiased test dataset in the full-dimensional parameter space. Results show that some dimensionality reduction is typically useful in populating the models, but without our technique, the amount of dimensionality reduction to use is difficult to predict ahead of time and can depend on the hand and task. The models we produce can be used to plan and carry out successful, robust manipulation actions and to compare competing robot hand designs.
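The sample-then-model idea in the second abstract can be sketched in miniature. This is a toy stand-in, not the paper's method: the 2-D "action space", the success predicate, and the plain EM fit below are all hypothetical, whereas the real system samples high-dimensional manipulation actions and refines the mixture into continuous successful regions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-D action space; a hypothetical success predicate keeps actions near
# two disjoint disks (stand-ins for two manipulation families).
actions = rng.uniform(-1.0, 1.0, size=(4000, 2))
success = (np.linalg.norm(actions - [0.5, 0.5], axis=1) < 0.3) | \
          (np.linalg.norm(actions + [0.5, 0.5], axis=1) < 0.3)
good = actions[success]

def fit_gmm(x, k=2, iters=50):
    """Plain EM for a k-component Gaussian mixture with full covariances."""
    n, d = x.shape
    # Farthest-point initialization so the two means start in distinct regions.
    mu = np.stack([x[0], x[np.argmax(np.linalg.norm(x - x[0], axis=1))]])
    cov = np.stack([np.cov(x.T) + 1e-6 * np.eye(d)] * k)
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: unnormalized responsibilities (constant factors cancel).
        r = np.stack([
            w[j] * np.exp(-0.5 * np.einsum('ni,ij,nj->n', x - mu[j],
                                           np.linalg.inv(cov[j]), x - mu[j]))
            / np.sqrt(np.linalg.det(cov[j]))
            for j in range(k)
        ], axis=1)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update weights, means, and covariances.
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r.T @ x) / nk[:, None]
        for j in range(k):
            dx = x - mu[j]
            cov[j] = (r[:, j, None] * dx).T @ dx / nk[j] + 1e-6 * np.eye(d)
    return w, mu, cov

w, mu, cov = fit_gmm(good)
```

Sampling successful actions under the mixture (and rejecting failures) is then one way to plan within a recovered manipulation family; the paper's coverage-based comparison across dimensionality reductions is not shown here.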
  3. Achieving dexterous in-hand manipulation with robot hands is an extremely challenging problem, in part due to current limitations in hardware design. One notable bottleneck hampering the development of improved hardware for dexterous manipulation is the lack of a standardized benchmark for evaluating in-hand dexterity. In order to address this issue, we establish a new benchmark for evaluating in-hand dexterity, specifically for humanoid-type robot hands: the Elliott and Connolly Benchmark. This benchmark is based on a classification of human manipulations established by Elliott and Connolly, and consists of 13 distinct in-hand manipulation patterns. We define qualitative and quantitative metrics for evaluation of the benchmark, and provide a detailed testing protocol. Additionally, we introduce a dexterous robot hand, the CMU Foam Hand III, which is evaluated using the benchmark, successfully completing 10 of the 13 manipulation patterns and outperforming human hand baseline results for several of the patterns.
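A quantitative summary score for a 13-pattern benchmark of this kind might be tallied as below. The pattern names and the scoring rule (completion fraction) are hypothetical illustrations, not the metrics defined in the paper.

```python
# Hypothetical scoring sketch for a 13-pattern in-hand manipulation benchmark.
# Placeholder pattern identifiers; the paper uses Elliott and Connolly's
# classification, not these names.
PATTERNS = [f"pattern_{i:02d}" for i in range(1, 14)]

def benchmark_score(completed):
    """Fraction of the 13 patterns a hand completes (one simple quantitative
    summary; per-pattern metrics would accompany it in practice)."""
    unknown = set(completed) - set(PATTERNS)
    if unknown:
        raise ValueError(f"unknown patterns: {sorted(unknown)}")
    return len(set(completed)) / len(PATTERNS)

# A hand completing 10 of the 13 patterns, as reported for the hand above.
score = benchmark_score(PATTERNS[:10])
```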