

Search: all records where Award ID contains 1830242

  1. We provide methods that recover planar scene geometry by utilizing the transient histograms captured by a class of close-range time-of-flight (ToF) distance sensors. A transient histogram is a one-dimensional temporal waveform that encodes the arrival times of photons incident on the ToF sensor. Typically, a sensor processes the transient histogram with a proprietary algorithm to produce distance estimates, which are commonly used in several robotics applications. Our methods utilize the transient histogram directly, enabling recovery of planar geometry more accurately than is possible using only proprietary distance estimates, as well as consistent recovery of the albedo of the planar surface, which is not possible with proprietary distance estimates alone. This is accomplished via a differentiable rendering pipeline that simulates the transient imaging process, allowing direct optimization of scene geometry to match observations (an illustrative fitting sketch appears after the last entry below). To validate our methods, we capture 3,800 measurements of eight planar surfaces from a wide range of viewpoints and show that our method outperforms the proprietary-distance-estimate baseline by an order of magnitude in most scenarios. We demonstrate a simple robotics application that uses our method to sense the distance to, and slope of, a planar surface from a sensor mounted on the end effector of a robot arm.
    Free, publicly-accessible full text available October 1, 2024
  2. We explore task tolerances, i.e., allowable position or rotation inaccuracy, as an important resource for facilitating smooth and effective telemanipulation. Task tolerances give a robot flexibility to generate smooth and feasible motions; however, in teleoperation, this flexibility may make the user's control less direct. In this work, we implemented a telemanipulation system that allows a robot to autonomously adjust its configuration within task tolerances. We conducted a user study comparing a telemanipulation paradigm that exploits task tolerances (functional mimicry) to a paradigm that requires the robot to exactly mimic its human operator (exact mimicry), and assessed how the choice of paradigm shapes user experience and task performance. Our results show that autonomous adjustments within task tolerances can lead to performance improvements without sacrificing perceived control of the robot. Additionally, we find that users perceive the robot to be more under their control, as well as more predictable, fluent, and trustworthy, in functional mimicry than in exact mimicry.
    Free, publicly-accessible full text available October 1, 2024
  3. We investigate how robotic camera systems can offer new capabilities to computer-supported cooperative work through the design, development, and evaluation of a prototype system called Periscope. With Periscope, a local worker completes manipulation tasks with guidance from a remote helper who observes the workspace through a camera mounted on a semi-autonomous robotic arm that is co-located with the worker. Our key insight is that the helper, the worker, and the robot should all share responsibility for the camera view, an approach we call shared camera control. Using this approach, we present a set of modes that distribute control of the camera between the human collaborators and the autonomous robot depending on task needs. We demonstrate the system's utility and the promise of shared camera control through a preliminary study in which 12 dyads collaboratively worked on assembly tasks. Finally, we discuss design and research implications of our work for future robotic camera systems that facilitate remote collaboration.
    Free, publicly-accessible full text available September 28, 2024
  4. Generating feasible robot motions in real time requires achieving multiple tasks (i.e., kinematic requirements) simultaneously. These tasks can have a specific goal, a range of equally valid goals, or a range of acceptable goals with a preference toward a specific goal. To satisfy multiple and potentially competing tasks simultaneously, it is important to exploit the flexibility afforded by tasks with a range of goals. In this paper, we propose a real-time motion generation method that accommodates all three categories of tasks within a single, unified framework and leverages the flexibility of tasks with a range of goals to satisfy the remaining tasks. Our method incorporates the tasks in a weighted-sum multiple-objective optimization structure and uses barrier methods with novel loss functions to encode the valid range of a task (a toy weighted-sum objective in this spirit is sketched after the last entry below). We demonstrate the effectiveness of our method through a simulation experiment that compares it to state-of-the-art alternative approaches, and through a demonstration on a physical camera-in-hand robot, which shows that our method enables smooth and feasible camera motions.
  5. Robotic technology can support the creation of new tools that improve the creative process of cinematography. It is crucial to consider the specific requirements and perspectives of industry professionals when designing and developing these tools. In this paper, we present the results from exploratory interviews with three cinematography practitioners, which included a demonstration of a prototype robotic system. We identified many factors that can impact the design, adoption, and use of robotic support for cinematography, including: (1) the ability to meet requirements for cost, quality, mobility, creativity, and reliability; (2) the compatibility and integration of tools with existing workflows, equipment, and software; and (3) the potential for new creative opportunities that robotic technology can open up. Our findings provide a starting point for future co-design projects that aim to support the work of cinematographers with collaborative robots. 
  6. Collaborative robots have the potential to be intelligent, embodied agents that contribute to remote human collaboration. We explore this paradigm through the design of robot-mounted camera systems for remote assistance. In this extended abstract, we discuss our iterative design process for developing interaction techniques that leverage shared-control methods to distribute camera control between the agentic robot and its human collaborators.
  7. Despite the large amount of research on kinesthetic haptic devices and haptic effect modeling, there is limited work assessing the perceived realism of kinesthetic model renderings. Identifying the impact of haptic effect parameters on perceived realism can help inform the required accuracy of kinesthetic renderings. In this work, we model common kinesthetic haptic effects and evaluate the perceived realism of varying model parameters via a user study (an illustrative spring-damper effect model is sketched after the last entry below). Our results suggest that the parameter accuracy required to achieve realistic ratings varies depending on the specific haptic parameter.
  8. Many applications in robotics require computing a robot manipulator's "proximity" to a collision state in a given configuration. This collision proximity is commonly framed as a summation over the closest Euclidean distances between many pairs of rigid shapes in a scene. Computing many such pairwise distances is inefficient, while more efficient approximations of this procedure, such as those obtained through supervised learning, lack accuracy and robustness. In this work, we present an approach for computing a collision proximity function for robot manipulators that formalizes the trade-off between efficiency and accuracy and provides an algorithm that gives control over it. Our algorithm, called Proxima, works in one of two ways: (1) given a time budget as input, it returns an as-accurate-as-possible proximity approximation within that time; or (2) given an accuracy budget, it returns an as-fast-as-possible proximity approximation that is within the given accuracy bounds (a toy time-budgeted variant of this idea is sketched after this entry). We show the robustness of our approach through analytical investigation and simulation experiments on a wide set of robot models ranging from 6 to 132 degrees of freedom. We demonstrate that controlling the trade-off between efficiency and accuracy in proximity computations via our approach can enable safe and accurate real-time robot motion optimization even on high-dimensional robot models.
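The following is a minimal, illustrative sketch of the kind of differentiable-rendering fit described in entry 1: a toy forward model renders a transient histogram for a tilted plane, and gradient descent adjusts the plane's distance, tilt, and albedo until the rendered histogram matches an observation. The bin width, field of view, soft-binning splat, and all parameter values are assumptions made for illustration; this is not the authors' pipeline.

import torch

C = 3e8               # speed of light (m/s)
BIN_WIDTH = 250e-12   # assumed histogram bin width (s), ~3.75 cm of range per bin
N_BINS = 128          # assumed number of histogram bins
N_RAYS = 64           # rays sampled across an assumed +/- 15 degree field of view

def render_histogram(distance, tilt, albedo):
    # Differentiable forward model: transient histogram of a tilted plane.
    angles = torch.linspace(-0.26, 0.26, N_RAYS)
    ranges = distance / torch.cos(angles + tilt)   # range along each ray to the plane
    bin_pos = 2.0 * ranges / C / BIN_WIDTH         # round-trip time as fractional bin index
    weights = albedo / ranges**2                   # albedo times inverse-square falloff
    bins = torch.arange(N_BINS, dtype=torch.float32).unsqueeze(0)
    splat = torch.exp(-0.5 * ((bins - bin_pos.unsqueeze(1)) / 2.0) ** 2)  # soft binning
    return (weights.unsqueeze(1) * splat).sum(dim=0)

# A stand-in "observation" rendered from a ground-truth plane.
observed = render_histogram(torch.tensor(0.30), torch.tensor(0.20), torch.tensor(0.8))

# Directly optimize plane parameters so the rendered histogram matches the observation.
dist = torch.tensor(0.50, requires_grad=True)
tilt = torch.tensor(0.00, requires_grad=True)
alb = torch.tensor(0.50, requires_grad=True)
opt = torch.optim.Adam([dist, tilt, alb], lr=0.01)
for _ in range(500):
    opt.zero_grad()
    loss = torch.nn.functional.mse_loss(render_histogram(dist, tilt, alb), observed)
    loss.backward()
    opt.step()
print(f"distance={dist.item():.3f} m, tilt={tilt.item():.3f} rad, albedo={alb.item():.3f}")

The soft Gaussian binning is what keeps the histogram differentiable with respect to the plane parameters; a hard histogram would have zero gradient almost everywhere.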
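As a companion to entry 4, here is a small, hypothetical weighted-sum objective that mixes an exact-goal task with a range-goal task for a toy two-link planar arm. The arm model, the softplus-based range penalty standing in for a barrier-style loss, and the weights are all illustrative assumptions rather than the paper's formulation.

import numpy as np
from scipy.optimize import minimize

L1, L2 = 0.4, 0.3  # link lengths (m) of a toy two-link planar arm

def fk(q):
    # Forward kinematics: end-effector (x, y) position.
    x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    return x, y

def exact_loss(value, goal):
    # Task with one specific goal: simple quadratic penalty.
    return (value - goal) ** 2

def range_loss(value, lo, hi, sharpness=50.0):
    # Task with a range of equally valid goals: near zero inside [lo, hi],
    # rising steeply outside (a smooth, barrier-like softplus penalty).
    below = np.logaddexp(0.0, sharpness * (lo - value)) / sharpness
    above = np.logaddexp(0.0, sharpness * (value - hi)) / sharpness
    return below**2 + above**2

def objective(q):
    x, y = fk(q)
    return (10.0 * exact_loss(x, 0.5)          # reach x = 0.5 m exactly
            + 5.0 * range_loss(y, 0.10, 0.25)  # keep height anywhere in [0.10, 0.25] m
            + 0.1 * exact_loss(q[1], 0.8))     # weak preference on the elbow angle

sol = minimize(objective, x0=np.array([0.3, 0.6]), method="L-BFGS-B")
print("joint angles:", sol.x, "end effector:", fk(sol.x))

Because the range penalty is flat inside its bounds, the optimizer is free to use that slack to better satisfy the exact-goal and preference terms, which is the flexibility the entry describes.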
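For entry 7, a common kinesthetic haptic effect is a virtual wall rendered as a spring-damper force. The sketch below shows such a model with the kind of parameters a realism study might vary; the wall location, stiffness, and damping values are illustrative assumptions, not the study's conditions.

def wall_force(x, v, x_wall=0.0, k=800.0, b=2.0):
    # Spring-damper virtual wall: zero force until the proxy at position x (m)
    # penetrates the wall, then a stiffness push-back plus velocity damping.
    penetration = x - x_wall
    if penetration <= 0.0:
        return 0.0
    return -k * penetration - b * v

# Sweep stiffness, standing in for the study's parameter variations.
for k in (200.0, 800.0, 3200.0):
    print(f"k={k:>6.0f} N/m  force={wall_force(x=0.005, v=0.02, k=k):.2f} N")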
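Finally, for entry 8, the sketch below illustrates the general idea of trading accuracy for a time budget in a proximity query: cached pairwise distances are refreshed closest-pair-first until the budget runs out, and cached values are reused for the rest. Shapes are reduced to bounding spheres and the refresh heuristic is an assumption for illustration; this is not the Proxima algorithm itself.

import time
import numpy as np

class BudgetedProximity:
    def __init__(self, centers, radii):
        self.radii = np.asarray(radii, dtype=float)
        n = len(self.radii)
        self.pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
        self.cache = np.zeros(len(self.pairs))   # last exact distance per pair
        for k in range(len(self.pairs)):          # seed the cache once, exactly
            self._refresh(k, np.asarray(centers, dtype=float))

    def _refresh(self, k, centers):
        # Exact signed distance between bounding spheres i and j.
        i, j = self.pairs[k]
        self.cache[k] = (np.linalg.norm(centers[i] - centers[j])
                         - self.radii[i] - self.radii[j])

    def proximity(self, centers, time_budget_s):
        # Refresh the most critical (closest) cached pairs first; stop when the
        # time budget is spent and fall back to cached values for the rest.
        centers = np.asarray(centers, dtype=float)
        deadline = time.perf_counter() + time_budget_s
        for k in np.argsort(self.cache):
            if time.perf_counter() >= deadline:
                break
            self._refresh(k, centers)
        return float(self.cache.sum())

# Toy usage: four spheres standing in for a manipulator's collision geometry.
rng = np.random.default_rng(0)
prox = BudgetedProximity(centers=rng.random((4, 3)), radii=[0.1] * 4)
print(prox.proximity(rng.random((4, 3)), time_budget_s=1e-4))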