Title: Visionless Tele-Exploration of 3D Moving Objects
This paper presents methods for improved teleoperation in dynamic environments in which the objects to be manipulated are moving, but vision may not meet size, biocompatibility, or maneuverability requirements. In such situations, the object can instead be tracked through non-geometric means, such as heat, radioactivity, or other markers. To safely explore a region, we use an optical time-of-flight pretouch sensor to detect (and range) target objects prior to contact. Information from these sensors is presented to the user via haptic virtual fixtures. This combination of techniques allows the teleoperator to “feel” the object without an actual contact event between the robot and the target object. It thus provides the perceptual benefits of touch interaction to the operator without incurring the negative consequences of the robot contacting unknown geometrical structures; premature contact can damage or displace the target. The authors propose that as the geometry of the scene transitions from completely unknown to partially explored, haptic virtual fixtures can both prevent collisions and guide the user toward areas of interest, thus improving exploration speed. Experimental results show that for situations that are not amenable to vision, haptically presented pretouch sensor information allows operators to explore moving objects more effectively.
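A forbidden-region haptic virtual fixture of the kind the abstract describes can be sketched as a repulsive force that engages when the pretouch sensor's range reading falls below a safety distance. The stiffness value and the safety distance `d_safe` below are illustrative assumptions, not the paper's parameters:

```python
def fixture_force(sensed_range_m: float, d_safe: float = 0.02,
                  stiffness: float = 400.0) -> float:
    """Return a repulsive force (N) pushing the operator away from the target.

    Zero force while the tool is farther than d_safe from the object; inside
    the forbidden region the force grows linearly with penetration depth,
    letting the operator "feel" the object before any physical contact.
    """
    penetration = d_safe - sensed_range_m
    if penetration <= 0.0:
        return 0.0
    return stiffness * penetration
```

Because the force is a function of the sensed range rather than of measured contact force, the fixture can be rendered continuously as the target moves, which is what lets it both prevent collisions and guide exploration.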
Award ID(s):
1832795 1427419
PAR ID:
10127833
Author(s) / Creator(s):
Date Published:
Journal Name:
2018 IEEE International Conference on Robotics and Biomimetics (ROBIO)
Page Range / eLocation ID:
2238 to 2244
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Haptic feedback can render real-time force interactions with computer-simulated objects. In several telerobotic applications, it is desired that a haptic simulation reflect a physical task space or interaction accurately. This is particularly true when excessive applied force can result in disastrous consequences, as with robot-assisted minimally invasive surgery (RMIS) and tissue damage. Since force cannot be directly measured in RMIS, non-contact methods are desired. A promising direction of non-contact force estimation involves the primary use of vision sensors to estimate deformation. However, the required fidelity of non-contact force rendering of deformable interaction to maintain surgical operator performance is not well established. This work attempts to empirically evaluate the degree to which haptic feedback may deviate from ground truth yet result in acceptable teleoperated performance in a simulated RMIS-based palpation task. A preliminary user study is conducted to verify the utility of the simulation platform, and the results of this work have implications for haptic feedback in RMIS and inform guidelines for vision-based tool-tissue force estimation. An adaptive thresholding method is used to collect the minimum and maximum tolerable errors in force orientation and magnitude of presented haptic feedback to maintain sufficient performance.
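An adaptive thresholding procedure of the kind mentioned above is often realized as a simple staircase: the presented error magnitude steps up while performance remains acceptable and steps down when it degrades, converging near the tolerance boundary. This is an illustrative sketch, not the authors' code, and the starting level and step size are assumptions:

```python
def staircase(responses, start=10.0, step=2.0):
    """Run a 1-up/1-down adaptive staircase.

    responses: iterable of booleans (True = performance still acceptable
    at the current error level). Returns the sequence of error magnitudes
    presented; the levels oscillate around the tolerance threshold.
    """
    level = start
    levels = [level]
    for ok in responses:
        level = level + step if ok else max(0.0, level - step)
        levels.append(level)
    return levels
```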
  2. Tactile sensing has been increasingly utilized in robot control of unknown objects to infer physical properties and optimize manipulation. However, there is limited understanding about the contribution of different sensory modalities during interactive perception in complex interaction, both in robots and in humans. This study investigated the effect of visual and haptic information on humans’ exploratory interactions with a ‘cup of coffee’, an object with nonlinear internal dynamics. Subjects were instructed to rhythmically transport a virtual cup with a rolling ball inside between two targets at a specified frequency, using a robotic interface. The cup and targets were displayed on a screen, and force feedback from the cup-and-ball dynamics was provided via the robotic manipulandum. Subjects were encouraged to explore and prepare the dynamics by “shaking” the cup-and-ball system to find the best initial conditions prior to the task. Two groups of subjects received the full haptic feedback about the cup-and-ball movement during the task; however, for one group the ball movement was visually occluded. Visual information about the ball movement had two distinctive effects on the performance: it reduced the preparation time needed to understand the dynamics and, importantly, it led to simpler, more linear input-output interactions between hand and object. The results highlight how visual and haptic information regarding nonlinear internal dynamics have distinct roles in the interactive perception of complex objects.
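The nonlinear internal dynamics of such a cup-and-ball system are commonly modeled as a pendulum suspended inside the cup and forced by the cup's horizontal acceleration. The equations and parameters below are an assumed minimal model for illustration, not the study's exact simulation:

```python
import math

def step(theta, omega, a, l=0.05, g=9.81, dt=1e-3):
    """One explicit-Euler step of the ball-in-cup (pendulum) dynamics.

    theta, omega: ball angle (rad) and angular velocity (rad/s)
    a: horizontal acceleration of the cup (m/s^2)
    Model: theta'' = -(g/l)*sin(theta) - (a/l)*cos(theta)
    """
    alpha = -(g / l) * math.sin(theta) - (a / l) * math.cos(theta)
    return theta + omega * dt, omega + alpha * dt
```

Because the ball's motion feeds back into the force felt at the hand, the hand-object interaction is nonlinear, which is why visually occluding the ball changes how subjects linearize their control strategy.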
  3. The most common sensing modalities found in a robot perception system are vision and touch, which together can provide global and highly localized data for manipulation. However, these sensing modalities often fail to adequately capture the behavior of target objects during the critical moments as they transition out of static, controlled contact with an end-effector to dynamic and uncontrolled motion. In this work, we present a novel multimodal visuotactile sensor that provides simultaneous visuotactile and proximity depth data. The sensor integrates an RGB camera and air pressure sensor to sense touch with an infrared time-of-flight (ToF) camera to sense proximity by leveraging a selectively transmissive soft membrane to enable the dual sensing modalities. We present the mechanical design, fabrication techniques, algorithm implementations, and evaluation of the sensor's tactile and proximity modalities. The sensor is demonstrated in three open-loop robotic tasks: approaching and contacting an object, catching, and throwing. The fusion of tactile and proximity data could be used to capture key information about a target object's transition behavior for sensor-based control in dynamic manipulation. 
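The transition the abstract describes, from free motion through proximity to contact, can be captured by fusing the sensor's two modalities in a simple classifier. The function names and thresholds here are hypothetical, chosen only to illustrate the fusion logic:

```python
def classify(depth_mm: float, pressure_delta_kpa: float) -> str:
    """Classify end-effector state from ToF proximity depth and membrane
    air-pressure change (both assumed units and thresholds).

    Pressure rise means the soft membrane is deformed (touch), which takes
    priority over the proximity reading; otherwise a short ToF depth means
    the object is near but not yet in contact.
    """
    if pressure_delta_kpa > 0.5:
        return "contact"
    if depth_mm < 30.0:
        return "near"
    return "free"
```

In a closed-loop controller, the "near" state is where proximity depth can pre-shape the grasp or trigger a catch before any tactile signal exists.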
  4. Robot teleoperation is an emerging field of study with wide applications in exploration, manufacturing, and healthcare, because it allows users to perform complex remote tasks while remaining distanced and safe. Haptic feedback offers an immersive user experience and expands the range of tasks that can be accomplished through teleoperation. In this paper, we present a novel wearable haptic feedback device for a teleoperation system that applies kinesthetic force feedback to the fingers of a user. The proposed device, called a ‘haptic muscle’, is a soft pneumatic actuator constructed from a fabric-silicone composite in a toroidal structure. We explore the requirements of the ideal haptic feedback mechanism, construct several haptic muscles using different materials, and experimentally determine their dynamic pressure response as well as their sensitivity (their ability to communicate small changes in haptic feedback). Finally, we integrate the haptic muscles into a data glove and a teleoperation system and perform several user tests. Our results show that most users could detect force changes as low as 3% of the working range of the haptic muscles. We also find that the haptic feedback causes users to apply up to 52% less force on an object while handling soft and fragile objects with a teleoperation system.
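The 3% sensitivity figure quoted above is the smallest detectable force change expressed as a fraction of the actuator's working range. A sketch of that computation, with illustrative values rather than the paper's measurements:

```python
def sensitivity_pct(min_detectable_n: float,
                    f_min_n: float, f_max_n: float) -> float:
    """Smallest detectable force change as a percentage of the working range.

    min_detectable_n: smallest force change users reliably perceived (N)
    f_min_n, f_max_n: lower and upper bounds of the actuator's force range (N)
    """
    return 100.0 * min_detectable_n / (f_max_n - f_min_n)
```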
  5. We present VRHapticDrones, a system utilizing quadcopters as levitating haptic feedback proxies. A touchable surface is attached to the side of each quadcopter to provide unintrusive, flexible, and programmable haptic feedback in virtual reality. Since the users’ sense of presence in virtual reality is a crucial factor for the overall user experience, our system simulates haptic feedback of virtual objects. Quadcopters are dynamically positioned to provide haptic feedback relative to the physical interaction space of the user. In a first user study, we demonstrate that haptic feedback provided by VRHapticDrones significantly increases users’ sense of presence compared to vibrotactile controllers and interactions without additional haptic feedback. In a second user study, we explored the quality of induced feedback regarding the expected feeling of different objects. Results show that VRHapticDrones is best suited to simulate objects that are expected to feel either lightweight or have yielding surfaces. With VRHapticDrones we contribute a solution that provides unintrusive and flexible feedback, as well as insights for future VR haptic feedback systems.