This content will become publicly available on January 1, 2026

Title: Investigating Passive Presentation Paradigms to Approximate Active Haptic Palpation
Active, exploratory touch supports human perception of a broad set of invisible physical surface properties. When traditionally hands-on tasks, such as medical palpation of soft tissue, are translated to virtual settings, haptic perception is throttled by technological limitations, and much of the richness of active exploration can be lost. The current research seeks to restore some of this richness with advanced methods of passively conveying haptic data alongside synchronized visual feeds. A robotic platform presented haptic stimulation modeled after the relative motion between a hypothetical physician's hands and artificial tissue samples during palpation. Performance in discriminating the sizes of hidden “tumors” in these samples was compared across display conditions that included haptic feedback and either: 1) synchronized video of the participant's hand, recorded during active exploration; 2) synchronized video of another person's hand; or 3) no accompanying video. The addition of visual feedback did not improve task performance, which was similar whether participants received relative motion recorded from their own hand or someone else's. While future research should explore additional strategies to improve task performance, this initial attempt to translate active haptic sensations to passive presentations indicates that visuo-haptic feedback can induce reliable haptic perceptions of motion in a stationary passive hand.
Award ID(s): 2222918; 2326453
PAR ID: 10580821
Publisher / Repository: IEEE
Date Published:
Journal Name: IEEE Transactions on Haptics
Volume: 18
Issue: 1
ISSN: 1939-1412
Page Range / eLocation ID: 208 to 219
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Medical palpation is a task that traditionally requires a skilled practitioner to assess and diagnose a patient through direct touch and manipulation of their body. In regions with a shortage of such professionals, robotic hands or sensorized gloves could potentially capture the necessary haptic information during palpation exams and relay it to medical doctors for diagnosis. From an engineering perspective, a comprehensive understanding of the relevant motions and forces is essential for designing haptic technologies capable of fully capturing this information. This study focuses on thyroid examination palpation, aiming to analyze the hand motions and forces applied to the patient’s skin during the procedure. We identified key palpation techniques through video recordings and interviews and measured the force characteristics during palpation performed by both non-medical participants and medical professionals. Our findings revealed five primary palpation hand motions and characterized the multi-dimensional interaction forces involved in these motions. These insights provide critical design guidelines for developing haptic sensing and display technologies optimized for remote thyroid nodule palpation and diagnosis. 
  2. Haptic feedback allows an individual to identify various object properties. In this preliminary study, we determined the performance of stiffness recognition using transcutaneous nerve stimulation when a prosthetic hand was moved passively or was controlled actively by the subjects. Electrical stimulation was delivered through a 2×8 electrode grid placed along the subject's upper arm to evoke somatotopic sensation along their index finger. Stimulation intensity, i.e., sensation strength, was modulated using the fingertip forces from a sensorized prosthetic hand. Object stiffness was encoded based on the rate of change of the evoked sensation as the prosthesis grasped one of three objects of different stiffness levels. During active control, sensation was modulated in real time as recorded forces were converted to stimulation amplitudes. During passive control, prerecorded force traces were randomly selected from a pool. Our results showed that the accuracy of object stiffness recognition was similar in both active and passive conditions. A slightly lower accuracy was observed during active control in one subject, which indicated that sensorimotor integration processes could affect haptic perception for some users.
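The encoding this abstract describes — stiffness conveyed by how quickly the evoked sensation grows during grasp closure — can be sketched as follows. The linear force-to-amplitude mapping, its bounds, the update rate, and the force ramps are illustrative assumptions, not the study's calibration values.

```python
def force_to_amplitude(force_n, f_max=10.0, amp_min=0.5, amp_max=2.0):
    """Map fingertip force (N) to stimulation amplitude (mA), linearly
    between assumed detection (amp_min) and comfort (amp_max) limits."""
    f = max(0.0, min(force_n, f_max))
    return amp_min + (amp_max - amp_min) * f / f_max

# A stiffer object makes the grasp force (and hence the evoked
# sensation) rise faster, so the rate of change encodes stiffness.
dt = 0.01                                    # assumed 100 Hz update
t = [i * dt for i in range(100)]             # 1 s grasp closure
soft = [force_to_amplitude(4.0 * x) for x in t]    # slow force ramp
stiff = [force_to_amplitude(8.0 * x) for x in t]   # fast force ramp
soft_rate = (soft[-1] - soft[0]) / (t[-1] - t[0])
stiff_rate = (stiff[-1] - stiff[0]) / (t[-1] - t[0])
```

In the passive condition, the same amplitude traces would simply be replayed from a prerecorded pool rather than computed from live sensor readings.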
  3. Head movement relative to the stationary environment gives rise to congruent vestibular and visual optic-flow signals. The resulting perception of a stationary visual environment, referred to herein as stationarity perception, depends on mechanisms that compare visual and vestibular signals to evaluate their congruence. Here we investigate the functioning of these mechanisms and their dependence on fixation behavior as well as on the active versus passive nature of the head movement. Stationarity perception was measured by modifying the gain on visual motion relative to head movement on individual trials and asking subjects to report whether the gain was too low or too high. Fitting a psychometric function to the data yields two key parameters of performance. The mean is a measure of accuracy, and the standard deviation is a measure of precision. Experiments were conducted using a head-mounted display with fixation behavior monitored by an embedded eye tracker. During active conditions, subjects rotated their heads in yaw ∼15 deg/s over ∼1 s. Each subject’s movements were recorded and played back via rotating chair during the passive condition. During head-fixed and scene-fixed fixation the fixation target moved with the head or scene, respectively. Both precision and accuracy were better during active than passive head movement, likely due to increased precision on the head movement estimate arising from motor prediction and neck proprioception. Performance was also better during scene-fixed than head-fixed fixation, perhaps due to decreased velocity of retinal image motion and increased precision on the retinal image motion estimate. These results reveal how the nature of head and eye movements mediate encoding, processing, and comparison of relevant sensory and motor signals.
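The psychometric fit this abstract relies on — binary "gain too low/too high" reports fit with a cumulative Gaussian whose mean gives accuracy and whose standard deviation gives precision — can be sketched with a grid-search maximum-likelihood fit. The gain range, trial count, grid resolution, and the simulated observer below are illustrative assumptions, not the study's values.

```python
import random
from math import erf, sqrt, log

def norm_cdf(x, mu, sigma):
    """Cumulative Gaussian: P(report 'gain too high') at visual gain x."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

def fit_psychometric(gains, responses):
    """Grid-search maximum-likelihood fit. Returns (mu, sigma):
    mu is the point of subjective stationarity (accuracy), sigma the
    spread of the judgments (precision)."""
    best = None
    for mu in (0.5 + 0.01 * i for i in range(101)):
        for sigma in (0.02 + 0.01 * j for j in range(50)):
            ll = 0.0
            for g, r in zip(gains, responses):
                p = min(max(norm_cdf(g, mu, sigma), 1e-9), 1.0 - 1e-9)
                ll += log(p) if r else log(1.0 - p)
            if best is None or ll > best[0]:
                best = (ll, mu, sigma)
    return best[1], best[2]

# Simulated subject: perceives gain 1.0 as stationary, judgment noise 0.1
random.seed(0)
gains = [0.6 + 0.05 * i for i in range(17)] * 5      # 85 trials
responses = [1 if random.gauss(g, 0.1) > 1.0 else 0 for g in gains]
pse, spread = fit_psychometric(gains, responses)
```

A real analysis would typically also include lapse-rate parameters and a smarter optimizer, but the grid search keeps the accuracy/precision interpretation of the two fitted parameters explicit.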
  4. In this work, we investigate the influence of different visualizations on a manipulation task in virtual reality (VR). Without the haptic feedback of the real world, grasping in VR might result in intersections with virtual objects. As people are highly sensitive when it comes to perceiving collisions, it might look more appealing to avoid intersections and visualize non-colliding hand motions. However, correcting the position of the hand or fingers results in a visual-proprioceptive discrepancy and must be used with caution. Furthermore, the lack of haptic feedback in the virtual world might result in slower actions, as a user might not know exactly when a grasp has occurred. This reduced performance could be remediated with adequate visual feedback. In this study, we analyze the performance, level of ownership, and user preference of eight different visual feedback techniques for virtual grasping. Three techniques show the tracked hand (with or without grasping feedback), even if it intersects with the grasped object. Another three techniques display a hand without intersections with the object, called the outer hand, simulating the look of a real-world interaction. One visualization is a compromise between the two groups, showing both a primary outer hand and a secondary tracked hand. Finally, in the last visualization the hand disappears during the grasping activity. In an experiment, users perform a pick-and-place task for each feedback technique. We use high-fidelity marker-based hand tracking to control the virtual hands in real time. We found that the tracked hand visualizations resulted in better performance; however, the outer hand visualizations were preferred. We also found indications that ownership was higher with the outer hand visualizations.
  5. Haptic feedback can render real-time force interactions with computer simulated objects. In several telerobotic applications, it is desired that a haptic simulation reflects a physical task space or interaction accurately. This is particularly true when excessive applied force can result in disastrous consequences, as with the case of robot-assisted minimally invasive surgery (RMIS) and tissue damage. Since force cannot be directly measured in RMIS, non-contact methods are desired. A promising direction of non-contact force estimation involves the primary use of vision sensors to estimate deformation. However, the required fidelity of non-contact force rendering of deformable interaction to maintain surgical operator performance is not well established. This work attempts to empirically evaluate the degree to which haptic feedback may deviate from ground truth yet result in acceptable teleoperated performance in a simulated RMIS-based palpation task. A preliminary user-study is conducted to verify the utility of the simulation platform, and the results of this work have implications in haptic feedback for RMIS and inform guidelines for vision-based tool-tissue force estimation. An adaptive thresholding method is used to collect the minimum and maximum tolerable errors in force orientation and magnitude of presented haptic feedback to maintain sufficient performance. 
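An adaptive thresholding procedure of the general kind this abstract mentions can be sketched as a simple staircase that converges on the largest tolerable feedback error. The one-up/one-down rule, step size, reversal count, and the simulated operator are illustrative assumptions, not the study's protocol.

```python
def staircase(tolerated, start=0.1, step=0.05, n_reversals=8):
    """Simple 1-up/1-down staircase for a maximum tolerable error:
    raise the haptic-feedback error while performance is maintained,
    lower it after a failure, and average the reversal points to
    estimate the threshold."""
    level, direction = start, 0
    reversals = []
    while len(reversals) < n_reversals:
        new_dir = 1 if tolerated(level) else -1   # probe larger error if OK
        if direction != 0 and new_dir != direction:
            reversals.append(level)               # direction flip = reversal
        direction = new_dir
        level = max(0.0, level + new_dir * step)
    return sum(reversals) / len(reversals)

# Simulated operator whose palpation performance holds for
# force-magnitude errors below 30% of ground truth (assumed)
max_error = staircase(lambda e: e < 0.30)
```

The same procedure run in the opposite direction (starting from a large error and shrinking it) would yield the minimum tolerable error; the two bounds together delimit the acceptable fidelity band for vision-based force estimation.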