This content will become publicly available on February 28, 2026

Title: A sensor-actuator–coupled gustatory interface chemically connecting virtual and real environments for remote tasting
Recent advancements in virtual reality (VR) and augmented reality (AR) have strengthened the bridge between virtual and real worlds via human-machine interfaces. Despite extensive research into biophysical signals, gustation, a fundamental component of the five senses, has seen limited progress. This work reports a bio-integrated gustatory interface, “e-Taste,” to address the underrepresented chemical dimension in current VR/AR technologies. The system enables remote perception and replication of taste sensations by coupling physically separated sensors and actuators through wireless communication modules. Using chemicals representing the five basic tastes, systematic codesign of the key functional components yields reliable performance, including tunability, versatility, safety, and mechanical robustness. Field testing with human subjects, focused on user perception, confirms the system's proficiency in digitally simulating a range of taste intensities and combinations. Overall, this investigation pioneers a chemical dimension in AR/VR technology, paving the way for users to transcend visual and auditory virtual engagement by integrating the taste sensation into virtual environments for enhanced digital experiences.
Award ID(s):
2223387
PAR ID:
10653669
Author(s) / Creator(s):
Publisher / Repository:
AAAS
Date Published:
Journal Name:
Science Advances
Volume:
11
Issue:
9
ISSN:
2375-2548
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. As applications for virtual reality (VR) and augmented reality (AR) technology increase, it will be important to understand how users perceive their action capabilities in virtual environments. Feedback about actions may help to calibrate perception for action opportunities (affordances) so that action judgments in VR and AR mirror actors’ real abilities. Previous work indicates that walking through a virtual doorway while wielding an object can calibrate the perception of one’s passability through feedback from collisions. In the current study, we aimed to replicate this calibration through feedback using a different paradigm in VR while also testing whether this calibration transfers to AR. Participants held a pole at 45° and made passability judgments in AR (pretest phase). Then, they made passability judgments in VR and received feedback on those judgments by walking through a virtual doorway while holding the pole (calibration phase). Participants then returned to AR to make posttest passability judgments. Results indicate that feedback calibrated participants’ judgments in VR. Moreover, this calibration transferred to the AR environment. In other words, after experiencing feedback in VR, passability judgments in VR and in AR became closer to an actor’s actual ability, which could make training applications in these technologies more effective.
  2. Abstract The advancement in virtual reality/augmented reality (VR/AR) has been achieved by breakthroughs in the realistic perception of virtual elements. Although VR/AR technology is advancing fast, enhanced sensor functions, long‐term wearability, and seamless integration with other electronic components are still required for more natural interactions with the virtual world. Here, this report reviews the recent advances in multifunctional wearable sensors and integrated functional devices for VR/AR applications. Specified device designs, packaging strategies, and interactive physiological sensors are summarized based on their methodological approaches for sensory inputs and virtual feedback. In addition, limitations of the existing systems, key challenges, and future directions are discussed. It is envisioned that this progress report's outcomes will expand the insights on wearable functional sensors and device interfaces toward next‐generation VR/AR technologies. 
  3. Green, Phil (Ed.)
    Head‐mounted virtual reality (VR) and augmented reality (AR) systems deliver colour imagery directly to a user's eyes, presenting position‐aware, real‐time computer graphics to create the illusion of interacting with a virtual world. In some respects, colour in AR and VR can be modelled and controlled much like colour in other display technologies. However, it is complicated by the optics required for near‐eye display, and in the case of AR, by the merging of real‐world and virtual visual stimuli. Methods have been developed to provide predictable colour in VR, and ongoing research has exposed details of the visual perception of real and virtual in AR. Yet, more work is required to make colour appearance predictable and AR and VR display systems more robust. 
  4. Augmented reality (AR) is a technology that integrates 3D virtual objects into the physical world in real time, while virtual reality (VR) is a technology that immerses users in an interactive 3D virtual environment. The fast development of AR and VR technologies has reshaped how people interact with the physical world. This presentation will outline the results from two unique AR and one web-based VR coastal engineering projects, motivating the next stage in the development of the augmented reality package for coastal students, engineers, and planners.
  5. While tremendous advances in visual and auditory realism have been made for virtual and augmented reality (VR/AR), introducing a plausible sense of physicality into the virtual world remains challenging. Closing the gap between real-world physicality and immersive virtual experience requires a closed interaction loop: applying user-exerted physical forces to the virtual environment and generating haptic sensations back to the users. However, existing VR/AR solutions either completely ignore the force inputs from the users or rely on obtrusive sensing devices that compromise user experience. By identifying users' muscle activation patterns while engaging in VR/AR, we design a learning-based neural interface for natural and intuitive force inputs. Specifically, we show that lightweight electromyography sensors, resting non-invasively on users' forearm skin, inform and establish a robust understanding of their complex hand activities. Fuelled by a neural-network-based model, our interface can decode finger-wise forces in real-time with 3.3% mean error, and generalize to new users with little calibration. Through an interactive psychophysical study, we show that human perception of virtual objects' physical properties, such as stiffness, can be significantly enhanced by our interface. We further demonstrate that our interface enables ubiquitous control via finger tapping. Ultimately, we envision our findings to push forward research towards more realistic physicality in future VR/AR. 
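The last abstract describes a learning-based decoder that maps forearm EMG readings to finger-wise forces. As a rough illustration of that idea only, the sketch below trains a toy one-hidden-layer regressor on synthetic data; the channel count, finger count, data, and network size are all assumptions for illustration, not the authors' model or dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 8 "EMG channels" (e.g., per-window RMS features)
# mapped to 5 finger-wise forces. The linear-plus-tanh ground truth is an
# illustrative assumption, not the paper's recordings.
n, n_ch, n_fingers = 512, 8, 5
X = rng.random((n, n_ch))
W_true = rng.normal(size=(n_ch, n_fingers))
Y = np.tanh(X @ W_true) + 0.01 * rng.normal(size=(n, n_fingers))

# One-hidden-layer network trained with plain gradient descent on MSE --
# a toy stand-in for the neural decoder described in the abstract.
h, lr = 32, 0.05
W1 = rng.normal(scale=0.1, size=(n_ch, h)); b1 = np.zeros(h)
W2 = rng.normal(scale=0.1, size=(h, n_fingers)); b2 = np.zeros(n_fingers)
for _ in range(2000):
    H = np.tanh(X @ W1 + b1)            # hidden activations
    P = H @ W2 + b2                     # predicted finger forces
    G = 2.0 * (P - Y) / n               # d(MSE)/dP
    gW2, gb2 = H.T @ G, G.sum(axis=0)
    GH = (G @ W2.T) * (1.0 - H**2)      # backprop through tanh
    gW1, gb1 = X.T @ GH, GH.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

pred = np.tanh(X @ W1 + b1) @ W2 + b2
mse = float(np.mean((pred - Y) ** 2))
print(f"training MSE: {mse:.4f}")
```

In practice, a decoder like the one described would also need per-user calibration data and windowed feature extraction from raw EMG; this sketch only shows the regression step.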