Title: SWISH: A Shifting-Weight Interface of Simulated Hydrodynamics for Haptic Perception of Virtual Fluid Vessels
Current VR/AR systems cannot reproduce the physical sensation of fluid vessels because of the shifting nature of fluid motion. To this end, we introduce SWISH, an ungrounded mixed-reality interface capable of affording users a realistic haptic sensation of fluid behavior in vessels. The chief mechanism behind SWISH is the use of virtual reality tracking and motor actuation to actively relocate the center of gravity of a handheld vessel, emulating the moving center of gravity of a vessel that contains fluid. In addition to solving challenges related to reliable and efficient motor actuation, our SWISH designs emphasize reproducibility, scalability, and accessibility to the maker culture. Our virtual-to-physical coupling pairs Nvidia Flex's Unity integration for virtual fluid dynamics with a 3D-printed augmented vessel containing a motorized mechanical actuation system. To evaluate the perceptual efficacy of SWISH, we conducted a user study with 24 participants, 7 vessel actions, and 2 virtual fluid viscosities in a virtual reality environment. In all cases, users on average reported that the SWISH bucket generates accurate tactile sensations of the fluid behavior. This opens the potential for multi-modal interactions with programmable fluids in virtual environments for chemistry education, worker training, and immersive entertainment.
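The coupling described above — tracking the virtual fluid and relocating a physical weight to match its center of gravity — can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the particle arrays, axis ranges, and travel limits are hypothetical stand-ins for whatever the fluid simulation and actuator actually expose.

```python
import numpy as np

def fluid_center_of_mass(positions, masses):
    """Mass-weighted average of simulated fluid particle positions.

    positions: (N, 3) array of particle positions (e.g., from the fluid sim)
    masses:    (N,) array of per-particle masses
    """
    return (positions * masses[:, None]).sum(axis=0) / masses.sum()

def weight_target(com, axis_min=-0.05, axis_max=0.05,
                  travel_min=0.0, travel_max=0.10):
    """Map the fluid's center of mass along one vessel axis to a position
    for the movable weight, clamped to the actuator's travel limits.

    All limits are hypothetical values in meters.
    """
    # Use the x component of the center of mass as the lateral offset.
    frac = (com[0] - axis_min) / (axis_max - axis_min)
    frac = min(max(frac, 0.0), 1.0)
    return travel_min + frac * (travel_max - travel_min)

# Example: fluid sloshed toward +x, so the weight moves toward that end.
pos = np.array([[0.03, 0.0, 0.0], [0.05, 0.01, 0.0], [0.04, -0.01, 0.0]])
com = fluid_center_of_mass(pos, np.ones(3))
target = weight_target(com)
```

In a running system, this mapping would be evaluated every frame against the simulated particle state and sent to the motor controller as a position setpoint.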
Award ID(s):
1917912
NSF-PAR ID:
10179716
Author(s) / Creator(s):
Date Published:
Journal Name:
UIST '19: Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology
Page Range / eLocation ID:
751 to 761
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Handling liquids in vessels produces unmistakably fluid tactile sensations, which inform essential perceptions in home, laboratory, and industrial contexts. Feeling such interactions from virtual fluids would similarly enrich experiences in virtual reality. We introduce Geppetteau, a novel string-driven weight-shifting mechanism capable of providing perceivable tactile sensations of handling virtual liquids in a variety of vessel shapes, widening the range of augmentable shapes beyond existing state-of-the-art mechanical systems. In this work, Geppetteau is integrated into conical, spherical, cylindrical, and cuboid vessels; variations of these shapes are commonly used as fluid containers in day-to-day life. We studied the effectiveness of Geppetteau in simulating fine- and coarse-grained tactile sensations of virtual liquids across three user studies. Participants found Geppetteau successful in providing physical sensations congruent with handling virtual liquids across a variety of physical vessel shapes and virtual liquid volumes and viscosities.
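The geometric core of a string-driven weight-shifting mechanism can be sketched briefly: if a weight hangs from several strings anchored to the vessel wall, each string must be paid out to the straight-line distance between its anchor and the desired weight position. The anchor coordinates below are hypothetical, and the sketch omits the real mechanism's details (motor routing, spool geometry, tension control), which the abstract does not specify.

```python
import math

# Hypothetical anchor points (meters) where strings attach to the vessel.
ANCHORS = [(-0.04, 0.10, -0.04), (0.04, 0.10, -0.04),
           (-0.04, 0.10, 0.04), (0.04, 0.10, 0.04)]

def string_lengths(weight_pos):
    """Required length of each string so the weight sits at weight_pos.

    In a string-suspended mechanism, each string's length is the
    straight-line distance from its anchor to the weight.
    """
    return [math.dist(a, weight_pos) for a in ANCHORS]

# Example: hold the weight centered, low in the vessel.
lengths = string_lengths((0.0, 0.02, 0.0))
```

Sweeping the target position over time (to follow a simulated fluid's center of mass) then reduces to commanding each spool toward its corresponding length.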
  2. The emerging possibilities of data analysis and exploration in virtual reality raise the question of how users can best be supported during such interactions. Spherical visualizations allow convenient exploration of certain types of data. Our tangible sphere, exactly aligned with the sphere visualization shown in VR, implements a very natural mode of interaction and draws on senses and skills trained in the real world. This work is motivated by the prospect of creating in VR a low-cost, tangible, robust, handheld spherical display that would be difficult or impossible to implement as a physical display. Our concept enables us to gain insights into the impact of a fully tangible embodiment of a virtual object on task performance, comprehension of patterns, and user behavior. After describing the implementation, we discuss the advantages and disadvantages of our approach, taking into account different handheld spherical displays that use outside and inside projection.
  3. Virtual reality training and architectural virtual environments benefit from a stronger sensation of stair climbing, and passive haptic props can add that sensation. These methods offer a safe approach by placing short ramps on the floor rather than a physical staircase. To improve users' immersion, we conducted an experiment exploring how the shape of physical props changes the way users align themselves and move while traveling up or down a virtual set of stairs. We investigated three prop designs for ascending and descending virtual stairs. Results suggest that elongated props provide a better experience and are preferred by users.
  4. Exoskeletons, as a human augmentation technology, have shown great potential for transforming future civil engineering operations. However, inappropriate use of an exoskeleton can cause injuries and damage if the user is not well trained. Effective procedural and operational training makes users more aware of the capabilities, restrictions, and risks associated with exoskeletons in civil engineering operations. At present, the low availability and high cost of exoskeleton systems make hands-on training less feasible. In addition, different exoskeleton designs involve different activation procedures, muscular engagement, and motion boundaries, posing further challenges for training. We propose a "sensation transfer" approach that migrates the physical experience of wearing a real exoskeleton system to first-time users via a passive haptic system in an immersive virtual environment. The body motion and muscular engagement data of 15 experienced exoskeleton users were recorded and replayed in a virtual reality environment. A set of haptic devices on key parts of the body (shoulders, elbows, hands, and waist) then generates different patterns of haptic cues depending on how accurately trainees mimic the actions. The sensation transfer method will enhance the haptic learning experience and thereby accelerate training.
  5. Background Sustained engagement is essential for the success of telerehabilitation programs. However, patients’ lack of motivation and adherence could undermine these goals. To overcome this challenge, physical exercises have often been gamified. Building on the advantages of serious games, we propose a citizen science–based approach in which patients perform scientific tasks by using interactive interfaces and help advance scientific causes of their choice. This approach capitalizes on human intellect and benevolence while promoting learning. To further enhance engagement, we propose performing citizen science activities in immersive media, such as virtual reality (VR). Objective This study aims to present a novel methodology to facilitate the remote identification and classification of human movements for the automatic assessment of motor performance in telerehabilitation. The data-driven approach is presented in the context of a citizen science software dedicated to bimanual training in VR. Specifically, users interact with the interface and make contributions to an environmental citizen science project while moving both arms in concert. Methods In all, 9 healthy individuals interacted with the citizen science software by using a commercial VR gaming device. The software included a calibration phase to evaluate the users’ range of motion along the 3 anatomical planes of motion and to adapt the sensitivity of the software’s response to their movements. During calibration, the time series of the users’ movements were recorded by the sensors embedded in the device. We performed principal component analysis to identify salient features of movements and then applied a bagged trees ensemble classifier to classify the movements. Results The classification achieved high performance, reaching 99.9% accuracy. 
Among the movements, elbow flexion was the most accurately classified (99.2%), and horizontal shoulder abduction to the right side of the body was the most frequently misclassified (98.8%). Conclusions Coordinated bimanual movements in VR can be classified with high accuracy. Our findings lay the foundation for the development of motion analysis algorithms in VR-mediated telerehabilitation.
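The movement-classification pipeline described above — principal component analysis for feature extraction followed by a bagged-trees ensemble — can be sketched with scikit-learn. The data shapes, component count, and ensemble size below are illustrative assumptions, not the study's actual parameters, and the synthetic data merely stands in for the recorded movement time series.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import BaggingClassifier  # bags decision trees by default
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Hypothetical data: 200 recordings x 90 features (e.g., flattened windows of
# controller positions along the 3 anatomical planes), 4 movement classes.
X = rng.normal(size=(200, 90))
y = rng.integers(0, 4, size=200)
X += y[:, None] * 2.0  # shift each class's mean so classes are separable

# PCA extracts salient movement features; the bagged ensemble of decision
# trees then classifies each recording into one of the movement types.
clf = make_pipeline(PCA(n_components=10),
                    BaggingClassifier(n_estimators=50, random_state=0))
clf.fit(X, y)
train_acc = clf.score(X, y)
```

The study reports accuracy on its recorded calibration movements; here, scoring on the training set only illustrates the pipeline's API, not an evaluation protocol.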