-
We propose a haptic device that alters the perceived softness of real rigid objects without requiring the objects to be instrumented. Instead, our haptic device works by restricting the lateral deformation of the user's fingerpad via a hollow frame that squeezes the sides of the fingerpad. This causes the fingerpad to become bulgier than it originally was; when users then touch an object's surface with their restricted fingerpad, they perceive the object as softer than it actually is. To illustrate the extent of the softness illusion induced by our device, touching the tip of a wooden chopstick feels as soft as a rubber eraser. Our haptic device operates by pulling the hollow frame using a motor. Unlike most wearable haptic devices, which cover the user's fingerpad to create force sensations, our device creates softness while leaving the center of the fingerpad free, which allows users to feel most of the object they are interacting with. This makes our device a unique contribution to altering the softness of everyday objects, creating “buttons” by softening protrusions of existing appliances or tangibles, or even altering the softness of handheld props for VR. Finally, we validated our device through two studies: (1) a psychophysics study showed that the device lowers the perceived hardness of any object between 50A and 90A to around 40A (on the Shore A hardness scale); and (2) a user study demonstrated that participants preferred our device for interactive applications that leverage haptic props, such as making a VR prop feel softer or making a rigid 3D-printed remote control feel softer on its button.
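As a rough illustration of how a motor-pulled frame could be driven, here is a minimal sketch assuming a hobby servo pulls the hollow frame and a simple linear mapping from a target perceived hardness (Shore A) to a squeeze level; the names, the mapping, and the servo range are illustrative assumptions, not the paper's actual calibration or hardware interface:

    # Hypothetical sketch: map a target perceived hardness to a frame-squeeze
    # level and a servo angle. Not the paper's calibration; illustration only.

    def shore_a_to_squeeze(target_shore_a, min_shore_a=40.0, max_shore_a=90.0):
        """Map a desired perceived hardness (Shore A) to a 0..1 squeeze level.
        Softer targets (lower Shore A) assume a tighter squeeze of the fingerpad."""
        clamped = max(min(target_shore_a, max_shore_a), min_shore_a)
        return (max_shore_a - clamped) / (max_shore_a - min_shore_a)

    def squeeze_to_servo_angle(squeeze, idle_deg=0.0, full_pull_deg=120.0):
        """Convert the squeeze level to an angle for the servo pulling the frame."""
        return idle_deg + squeeze * (full_pull_deg - idle_deg)

    if __name__ == "__main__":
        # Example: make a rigid button feel like ~55A rubber.
        angle = squeeze_to_servo_angle(shore_a_to_squeeze(55.0))
        print(f"servo angle: {angle:.1f} deg")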
-
Electrical muscle stimulation (EMS) is an emerging technique that miniaturizes force feedback and is especially popular for untethered haptic applications, such as mobile gaming, VR, or AR. However, the actuation displayed by interactive systems based on EMS is coarse and imprecise. EMS systems mostly focus on inducing movements in large muscle groups such as legs, arms, and wrists; individual finger poses, which would be required, for example, to actuate a user's fingers to fingerspell even the simplest letters in sign language, are not possible. The lack of dexterity in EMS stems from two fundamental limitations: (1) lack of independence: when a particular finger is actuated by EMS, the current runs through nearby muscles, causing unwanted actuation of adjacent fingers; and (2) unwanted oscillations: while it is relatively easy for EMS to start moving a finger, it is very hard for EMS to stop and hold that finger at a precise angle, because, to stop a finger, virtually all EMS systems contract the opposing muscle, typically via a controller (e.g., PID); unfortunately, even with the best controller tuning, this often results in unwanted oscillations. To tackle these limitations, we propose dextrEMS, an EMS-based haptic device featuring mechanical brakes attached to each finger joint. The key idea behind dextrEMS is that while the EMS actuates the fingers, it is our mechanical brake that stops the finger in a precise position. Moreover, it is also the brakes that allow dextrEMS to select which fingers are moved by EMS, eliminating unwanted movements by preventing adjacent fingers from moving. We implemented dextrEMS as an untethered haptic device, weighing only 68 g, that actuates eight finger joints independently (the metacarpophalangeal and proximal interphalangeal joints of four fingers), which we demonstrate in a wide range of haptic applications, such as assisted fingerspelling, a piano tutorial, a guitar tutorial, and a VR game. Finally, in our technical evaluation, we found that dextrEMS outperformed EMS alone, doubling its independence and reducing unwanted oscillations.
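As a rough illustration of the actuate-then-brake idea described above, here is a minimal sketch in which EMS starts the motion, a per-joint brake stops it at the target angle, and brakes on non-target joints stay locked to suppress co-activation from current spill-over; all classes and names are hypothetical stand-ins rather than the dextrEMS hardware API, and the hand is simulated:

    # Hypothetical sketch of actuate-then-brake control with a toy simulation.

    class Brake:
        def __init__(self):
            self.locked = False
        def lock(self):
            self.locked = True
        def release(self):
            self.locked = False

    class SimulatedHand:
        """Toy model: a joint driven by EMS flexes 1 degree per step unless braked."""
        def __init__(self, joints):
            self.angle = {j: 0.0 for j in joints}
            self.ems_on = set()
            self.brakes = {j: Brake() for j in joints}
        def step(self):
            for j in self.ems_on:
                if not self.brakes[j].locked:
                    self.angle[j] += 1.0

    def move_joint_to(hand, joint, target_deg, tolerance_deg=2.0):
        for j, brake in hand.brakes.items():   # lock all other joints first
            if j != joint:
                brake.lock()
        hand.brakes[joint].release()
        hand.ems_on.add(joint)                 # start flexion via EMS
        while hand.angle[joint] < target_deg - tolerance_deg:
            hand.step()
        hand.brakes[joint].lock()              # mechanical stop, no opposing-muscle PID
        hand.ems_on.discard(joint)

    if __name__ == "__main__":
        hand = SimulatedHand(["index_MCP", "index_PIP"])
        move_joint_to(hand, "index_MCP", 45.0)
        print(hand.angle)  # index_MCP stops near 45 deg; index_PIP stays at 0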
-
Improved vergence and accommodation via Purkinje Image tracking with multiple cameras for AR glasses
We present a personalized, comprehensive eye-tracking solution based on tracking higher-order Purkinje images, suited explicitly for eyeglasses-style AR and VR displays. Existing eye-tracking systems for near-eye applications are typically designed to work in an on-axis configuration and rely on pupil center and corneal reflections (PCCR) to estimate gaze with an accuracy of only about 0.5° to 1°. These systems are often expensive, bulky in form factor, and fail to estimate monocular accommodation, which is crucial for focus adjustment within the AR glasses. Our system independently measures binocular vergence and monocular accommodation using higher-order Purkinje reflections from the eye, extending PCCR-based methods. We demonstrate that these reflections are sensitive to both gaze rotation and lens accommodation, and model the Purkinje images' behavior in simulation. We also design and fabricate a user-customized eye tracker using cheap off-the-shelf cameras and LEDs. We use an end-to-end convolutional neural network (CNN) to calibrate the eye tracker for the individual user, allowing for robust and simultaneous estimation of vergence and accommodation. Experimental results show that our solution, specifically catering to individual users, outperforms state-of-the-art methods for vergence and depth estimation, achieving accuracies of 0.3782° and 1.108 cm, respectively.
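As a rough illustration of the estimation stage described above, here is a minimal sketch of a small convolutional network that maps per-eye camera crops containing Purkinje reflections to a vergence angle and an accommodation depth; the architecture, input layout, and units are generic assumptions for illustration, not the paper's network:

    # Hypothetical sketch: a small CNN regressing vergence (deg) and
    # accommodation depth (cm) from Purkinje-reflection image crops.
    import torch
    import torch.nn as nn

    class PurkinjeRegressor(nn.Module):
        def __init__(self, in_channels=2):  # e.g., one grayscale crop per eye
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(in_channels, 16, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.head = nn.Linear(64, 2)    # outputs [vergence_deg, accommodation_cm]

        def forward(self, x):
            return self.head(self.features(x).flatten(1))

    if __name__ == "__main__":
        model = PurkinjeRegressor()
        crops = torch.randn(1, 2, 96, 96)   # dummy left/right-eye crops
        vergence_deg, accommodation_cm = model(crops)[0]
        print(float(vergence_deg), float(accommodation_cm))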