- Award ID(s):
- 2015151
- NSF-PAR ID:
- 10337820
- Date Published:
- Journal Name:
- Scientific Reports
- Volume:
- 12
- Issue:
- 1
- ISSN:
- 2045-2322
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
Holography is perhaps the only method demonstrated so far that can achieve a wide field of view (FOV) and a compact eyeglass-style form factor for augmented reality (AR) near-eye displays (NEDs). Unfortunately, the eyebox of such NEDs is impractically small (under $\sim$ 1 mm). In this paper, we introduce and demonstrate a design for holographic NEDs with a practical, wide eyebox of $\sim$ 10 mm and without any moving parts, based on holographic lenslets. In our design, a holographic optical element (HOE) based on a lenslet array was fabricated as the image combiner with an expanded eyebox. A phase spatial light modulator (SLM) alters the phase of the incident laser light projected onto the HOE combiner such that the virtual image can be perceived at different focus distances, which can reduce the vergence-accommodation conflict (VAC). We have successfully implemented a bench-top prototype following the proposed design. The experimental results show effective eyebox expansion to a size of $\sim$ 10 mm. With further work, we hope that these design concepts can be incorporated into eyeglass-size NEDs.
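The abstract does not give the phase patterns used on the SLM; as a minimal sketch of how a phase SLM can shift the perceived focus distance, the standard quadratic (thin-lens) phase profile $\varphi(x,y) = -\pi(x^2+y^2)/(\lambda f)$, wrapped to $[0, 2\pi)$, can be computed as below. The pixel count, pitch, wavelength, and focal length are illustrative values, not taken from the paper:

```python
import numpy as np

def lens_phase(n_pixels, pitch_m, wavelength_m, focal_m):
    """Quadratic (thin-lens) phase profile wrapped to [0, 2*pi).

    A phase SLM displaying this pattern acts as a lens of focal length
    `focal_m`, shifting the virtual image to a different focus distance.
    """
    half = n_pixels * pitch_m / 2
    x = np.linspace(-half, half, n_pixels)
    xx, yy = np.meshgrid(x, x)
    phase = -np.pi * (xx**2 + yy**2) / (wavelength_m * focal_m)
    return np.mod(phase, 2 * np.pi)  # wrap to one 2*pi interval

# Illustrative values: 1080x1080 SLM region, 8 um pitch, 520 nm laser,
# virtual image focused at 0.5 m
pattern = lens_phase(1080, 8e-6, 520e-9, 0.5)
print(pattern.shape)
```

Changing `focal_m` frame-to-frame is what lets a single static HOE combiner present content at different focus distances.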
-
Recently, augmented reality (AR) displays have attracted considerable attention due to the highly immersive and realistic viewer experience they can provide. One key challenge for AR displays is the fundamental trade-off between the extent of the field of view (FOV) and the size of the eyebox, set by the conservation of etendue. Exit-pupil expansion (EPE) is one possible solution to this problem, but it comes at the cost of distributing light over a larger area, decreasing the overall system brightness. In this work, we show that the geometry of the waveguide and the in-coupler sets a fundamental limit on how efficient the combiner can be for a given FOV. This limit can be used as a tool for waveguide designers to benchmark the in-coupling efficiency of their in-coupler gratings. Using the derived limit to guide optimization, we design a metasurface-based grating (metagrating) and a commonly used surface-relief grating (SRG) as in-couplers, and compare the diffraction efficiencies of the two to the theoretical efficiency limit. For our chosen waveguide geometry, the metagrating's 28% efficiency surpasses the SRG's 20% efficiency and nearly matches the geometry-based limit of 29%, owing to the superior angular-response control of metasurfaces compared to SRGs. This work provides new insight into the efficiency limit of waveguide-based combiners and paves a path toward implementing metasurfaces in efficient waveguide AR displays.
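The etendue conservation mentioned above can be made concrete: per dimension, a holographic display's space-bandwidth product gives roughly eyebox_width × 2 sin(FOV/2) ≈ N·λ for N resolvable pixels at wavelength λ. A quick back-of-envelope sketch with illustrative numbers (not from either paper) shows why unexpanded eyeboxes land at the millimetre scale:

```python
import numpy as np

def eyebox_width_mm(n_pixels, wavelength_m, fov_deg):
    """Eyebox width permitted by etendue conservation, one dimension.

    Space-bandwidth product: eyebox * 2*sin(FOV/2) ~ n_pixels * wavelength.
    """
    return n_pixels * wavelength_m / (2 * np.sin(np.radians(fov_deg) / 2)) * 1e3

# Illustrative: 4K-class SLM (4000 px across), 520 nm green laser, 80 deg FOV
w = eyebox_width_mm(4000, 520e-9, 80.0)
print(round(w, 2))  # ≈ 1.62 mm
```

Widening the FOV shrinks the eyebox proportionally, which is exactly the trade-off that EPE and lenslet-based eyebox expansion try to sidestep.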
-
Improved vergence and accommodation via Purkinje image tracking with multiple cameras for AR glasses

We present a personalized, comprehensive eye-tracking solution based on tracking higher-order Purkinje images, suited explicitly for eyeglasses-style AR and VR displays. Existing eye-tracking systems for near-eye applications are typically designed for an on-axis configuration and rely on pupil center and corneal reflections (PCCR) to estimate gaze with an accuracy of only about 0.5° to 1°. They are often expensive, bulky in form factor, and fail to estimate monocular accommodation, which is crucial for focus adjustment within AR glasses. Our system independently measures binocular vergence and monocular accommodation using higher-order Purkinje reflections from the eye, extending PCCR-based methods. We demonstrate that these reflections are sensitive to both gaze rotation and lens accommodation, and we model the Purkinje images' behavior in simulation. We also design and fabricate a user-customized eye tracker using cheap off-the-shelf cameras and LEDs. We use an end-to-end convolutional neural network (CNN) to calibrate the eye tracker for the individual user, allowing robust, simultaneous estimation of vergence and accommodation. Experimental results show that our solution, specifically catering to individual users, outperforms state-of-the-art methods for vergence and depth estimation, achieving accuracies of 0.3782° and 1.108 cm, respectively.
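The paper calibrates its tracker with an end-to-end CNN on images; as a much-simplified stand-in for that calibration step, the core idea of fitting a per-user map from Purkinje-reflection coordinates to (vergence, accommodation) can be sketched with a linear least-squares fit on synthetic data. Everything here (feature dimensions, the linear model, the data) is invented for illustration and is not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "features": 2D positions of 4 Purkinje reflections -> 8 numbers
n_samples, n_features = 200, 8
X = rng.normal(size=(n_samples, n_features))

# Synthetic targets: [vergence_deg, accommodation_cm], generated by a hidden
# linear map plus noise (a real system maps camera images via a CNN instead)
W_true = rng.normal(size=(n_features, 2))
Y = X @ W_true + 0.01 * rng.normal(size=(n_samples, 2))

# Per-user "calibration": recover the map by least squares
W_fit, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Evaluate on held-out synthetic samples
X_test = rng.normal(size=(50, n_features))
err = np.abs(X_test @ W_fit - X_test @ W_true).mean()
print(err < 0.1)  # small residual: the calibration recovered the map
```

The CNN in the paper plays the same role as `W_fit` here, but learns a nonlinear mapping directly from the Purkinje-image pixels rather than from extracted coordinates.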
-
Three-dimensional (3D) vision in augmented reality (AR) displays can enable a highly immersive and realistic viewer experience and hence attracts much attention. Most current approaches create 3D vision by projecting stereoscopic images to different eyes using two separate projection systems, which are inevitably bulky for wearable devices. Here, we propose a compact stereo waveguide AR display system using a single piece of thin flat glass integrated with a polarization-multiplexed metagrating in-coupler and two diffractive grating out-couplers. Incident light beams of opposite circular polarization states, carrying stereoscopic images, are first steered by the metagrating in-coupler in opposite propagation directions in the flat glass waveguide, subsequently extracted by the diffractive grating out-couplers, and finally received by different eyes, forming 3D stereo vision. Experimentally, we fabricated a display prototype and demonstrated independent projection of two polarization-multiplexed stereoscopic images.
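Polarization-multiplexed steering of this kind typically relies on the geometric (Pancharatnam-Berry) phase: a half-wave metasurface element whose fast axis is rotated by θ imparts phase +2θ to one circular polarization and -2θ to the other, so a spatial gradient of θ deflects the two handednesses in opposite directions. A minimal Jones-calculus check of that sign-opposite phase (sign conventions here are one common choice, not necessarily the paper's):

```python
import numpy as np

def hwp(theta):
    """Jones matrix of a half-wave element with fast axis at angle theta."""
    c, s = np.cos(2 * theta), np.sin(2 * theta)
    return np.array([[c, s], [s, -c]], dtype=complex)

# Circular polarization basis vectors (one sign convention)
lcp = np.array([1, 1j]) / np.sqrt(2)
rcp = np.array([1, -1j]) / np.sqrt(2)

theta = 0.3  # element orientation, radians
out_l = hwp(theta) @ lcp   # handedness flips: output is ~rcp
out_r = hwp(theta) @ rcp   # handedness flips: output is ~lcp

# Acquired geometric phases are equal and opposite: +2*theta and -2*theta
phase_l = np.angle(np.vdot(rcp, out_l))
phase_r = np.angle(np.vdot(lcp, out_r))
print(np.isclose(phase_l, -phase_r))
```

Because the phase gradient (and hence the steering direction) reverses with handedness, one metagrating can send the two stereoscopic channels to opposite ends of the waveguide.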
-
Metasurfaces offer complete control of the optical wavefront at the subwavelength scale, advancing a new class of artificial planar optics, including lenses, waveplates, and holograms, with unprecedented merits over conventional optical components. In particular, the ultrathin, flat, and compact characteristics of metasurfaces facilitate their integration with semiconductor devices for the development of miniaturized and multifunctional optoelectronic systems. In this work, generation of structured light is implemented at an ultracompact wafer level through the monolithic integration of metasurfaces with standard vertical-cavity surface-emitting lasers (VCSELs). Ultracompact beam-structuring laser chips with versatile functionalities are experimentally demonstrated, including multichannel beam-array generation, on-chip large-angle beam steering up to 60°, and wafer-level holographic beam shaping with a wide field of view (about 124°). This work opens new perspectives for the design of structured-light systems with compactness, light weight, and scalability, and the results will promote the development of compact light-structuring systems with great potential in 3D imaging, displays, robotic vision, human-computer interaction, and augmented/virtual reality.
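The abstract reports on-chip steering up to 60° but does not give the metasurface period or laser wavelength; the first-order grating equation sin θ = mλ/Λ relates the two. As a rough sketch, assuming a typical 940 nm VCSEL wavelength (an assumption, not a figure from the paper):

```python
import numpy as np

def period_for_steering(wavelength_m, angle_deg, order=1):
    """Grating equation: sin(theta) = order * wavelength / period."""
    return order * wavelength_m / np.sin(np.radians(angle_deg))

# Assumed 940 nm VCSEL; metasurface period needed to steer order m=1 to 60 deg
period = period_for_steering(940e-9, 60.0)
print(round(period * 1e9))  # ≈ 1085 nm
```

A period on the order of the wavelength is exactly the regime where subwavelength metasurface elements, rather than conventional ruled gratings, give fine control over the diffracted orders.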