Title: Neural 3D holography: learning accurate wave propagation models for 3D holographic virtual and augmented reality displays
Holographic near-eye displays promise unprecedented capabilities for virtual and augmented reality (VR/AR) systems. The image quality achieved by current holographic displays, however, is limited by the wave propagation models used to simulate the physical optics. We propose a neural network-parameterized plane-to-multiplane wave propagation model that closes the gap between physics and simulation. Our model is automatically trained using camera feedback and it outperforms related techniques in 2D plane-to-plane settings by a large margin. Moreover, it is the first network-parameterized model to naturally extend to 3D settings, enabling high-quality 3D computer-generated holography using a novel phase regularization strategy of the complex-valued wave field. The efficacy of our approach is demonstrated through extensive experimental evaluation with both VR and optical see-through AR display prototypes.
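The learned model described above refines a classical physics baseline. As a rough illustration (not the paper's code; the function and parameter names below are our own), plane-to-plane propagation is commonly simulated with the angular spectrum method, which a network-parameterized model can then correct using camera feedback:

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pixel_pitch, distance):
    """Propagate a complex wave field over `distance` (metres) with the
    angular spectrum method; a negative distance back-propagates."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_pitch)   # spatial frequencies [1/m]
    fy = np.fft.fftfreq(ny, d=pixel_pitch)
    FX, FY = np.meshgrid(fx, fy)
    # Transfer function H = exp(i*2*pi*d*sqrt(1/lambda^2 - fx^2 - fy^2));
    # evanescent components (arg < 0) are suppressed.
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    H = np.where(arg > 0,
                 np.exp(2j * np.pi * distance * np.sqrt(np.maximum(arg, 0.0))),
                 0.0)
    return np.fft.ifft2(np.fft.fft2(field) * H)
```

A plane-to-multiplane model applies a propagation like this once per target depth; the abstract's approach augments the idealized operator with learned, camera-calibrated components.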
Award ID(s):
1839974
NSF-PAR ID:
10353643
Author(s) / Creator(s):
; ; ; ;
Date Published:
Journal Name:
ACM Transactions on Graphics
Volume:
40
Issue:
6
ISSN:
0730-0301
Page Range / eLocation ID:
1 to 12
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract

    Digital holographic microscopy enables the 3D reconstruction of volumetric samples from a single-snapshot hologram. However, unlike a conventional bright-field microscopy image, the quality of holographic reconstructions is compromised by interference fringes as a result of twin images and out-of-plane objects. Here, we demonstrate that cross-modality deep learning using a generative adversarial network (GAN) can endow holographic images of a sample volume with bright-field microscopy contrast, combining the volumetric imaging capability of holography with the speckle- and artifact-free image contrast of incoherent bright-field microscopy. We illustrate the performance of this “bright-field holography” method through the snapshot imaging of bioaerosols distributed in 3D, matching the artifact-free image contrast and axial sectioning performance of a high-NA bright-field microscope. This data-driven deep-learning-based imaging method bridges the contrast gap between coherent and incoherent imaging, and enables the snapshot 3D imaging of objects with bright-field contrast from a single hologram, benefiting from the wave-propagation framework of holography.
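For context, the holographic reconstruction that such a network takes as input can be sketched as angular-spectrum back-propagation of the recorded intensity to a stack of depths; the twin-image fringes described above survive this naive step (names and parameters here are illustrative, not the paper's implementation):

```python
import numpy as np

def refocus_stack(hologram, wavelength, pixel_pitch, depths):
    """Back-propagate an in-line intensity hologram to each depth in
    `depths`, returning complex fields. Twin-image and out-of-plane
    fringes remain in this naive reconstruction; the cross-modality
    GAN is trained to remove them."""
    ny, nx = hologram.shape
    fx = np.fft.fftfreq(nx, d=pixel_pitch)
    fy = np.fft.fftfreq(ny, d=pixel_pitch)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    spectrum = np.fft.fft2(hologram.astype(complex))
    stack = []
    for z in depths:
        # Negative sign: propagate from the sensor plane back to the sample.
        H = np.where(arg > 0,
                     np.exp(-2j * np.pi * z * np.sqrt(np.maximum(arg, 0.0))),
                     0.0)
        stack.append(np.fft.ifft2(spectrum * H))
    return np.stack(stack)
```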

     
  2. Computer-generated holography (CGH) holds transformative potential for a wide range of applications, including direct-view, virtual and augmented reality, and automotive display systems. While research on holographic displays has recently made impressive progress, image quality and eye safety of holographic displays are fundamentally limited by the speckle introduced by coherent light sources. Here, we develop an approach to CGH using partially coherent sources. For this purpose, we devise a wave propagation model for partially coherent light that is demonstrated in conjunction with a camera-in-the-loop calibration strategy. We evaluate this algorithm using light-emitting diodes (LEDs) and superluminescent LEDs (SLEDs) and demonstrate improved speckle characteristics of the resulting holograms compared with coherent lasers. SLEDs in particular are demonstrated to be promising light sources for holographic display applications, because of their potential to generate sharp and high-contrast two-dimensional (2D) and 3D images that are bright, eye safe, and almost free of speckle. 
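A common first-order way to simulate partially coherent light, in the spirit of the model above, is a weighted incoherent sum of coherent propagations over the source spectrum. The sketch below uses this generic spectral decomposition, not the paper's calibrated camera-in-the-loop model, and all names are illustrative:

```python
import numpy as np

def partially_coherent_image(slm_phase, spectrum, pixel_pitch, distance):
    """Approximate the image from a partially coherent source as a
    weighted incoherent sum of coherent angular-spectrum propagations,
    one per sampled wavelength. `spectrum` maps wavelength [m] to
    relative power."""
    ny, nx = slm_phase.shape
    fx = np.fft.fftfreq(nx, d=pixel_pitch)
    fy = np.fft.fftfreq(ny, d=pixel_pitch)
    FX, FY = np.meshgrid(fx, fy)
    image = np.zeros((ny, nx))
    total = sum(spectrum.values())
    for lam, power in spectrum.items():
        field = np.exp(1j * slm_phase)      # phase-only SLM modulation
        arg = 1.0 / lam**2 - FX**2 - FY**2
        H = np.where(arg > 0,
                     np.exp(2j * np.pi * distance * np.sqrt(np.maximum(arg, 0.0))),
                     0.0)
        out = np.fft.ifft2(np.fft.fft2(field) * H)
        image += (power / total) * np.abs(out) ** 2   # intensities add incoherently
    return image
```

The broader the spectrum (LED vs. SLED vs. laser), the more terms contribute, and the more the speckle patterns of the individual wavelengths average out.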
  3. Wearable near-eye displays for virtual and augmented reality (VR/AR) have seen enormous growth in recent years. While researchers are exploring a plethora of techniques to create life-like three-dimensional (3D) objects, the role of human perception in guiding hardware development is often overlooked. An ultimate VR/AR headset must integrate the display, sensors, and processors in a compact enclosure that people can comfortably wear for a long time, while allowing a superior immersion experience and user-friendly human-computer interaction. Compared with other 3D displays, the holographic display has unique advantages in providing natural depth cues and correcting eye aberrations. It therefore holds great promise as the enabling technology for next-generation VR/AR devices. In this review, we survey the recent progress in holographic near-eye displays from a human-centric perspective.

     
  4. Abstract

    We map the 3D kinematics of the Galactic disc out to 3.5 kpc from the Sun, and within 0.75 kpc of the mid-plane of the Galaxy. To this end, we combine high-quality astrometry from Gaia EDR3 with heliocentric line-of-sight velocities from Gaia DR2 and spectroscopic surveys including APOGEE, GALAH, and LAMOST. We construct an axisymmetric model for the mean velocity field and subtract it on a star-by-star basis to obtain the residual velocity field in the Galactocentric components (Vϕ, VR, Vz) and in Vlos. The velocity residuals are quantified using the power spectrum, and we find that the peak power (A/[km s−1]) is (Aϕ, AR, Az, Alos) = (4.2, 8.5, 2.6, 4.6) in the mid-plane (|z| < 0.25 kpc), (4.0, 7.9, 3.6, 5.3) at 0.25 < |z|/[kpc] < 0.5, and (1.9, 6.9, 5.2, 6.4) at 0.5 < |z|/[kpc] < 0.75. Our results provide a detailed measurement of the streaming motion in the disc and in its individual components. We find that streaming is most significant in VR at all heights |z| probed, but is also non-negligible in the other components. Additionally, patterns in the residual velocity field overlap spatially with models for spiral arms in the Galaxy. Our simulations show that phase-mixing of disrupting spiral arms can generate such residuals in the velocity field, with the radial component dominant, just as in the real data. We also find that, with time evolution, both the amplitude and the physical scale of the residual motion decrease.
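The power-spectrum summary above can be illustrated with a toy estimator: take a one-dimensional residual velocity profile and report the peak one-sided Fourier amplitude in km/s. This is a simplified stand-in for the paper's estimator over the 2D disc, and the names are ours:

```python
import numpy as np

def peak_streaming_amplitude(v_res):
    """Peak amplitude (km/s) of periodic structure in a residual
    velocity profile, from the one-sided FFT amplitude spectrum.
    The DC term is skipped: the axisymmetric mean is already removed."""
    n = len(v_res)
    spec = np.abs(np.fft.rfft(v_res)) / n
    spec[1:] *= 2.0                  # fold negative frequencies into one side
    return spec[1:].max()
```

A pure sinusoidal residual of amplitude 8.5 km/s (the mid-plane AR value quoted above) is recovered exactly by this estimator.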

     
  5. Abstract Augmented reality (AR) devices, such as smart glasses, enable users to see both the real world and virtual images simultaneously, contributing to an immersive experience in interaction and visualization. Recently, to reduce the size and weight of smart glasses, waveguides incorporating holographic optical elements in the form of advanced grating structures have been used to provide lightweight alternatives to bulky helmet-type headsets. However, current waveguide displays often have limited resolution, efficiency, and field of view, and require complex multi-step fabrication processes with low yield. In addition, current AR displays often suffer from a vergence-accommodation conflict between the augmented and virtual images, resulting in visual fatigue and eye strain. Here we report metasurface optical elements designed and experimentally implemented as a platform solution to overcome these limitations. Through careful dispersion control of the excited propagation and diffraction modes, we design and implement a high-resolution full-color prototype via a combination of analytical-numerical simulations, nanofabrication, and device measurements. With metasurface control of the light propagation, our prototype achieves 1080-pixel resolution, a field of view of more than 40°, and an overall input-output efficiency of more than 1%, and addresses the vergence-accommodation conflict through our focal-free implementation. Furthermore, our AR waveguide uses a single metasurface-waveguide layer, aiding scalability and process-yield control.