Title: Holography for automotive applications: from HUD to LIDAR
Holography can offer unique solutions to the specific problems faced by automotive optical systems. Frequently, when the possibilities of refractive and reflective designs have been exhausted, diffraction can come to the rescue by opening a new dimension to explore. Holographic optical elements (HOEs), for example, are thin-film optics that can advantageously replace lenses, prisms, or mirrors. Head-up displays (HUDs) and LIDAR for autonomous vehicles are two of the systems where our group has used HOEs to provide original answers to the limitations of classical optics. With HUD, HOEs address the limited field of view and small eye box usually found in projection systems. Our approach is to recycle the light multiple times inside a waveguide so the combiner can be as large as the entire windshield. In this system, one hologram injects a small image at one end of a waveguide, and another hologram extracts the image several times, providing an expanded eye box. In the case of LIDAR systems, non-mechanical beam scanning based on a diffractive spatial light modulator (SLM) can only achieve an angular range of a few degrees. We used multiplexed volume holograms (VH) to amplify the initial diffraction angle from the SLM and achieve up to 4π steradian coverage in a compact form factor.
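As a rough illustration of this angle-amplification idea, the sketch below applies the planar grating equation to show how a hologram can remap a few degrees of SLM steering into a much wider output band. The wavelength, grating period, and SLM steering range are assumed values for illustration only, not parameters from the paper; in practice, several multiplexed holograms would each cover a different output band.

# Minimal sketch (assumed parameters): angular magnification of an SLM steering
# range by a diffraction grating, via sin(theta_out) = sin(theta_in) + m*lambda/Lambda.
import numpy as np

wavelength = 905e-9        # m, illustrative near-infrared LIDAR wavelength (assumed)
grating_period = 1.0e-6    # m, illustrative hologram fringe period (assumed)

def diffracted_angle(theta_in_deg, m=1):
    """First-order output angle for a given input angle, in degrees."""
    s = np.sin(np.radians(theta_in_deg)) + m * wavelength / grating_period
    return np.degrees(np.arcsin(s))

# An SLM that only steers over +/- 3 degrees...
slm_angles = np.linspace(-3, 3, 7)
# ...is redirected by the hologram into a band spanning roughly 58 to 73 degrees.
print([round(diffracted_angle(a), 1) for a in slm_angles])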
Award ID(s):
1640329
PAR ID:
10089865
Author(s) / Creator(s):
; ; ; ; ; ;
Date Published:
Journal Name:
Optical Data Storage 2018: Industrial Optical Devices and Systems
Volume:
10757
Page Range / eLocation ID:
11
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    Holography is perhaps the only method demonstrated so far that can achieve a wide field of view (FOV) and a compact eyeglass-style form factor for augmented reality (AR) near-eye displays (NEDs). Unfortunately, the eyebox of such NEDs is impractically small (< 1 mm). In this paper, we introduce and demonstrate a design for holographic NEDs with a practical, wide eyebox of ~10 mm and without any moving parts, based on holographic lenslets. In our design, a holographic optical element (HOE) based on a lenslet array was fabricated as the image combiner with expanded eyebox. A phase spatial light modulator (SLM) alters the phase of the incident laser light projected onto the HOE combiner such that the virtual image can be perceived at different focus distances, which can reduce the vergence-accommodation conflict (VAC). We have successfully implemented a bench-top prototype following the proposed design. The experimental results show effective eyebox expansion to a size of ~10 mm. With further work, we hope that these design concepts can be incorporated into eyeglass-size NEDs.
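    A minimal sketch of how a phase-only SLM can place the virtual image at different focus distances: displaying a wrapped quadratic (lens) phase adds optical power to the projected wavefront. The wavelength, pixel pitch, and focal lengths below are assumed for illustration and are not parameters of this prototype.

    import numpy as np

    wavelength = 532e-9    # m, illustrative laser wavelength (assumed)
    pixel_pitch = 8e-6     # m, typical phase-SLM pixel size (assumed)
    n_pix = 1080           # square region of the SLM used for the pattern

    def lens_phase(focal_length_m):
        """Wrapped quadratic phase phi(r) = -pi*r^2/(lambda*f) sampled on the SLM grid."""
        coords = (np.arange(n_pix) - n_pix / 2) * pixel_pitch
        x, y = np.meshgrid(coords, coords)
        phi = -np.pi * (x**2 + y**2) / (wavelength * focal_length_m)
        return np.mod(phi, 2 * np.pi)   # phase-only SLM displays values wrapped to [0, 2*pi)

    # Two different phase patterns -> two different perceived focus distances.
    pattern_a = lens_phase(0.5)
    pattern_b = lens_phase(2.0)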
  2. We report the simulation of an adaptive interferometric null test using a high-definition phase-only spatial light modulator (SLM) to measure form and mid-spatial frequencies of a freeform mirror with a sag departure of 150 μm from its base sphere. A state-of-the-art commercial SLM is modeled as a reconfigurable phase computer-generated hologram (CGH) that generates a nulling phase function with close to an order of magnitude higher amplitude than deformable mirrors. The theoretical uncertainty in form measurement arising from pixelation and phase quantization of the SLM is 50.62 nm RMS. The calibration requirements for hardware implementation are detailed. © 2019 Optical Society of America. https://doi.org/10.1364/OL.44.002000
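    The quantization part of that uncertainty can be illustrated with a short simulation: quantize a continuous nulling phase to the SLM's discrete gray levels and compute the residual RMS wavefront error. The wavelength and level count are assumed here, and this toy model ignores the pixelation contribution included in the paper's 50.62 nm figure.

    import numpy as np

    wavelength_nm = 632.8   # test wavelength (assumed)
    levels = 256            # 8-bit phase quantization (assumed)

    rng = np.random.default_rng(0)
    true_phase = rng.uniform(0, 2 * np.pi, size=1_000_000)   # stand-in for a nulling CGH phase

    step = 2 * np.pi / levels
    quantized = np.round(true_phase / step) * step

    residual_waves = (quantized - true_phase) / (2 * np.pi)
    # RMS error is approximately (lambda / levels) / sqrt(12), well below a nanometer here.
    print(f"Quantization-only RMS error: {np.std(residual_waves) * wavelength_nm:.3f} nm")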
  3. Frequency-modulated continuous-wave laser ranging (FMCW LiDAR) enables distance mapping with simultaneous position and velocity information, is immune to stray light, can achieve long range, operates in the eye-safe region around 1550 nm, and achieves high sensitivity. Despite these advantages, it is constrained by the simultaneous requirement for narrow-linewidth, low-noise lasers that can also be precisely chirped. While integrated silicon-based lasers, compatible with wafer-scale manufacturing in large volumes at low cost, have experienced major advances and are now employed on a commercial scale in data centers, and impressive progress has led to integrated lasers with (ultra-)narrow, sub-100 Hz intrinsic linewidth based on optical feedback from photonic circuits, these lasers presently lack fast non-thermal tuning, i.e. the frequency agility required for coherent ranging. Here, we demonstrate a hybrid photonic integrated laser that exhibits a very narrow intrinsic linewidth of 25 Hz while offering linear, hysteresis-free, and mode-hop-free tuning beyond 1 GHz with up to megahertz actuation bandwidth, constituting a tuning speed of 1.6 × 10¹⁵ Hz/s. Our approach uses foundry-based technologies: ultralow-loss (1 dB/m) Si3N4 photonic microresonators combined with aluminium nitride (AlN) or lead zirconate titanate (PZT) microelectromechanical systems (MEMS) based stress-optic actuation. Electrically driven low-phase-noise lasing is attained by self-injection locking of an indium phosphide (InP) laser chip and is only limited by fundamental thermo-refractive noise at mid-range offsets. By utilizing difference-drive and apodization of the photonic chip to suppress mechanical vibrations, a flat actuation response up to 10 MHz is achieved. We leverage this capability to demonstrate a compact coherent LiDAR engine that can generate up to 800 kHz FMCW triangular optical chirp signals, requiring neither active linearization nor predistortion compensation, and perform a 10 m optical ranging experiment with a resolution of 12.5 cm. Our results constitute a photonic integrated laser system for scenarios where high compactness, fast frequency actuation, and high spectral purity are required.
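    For reference, the basic FMCW ranging relation links the measured beat frequency to target distance through the chirp rate, and the range resolution to the chirp bandwidth. The sketch below uses the 1.6 × 10¹⁵ Hz/s tuning speed quoted above together with an assumed 1.2 GHz frequency excursion (which would give the quoted 12.5 cm resolution); it is a back-of-the-envelope illustration, not the authors' processing chain.

    C = 299_792_458.0        # speed of light, m/s

    chirp_rate = 1.6e15      # Hz/s, tuning speed from the abstract
    bandwidth = 1.2e9        # Hz, assumed optical frequency excursion

    def range_from_beat(f_beat_hz):
        """Round-trip delay tau = 2R/c produces a beat frequency f_beat = chirp_rate * tau."""
        return C * f_beat_hz / (2 * chirp_rate)

    range_resolution = C / (2 * bandwidth)     # ~0.125 m for 1.2 GHz of bandwidth
    print(range_from_beat(1.07e8))             # a ~107 MHz beat corresponds to a ~10 m target
    print(range_resolution)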
  4. Active depth sensing achieves robust depth estimation but is usually limited by the sensing range. Naively increasing the optical power can improve sensing range but induces eye-safety concerns for many applications, including autonomous robots and augmented reality. In this paper, we propose an adaptive active depth sensor that jointly optimizes range, power consumption, and eye-safety. The main observation is that we need not project light patterns to the entire scene but only to small regions of interest where depth is necessary for the application and passive stereo depth estimation fails. We theoretically compare this adaptive sensing scheme with other sensing strategies, such as full-frame projection, line scanning, and point scanning. We show that, to achieve the same maximum sensing distance, the proposed method consumes the least power while having the shortest (best) eye-safety distance. We implement this adaptive sensing scheme with two hardware prototypes, one with a phase-only spatial light modulator (SLM) and the other with a micro-electro-mechanical (MEMS) mirror and diffractive optical elements (DOE). Experimental results validate the advantage of our method and demonstrate its capability of acquiring higher quality geometry adaptively. Please see our project website for video results and code: https://btilmon.github.io/e3d.html
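    A back-of-the-envelope version of the power argument: if the projector must deliver the same irradiance at the target to reach a given maximum distance, the required optical power scales with the fraction of the field of view actually illuminated. The resolutions and the 2 W full-frame budget below are assumed purely for illustration.

    full_fov_pixels = 1280 * 720    # full-frame pattern resolution (assumed)
    roi_pixels = 64 * 64            # small region of interest where stereo fails (assumed)

    full_frame_power_w = 2.0        # assumed optical power budget for full-frame projection
    adaptive_power_w = full_frame_power_w * roi_pixels / full_fov_pixels

    # Projecting only the ROI needs roughly 1/200th of the full-frame power in this toy case.
    print(f"Adaptive ROI projection: ~{adaptive_power_w * 1000:.0f} mW vs {full_frame_power_w:.1f} W full frame")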
  5. Yousefi, Reza (Ed.)
    Lenses are vital components of well-functioning eyes and are crafted through the precise arrangement of proteins to achieve transparency and refractive ability. In addition to optical clarity for minimal scatter and absorption, proper placement of the lens within the eye is equally important for the formation of sharp, focused images on the retina. Maintaining these states is challenging due to dynamic and substantial post-embryonic eye and lens growth. Here, we gain insights into the required processes by exploring the optical and visual consequences of silencing a key lens constituent in Thermonectus marmoratus sunburst diving beetle larvae. Using RNAi, we knocked down Lens3, a widely expressed cuticular lens protein, during a period of substantial growth of their camera-type principal eyes. We show that lens3 RNAi results in the formation of opacities reminiscent of vertebrate lens 'cataracts', causing the projection of blurry and degraded images. The consequences are exacerbated in low-light conditions, evidenced by impaired hunting behaviour in this visually guided predator. Notably, lens focal lengths remained unchanged, suggesting that power and overall structure are preserved despite the absence of this major component. Further, we did not detect significant shifts in the in-vivo refractive states of cataract-afflicted larvae. This is in stark contrast with findings in vertebrates, in which form deprivation or the attenuation of image contrast results in the dysregulation of eye growth, causing refractive errors such as myopia. Our results provide insights into arthropod lens construction and align with previous findings which point towards visual input being inconsequential for maintaining correctly focused eyes in this group. Our findings highlight the utility of T. marmoratus as a tractable model system to probe the aetiology of lens cataracts and refractive errors.