Title: Demonstration of a photonic-lantern focal-plane wavefront sensor using fibre mode conversion and deep learning
A focal plane wavefront sensor offers major advantages for adaptive optics, including the removal of non-common-path error and sensitivity to blind modes (such as petalling). However, simply using the observed point spread function (PSF) is not sufficient for wavefront correction, as only the intensity, not the phase, is measured. Here we demonstrate the use of a multimode fiber mode converter (photonic lantern) to directly measure the wavefront phase and amplitude at the focal plane. Starlight is injected into a multimode fiber at the image plane, and the combination of modes excited within the fiber is a function of the phase and amplitude of the incident wavefront. The fiber undergoes an adiabatic transition into a set of multiple single-mode outputs, such that the distribution of intensities among them encodes the incident wavefront. The mapping (which may be strongly non-linear) between spatial modes in the PSF and the outputs is stable but must be learned. This is done with a deep neural network, trained by applying random combinations of spatial modes to the deformable mirror. Once trained, the neural network can instantaneously predict the incident wavefront for any set of output intensities. We demonstrate the successful reconstruction of wavefronts produced in the laboratory with the low-wind effect, as well as on-sky reconstruction of low-order modes consistent with those measured by the existing pyramid wavefront sensor, using SCExAO observations at the Subaru Telescope.
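
To illustrate the reconstruction step, the sketch below shows how a small fully connected network could regress low-order modal coefficients from the lantern's single-mode output intensities. This is not the authors' implementation: the port count, mode count, layer sizes, and placeholder training data are all assumptions.

# Minimal sketch of an intensities-to-modes regressor (assumed sizes, synthetic data).
import torch
import torch.nn as nn

N_PORTS = 19   # number of single-mode lantern outputs (assumed)
N_MODES = 14   # number of low-order modes to reconstruct (assumed)

class LanternWFS(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_PORTS, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, N_MODES),   # predicted modal coefficients (radians)
        )

    def forward(self, intensities):
        # Normalise out the total flux so the sensor responds to wavefront shape,
        # not source brightness.
        x = intensities / intensities.sum(dim=-1, keepdim=True)
        return self.net(x)

model = LanternWFS()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Training pairs: random modal combinations applied to the deformable mirror and the
# corresponding measured lantern output intensities (synthetic stand-ins here).
for step in range(1000):
    applied_modes = torch.randn(64, N_MODES) * 0.1   # radians RMS, assumed scale
    intensities = torch.rand(64, N_PORTS)            # placeholder for real measurements
    loss = loss_fn(model(intensities), applied_modes)
    optimiser.zero_grad(); loss.backward(); optimiser.step()

Once trained on real (mirror command, intensity) pairs, a single forward pass then yields an instantaneous wavefront estimate from each intensity readout.
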
Award ID(s):
2109231
NSF-PAR ID:
10463461
Author(s) / Creator(s):
; ; ; ; ; ; ; ; ; ; ; ; ; ;
Editor(s):
Schmidt, Dirk; Schreiber, Laura; Vernet, Elise
Date Published:
Journal Name:
Adaptive Optics Systems VIII
Volume:
12185
Page Range / eLocation ID:
108
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract

    Photonic lanterns (PLs) are tapered waveguides that gradually transition from a multimode fiber geometry to a bundle of single-mode fibers (SMFs). They can efficiently couple multimode telescope light into a multimode fiber entrance at the focal plane and convert it into multiple single-mode beams. Each SMF thus samples its own mode (lantern principal mode) of the telescope light in the pupil, analogous to subapertures in aperture masking interferometry (AMI). Coherent imaging with PLs can be enabled by interfering the SMF outputs and applying phase modulation, which can be achieved using a photonic-chip beam combiner at the backend (e.g., the ABCD beam combiner). In this study, we investigate the potential of coherent imaging by interference of the SMF outputs of a PL with a single telescope. We demonstrate that the visibilities that can be measured from a PL are mutual intensities incident on the pupil, weighted by the cross-correlation of a pair of lantern modes. From numerically simulated lantern principal modes of a 6-port PL, we find that interferometric observables using a PL behave similarly to separated-aperture visibilities for simple models on small angular scales (<λ/D), but with greater sensitivity to symmetries and the capability to break phase-angle degeneracies. Furthermore, we present simulated observations with wavefront errors (WFEs) and compare them to AMI. Despite the redundancy caused by the extended lantern principal modes, spatial filtering offers stability against WFEs. Our simulated observations suggest that PLs may offer significant benefits in the photon-noise-limited regime and in resolving small angular scales in the low-contrast regime.
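
    To make the port-pair visibility picture concrete, the following minimal sketch (not the paper's code; the mode shapes and source model are synthetic placeholders) projects a pupil-plane field onto a set of lantern principal modes and forms pairwise products of the coupled amplitudes, which play the role of visibilities between lantern ports.

    # Minimal sketch: coupling amplitudes into lantern principal modes and their
    # pairwise products (all mode shapes here are random placeholders).
    import numpy as np

    rng = np.random.default_rng(0)
    N = 64            # pupil grid size (assumed)
    n_ports = 6       # 6-port lantern, as in the abstract

    # Stand-ins for the pupil-plane lantern principal modes B_i(u).
    modes = rng.standard_normal((n_ports, N, N)) + 1j * rng.standard_normal((n_ports, N, N))
    modes /= np.sqrt(np.sum(np.abs(modes) ** 2, axis=(1, 2), keepdims=True))

    # Incident pupil field for an off-axis point source (tilt amount is arbitrary).
    x = np.linspace(-1, 1, N)
    X, Y = np.meshgrid(x, x)
    field = np.exp(1j * 2 * np.pi * 0.3 * X)

    # Coupling amplitudes a_i = <B_i | E>; the products a_i * conj(a_j) are the
    # port-pair observables analogous to visibilities.
    amps = np.array([np.sum(np.conj(m) * field) for m in modes])
    visibilities = np.outer(amps, np.conj(amps))
    print(np.abs(visibilities))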

     
  2. We report the experimental demonstration of an optical differentiation wavefront sensor (ODWS) based on binary pixelated linear and nonlinear amplitude filtering in the far field. We trained and tested a convolutional neural network that reconstructs the spatial phase map from nonlinear-filter ODWS data, for which no analytic reconstruction algorithm is available. The network achieves accurate zonal retrieval across different wavefront magnitudes and on randomly shaped wavefronts. This work paves the way for the implementation of simultaneously sensitive, high-dynamic-range, and high-resolution wavefront sensing.
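
    As a rough illustration of the image-to-image reconstruction idea (the architecture, channel count, and image size below are assumptions, not the paper's network), a small convolutional model can map the filtered far-field intensity images to a zonal phase map on the same pixel grid.

    # Minimal sketch: convolutional mapping from ODWS intensity images to a phase map.
    import torch
    import torch.nn as nn

    class ODWSNet(nn.Module):
        def __init__(self, n_inputs=2):    # e.g. two filter orientations (assumed)
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(n_inputs, 32, 3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 1, 3, padding=1),   # zonal phase estimate (radians)
            )

        def forward(self, filtered_images):
            return self.net(filtered_images)

    # A batch of 8 measurements, two filtered images each, on a 64x64 grid.
    phase_maps = ODWSNet()(torch.rand(8, 2, 64, 64))
    print(phase_maps.shape)   # torch.Size([8, 1, 64, 64])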

     
  3. Light transport in a highly multimode fiber exhibits complex behavior in space, time, frequency, and polarization, especially in the presence of mode coupling. The newly developed techniques of spatial wavefront shaping turn out to be highly suitable to harness such enormous complexity: a spatial light modulator enables precise characterization of field propagation through a multimode fiber, and by adjusting the incident wavefront it can accurately tailor the transmitted spatial pattern, temporal profile, and polarization state. This unprecedented control leads to multimode fiber applications in imaging, endoscopy, optical trapping, and microfabrication. Furthermore, the output speckle pattern from a multimode fiber encodes spatial, temporal, spectral, and polarization properties of the input light, allowing such information to be retrieved from spatial measurements only. This article provides an overview of recent advances and breakthroughs in controlling light propagation in multimode fibers, and discusses newly emerging applications.
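
    As a toy illustration of the transmission-matrix approach described above (the matrix here is synthetic, not a measured fiber), one can probe the fiber with a basis of inputs and then phase-conjugate a single output row to concentrate the transmitted light onto a chosen output spot.

    # Minimal sketch: transmission-matrix measurement and phase conjugation through
    # a (synthetic) multimode fiber.
    import numpy as np

    rng = np.random.default_rng(1)
    n_in, n_out = 100, 100    # SLM input modes and camera output pixels (assumed)

    # Stand-in for the fiber: a random complex transmission matrix T.
    T = (rng.standard_normal((n_out, n_in)) + 1j * rng.standard_normal((n_out, n_in))) / np.sqrt(2 * n_in)

    # "Measure" T by probing with basis inputs; in the lab each probe requires
    # interferometric or phase-stepping detection of the output field.
    T_measured = T @ np.eye(n_in)

    # Phase conjugation: shape the input so the energy concentrates on output pixel k.
    k = 42
    input_field = np.conj(T_measured[k, :])
    input_field /= np.linalg.norm(input_field)
    output = T @ input_field
    enhancement = np.abs(output[k]) ** 2 / np.mean(np.abs(np.delete(output, k)) ** 2)
    print(f"focus enhancement ~ {enhancement:.0f}")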

     
  4. We present numerical characterizations of the wavefront sensing performance of few-mode photonic lantern wavefront sensors (PLWFSs). These characterizations include calculations of the throughput, control space, sensor linearity, and an estimate of the maximum linear reconstruction range for standard and hybrid lanterns with between 3 and 19 ports, at λ = 1550 nm. We additionally consider the impact of beam-shaping optics and a charge-1 vortex mask placed in the pupil plane. The former is motivated by the application of PLs to high-resolution spectroscopy, which could enable efficient injection into the spectrometer along with simultaneous focal-plane wavefront sensing; the latter is motivated by the application of PLs to vortex fiber nulling (VFN), which can simultaneously enable wavefront sensing and nulling of on-axis starlight. Overall, we find that the PLWFS setups tested in this work exhibit good linearity out to ∼0.25–0.5 radians of RMS wavefront error (WFE). Meanwhile, we estimate the maximum amount of WFE that can be handled by these sensors to be around ∼1–2 radians RMS before the sensor response becomes degenerate. In the future, we expect these limits can be pushed further by increasing the number of degrees of freedom, whether by adopting higher mode-count lanterns, dispersing the lantern outputs, or separating polarizations. Finally, we consider optimization strategies for the design of the PLWFS, which involve both modification of the lantern itself and the use of pre- and post-lantern optics such as phase masks and interferometric beam recombiners.
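
    A standard way to probe the linear regime discussed above is a push-pull interaction matrix. The sketch below (with a toy forward model standing in for a real lantern response, and assumed port and mode counts) shows the generic procedure: poke each mode, build the response matrix, and reconstruct small aberrations with its pseudoinverse.

    # Minimal sketch: push-pull interaction matrix and linear reconstruction for a
    # photonic-lantern wavefront sensor (toy forward model, assumed sizes).
    import numpy as np

    rng = np.random.default_rng(2)
    n_ports, n_modes = 6, 5    # few-mode lantern and low-order modes (assumed)

    # Toy nonlinear forward model: complex couplings followed by intensity detection.
    A = rng.standard_normal((n_ports, n_modes + 1)) + 1j * rng.standard_normal((n_ports, n_modes + 1))

    def lantern_intensities(coeffs):
        field = A @ np.exp(1j * np.concatenate(([0.0], coeffs)))   # piston-referenced phases
        return np.abs(field) ** 2

    # Push-pull calibration around the flat wavefront.
    poke = 0.05                                   # radians (assumed)
    I0 = lantern_intensities(np.zeros(n_modes))
    D = np.zeros((n_ports, n_modes))
    for j in range(n_modes):
        e = np.zeros(n_modes); e[j] = poke
        D[:, j] = (lantern_intensities(e) - lantern_intensities(-e)) / (2 * poke)

    R = np.linalg.pinv(D)                         # linear reconstructor

    true_coeffs = rng.normal(0, 0.1, n_modes)     # small WFE, within the linear range
    estimate = R @ (lantern_intensities(true_coeffs) - I0)
    print(np.round(true_coeffs, 3), np.round(estimate, 3))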

     
  5. Typical methods to holographically encode arbitrary wavefronts assume the hologram medium only applies either phase shifts or amplitude attenuation to the wavefront. In many cases, phase cannot be introduced to the wavefront without also affecting the amplitude. Here we show how to encode an arbitrary wavefront into an off-axis transmission hologram that returns the exact desired arbitrary wavefunction in a diffracted beam for phase-only, amplitude-only, or mixed phase and amplitude holograms with any periodic groove profile. We apply this to design thin holograms for electrons in a TEM, but our results are generally applicable to light and X-ray optics. We employ a phase reconstruction from a series of focal plane images to qualitatively show the accuracy of this method to impart the expected amplitude and phase to a specific diffraction order.
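
    One simple instance of such an encoding (shown here with sinusoidal grooves and the standard first-order Bessel-function relation; the paper's prescription covers arbitrary periodic groove profiles, so treat the specifics below as assumptions) modulates the local groove depth to set the diffracted amplitude and offsets the grooves to set the diffracted phase.

    # Minimal sketch: encode a target amplitude and phase into the +1 order of an
    # off-axis sinusoidal phase hologram.
    import numpy as np
    from scipy.special import j1

    N = 256
    x = np.linspace(-1, 1, N)
    X, Y = np.meshgrid(x, x)

    # Target amplitude and phase for the diffracted beam (arbitrary example).
    amp = np.exp(-(X ** 2 + Y ** 2) / 0.3)
    phase = np.pi * X * Y

    # For a sinusoidal phase grating exp(i*d*sin(theta)), the first-order amplitude
    # is J1(d); invert J1 (monotonic up to its peak near d = 1.84) via a lookup table.
    d_grid = np.linspace(0.0, 1.84, 2000)
    depth = np.interp(amp / amp.max() * j1(1.84), j1(d_grid), d_grid)

    # Off-axis carrier along x with period Lambda, offset locally by the target phase;
    # the +1 diffraction order then carries amplitude J1(depth) and the target phase.
    Lambda = 0.05                                 # groove period in grid units (assumed)
    hologram = np.exp(1j * depth * np.sin(2 * np.pi * X / Lambda + phase))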

     