Abstract There is a long history of using angle sensors to measure wavefronts; the best-known example is the Shack-Hartmann sensor. Compared with other wavefront-sensing methods, the angle-based approach is more broadly used in industrial applications and scientific research, an adoption attributable to its fully integrated setup, robustness, and speed. However, it suffers from a long-standing limitation: low spatial resolution, bounded by the size of the angle sensor. Here we report an angle-based wavefront sensor that overcomes this challenge. It uses an ultra-compact angle sensor built from flat optics and integrated directly on a focal-plane array. The sensor inherits all the benefits of the angle-based method while improving the spatial sampling density by over two orders of magnitude. The drastically improved resolution allows angle-based sensors to be used for quantitative phase imaging, enabling capabilities such as video-rate recording of high-resolution surface topography.
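An angle-based sensor such as the Shack-Hartmann measures the local slope of the wavefront at each lenslet and then integrates the slope field to recover the wavefront itself. A minimal numpy sketch of one standard slope-integration step (Frankot-Chellappa least-squares integration; the grid size, slope data, and function names are illustrative, not from the paper):

```python
import numpy as np

def integrate_slopes(gx, gy, dx=1.0):
    """Least-squares integration of a measured slope field into a wavefront
    (Frankot-Chellappa method, assuming periodic boundaries).
    gx, gy: local x/y wavefront slopes, as a Shack-Hartmann sensor reports
    per lenslet; dx: lenslet pitch."""
    ny, nx = gx.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    num = FX * np.fft.fft2(gx) + FY * np.fft.fft2(gy)
    den = 2j * np.pi * (FX**2 + FY**2)
    den[0, 0] = 1.0                    # the piston (mean) term is unrecoverable
    return np.fft.ifft2(num / den).real

# synthetic periodic wavefront and its analytic slopes
N = 64
n = np.arange(N)
X, Y = np.meshgrid(n, n)
w = np.sin(2 * np.pi * X / N) + np.cos(2 * np.pi * Y / N)
gx = (2 * np.pi / N) * np.cos(2 * np.pi * X / N)
gy = -(2 * np.pi / N) * np.sin(2 * np.pi * Y / N)

w_rec = integrate_slopes(gx, gy)
err = np.max(np.abs((w_rec - w_rec.mean()) - (w - w.mean())))
```

The finer the grid of slope measurements, the finer the recovered wavefront, which is why the paper's push to denser angle sampling translates directly into higher-resolution phase maps.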
WISH: wavefront imaging sensor with high resolution
Abstract Wavefront sensing is the simultaneous measurement of the amplitude and phase of an incoming optical field. Traditional wavefront sensors such as the Shack-Hartmann wavefront sensor (SHWFS) suffer from a fundamental tradeoff between spatial resolution and phase estimation accuracy and consequently can only achieve a resolution of a few thousand pixels. To break this tradeoff, we present a novel computational-imaging-based technique, namely, the Wavefront Imaging Sensor with High resolution (WISH). We replace the microlens array in the SHWFS with a spatial light modulator (SLM) and use a computational phase-retrieval algorithm to recover the incident wavefront. This wavefront sensor can measure highly varying optical fields at more than 10-megapixel resolution with fine phase estimation. To the best of our knowledge, this resolution is an order of magnitude higher than that of current noninterferometric wavefront sensors. To demonstrate the capability of WISH, we present three applications, which cover a wide range of spatial scales. First, we produce diffraction-limited reconstructions for long-distance imaging by combining WISH with a large-aperture, low-quality Fresnel lens. Second, we show the recovery of high-resolution images of objects that are obscured by scattering. Third, we show that WISH can be used as a microscope without an objective lens. Our study suggests that the design principle of WISH, which combines optical modulators and computational algorithms to sense high-resolution optical fields, enables improved capabilities in many existing applications while revealing entirely new, hitherto unexplored application areas.
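At its core, WISH acquires several intensity images of the same field, each under a different SLM phase pattern, and alternately enforces the measured amplitudes and the known SLM modulation to retrieve the field. A toy numpy sketch of that alternating-projection loop, under stated simplifications: a plain FFT stands in for the paper's Fresnel propagation to the sensor, and the field, mask count, and sizes are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 32, 8

# unknown complex field to recover (stand-in for the incident wavefront)
u_true = np.exp(1j * 2 * np.pi * rng.random((N, N)))

masks = np.exp(1j * 2 * np.pi * rng.random((K, N, N)))  # K SLM phase patterns
meas = np.abs(np.fft.fft2(u_true * masks))              # sensor amplitudes (sqrt of intensity)

def misfit(u):
    """Data mismatch between predicted and measured amplitudes."""
    return np.linalg.norm(np.abs(np.fft.fft2(u * masks)) - meas)

u = np.exp(1j * 2 * np.pi * rng.random((N, N)))  # random initial guess
err0 = misfit(u)
for _ in range(100):
    est = []
    for k in range(K):
        y = np.fft.fft2(u * masks[k])
        y = meas[k] * np.exp(1j * np.angle(y))   # enforce the measured amplitude
        est.append(np.fft.ifft2(y) / masks[k])   # undo the SLM modulation
    u = np.mean(est, axis=0)                     # average the K single-mask estimates
err1 = misfit(u)
```

Because each SLM pattern contributes a full frame of amplitude constraints, a handful of patterns overdetermines the unknown field, which is what lets WISH keep full sensor resolution instead of trading pixels for lenslets.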
- Award ID(s): 1652633
- PAR ID: 10153497
- Publisher / Repository: Nature Publishing Group
- Date Published:
- Journal Name: Light: Science & Applications
- Volume: 8
- Issue: 1
- ISSN: 2047-7538
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Diffraction-limited optical imaging through scattering media has the potential to transform many applications such as airborne and space-based imaging (through the atmosphere), bioimaging (through skin and human tissue), and fiber-based imaging (through fiber bundles). Existing wavefront shaping methods can image through scattering media and other obscurants by optically correcting wavefront aberrations using high-resolution spatial light modulators, but these methods generally require (i) guidestars, (ii) controlled illumination, (iii) point scanning, and/or (iv) static scenes and aberrations. We propose neural wavefront shaping (NeuWS), a scanning-free wavefront shaping technique that integrates maximum likelihood estimation, measurement modulation, and neural signal representations to reconstruct diffraction-limited images through strong static and dynamic scattering media without guidestars, sparse targets, controlled illumination, or specialized image sensors. We experimentally demonstrate guidestar-free, wide field-of-view, high-resolution, diffraction-limited imaging of extended, nonsparse, static and dynamic scenes captured through static and dynamic aberrations.
- Acousto-optic imaging (AOI) enables optical-contrast imaging deep inside scattering samples via localized ultrasound modulation of scattered light. While AOI allows optical investigations at depth, its imaging resolution is inherently limited by the ultrasound wavelength, prohibiting microscopic investigations. Here, we propose a computational imaging approach that achieves optical diffraction-limited imaging with a conventional AOI system. We do so by extracting diffraction-limited imaging information from speckle correlations in the conventionally detected ultrasound-modulated scattered-light fields. Specifically, since "memory-effect" speckle correlations allow estimation of the Fourier magnitude of the field inside the ultrasound focus, scanning the ultrasound focus enables robust diffraction-limited reconstruction of extended objects using ptychography (i.e., we exploit the ultrasound focus as the scanned spatial-gate probe required for ptychographic phase retrieval). Moreover, we exploit the short speckle decorrelation time of dynamic media, usually considered a hurdle for wavefront-shaping-based approaches, to improve the ptychographic reconstruction. We experimentally demonstrate noninvasive imaging of targets that extend well beyond the memory-effect range, with a 40-fold resolution improvement over conventional AOI.
- Grulkowski, Ireneusz (Ed.) Quantitative phase imaging (QPI) via digital holographic microscopy (DHM) has been widely applied in material and biological applications. The performance of DHM technologies relies heavily on computational reconstruction methods to provide accurate phase measurements. Among the optical configurations of DHM imaging systems, those operating in a non-telecentric regime are the most common; however, the spherical wavefront introduced by a non-telecentric DHM system must be compensated to provide undistorted phase measurements. The proposed reconstruction approach builds on previous work from Kemper's group. Here, we have reformulated the problem, reducing the parameters required to reconstruct phase images to the sensor pixel size and source wavelength. The computational algorithm comprises six main steps. First, the +1 diffraction order is selected in the hologram spectrum, and the interference angle is obtained from it. Second, the curvature of the spherical wavefront distorting the sample's phase map is estimated from the size of the selected +1 order in the hologram spectrum. The third and fourth steps are the spatial filtering of the +1 order and the compensation of the interference angle. The next step estimates the center of the spherical wavefront. An optional final optimization step fine-tunes the estimated parameters to provide fully compensated phase images. Because proper implementation of the framework is critical to achieving successful results, we explicitly describe the steps, including the functions and toolboxes required to reconstruct phase images without distortions. As a result, we provide open-access code and a user-interface tool that reconstructs holograms recorded in a non-telecentric DHM system with minimal user input.
- Abstract Multidimensional photography can capture optical fields beyond the capability of conventional image sensors that measure only the two-dimensional (2D) spatial distribution of light. By mapping a high-dimensional datacube of incident light onto a 2D image sensor, multidimensional photography resolves the scene along with other information dimensions, such as wavelength and time. However, the application of current multidimensional imagers is fundamentally restricted by their static optical architectures and measurement schemes: the mapping relation between the light datacube voxels and image sensor pixels is fixed. To overcome this limitation, we propose tunable multidimensional photography through active optical mapping. A high-resolution spatial light modulator, referred to as an active optical mapper, permutes and maps the light datacube voxels onto sensor pixels in an arbitrary and programmed manner. The resultant system can readily adapt the acquisition scheme to the scene, thereby maximising the measurement flexibility. Through active optical mapping, we demonstrate our approach in two niche implementations: hyperspectral imaging and ultrafast imaging.
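The reconstruction steps listed in the non-telecentric DHM entry above (select the +1 order, compensate the interference angle, filter, compensate the spherical wavefront) can be sketched end to end in numpy. This toy simulation is not the cited open-access code: for simplicity the carrier frequency and spherical curvature are taken as known, whereas the described algorithm estimates them from the location and size of the +1 order:

```python
import numpy as np

N = 256
x = np.arange(N) - N // 2
X, Y = np.meshgrid(x, x)

# --- simulate an off-axis, non-telecentric hologram ---------------------
phi = 2.0 * np.exp(-(X**2 + Y**2) / (2 * 20.0**2))   # sample phase (rad)
S = 1e-4 * (X**2 + Y**2)                             # residual spherical phase
kx, ky = 64, 48                                      # carrier (cycles per aperture)
R = np.exp(2j * np.pi * (kx * X + ky * Y) / N)       # tilted reference wave
O = np.exp(1j * (phi + S))                           # object wave
I = np.abs(O + R) ** 2                               # recorded hologram intensity

# --- reconstruction -----------------------------------------------------
# Steps 1-4: demodulate the +1 order with the carrier (assumed known here)
# and spatially filter it; the DC and -1 terms land outside the filter.
F = np.fft.fft2(I * R)
f = np.fft.fftfreq(N) * N
FX, FY = np.meshgrid(f, f)
F[FX**2 + FY**2 > 32**2] = 0          # low-pass window around the order
field = np.fft.ifft2(F)
# Steps 5-6: compensate the spherical wavefront (known from the simulation;
# the algorithm estimates its curvature and center from the order itself).
phi_rec = np.angle(field * np.exp(-1j * S))
```

If the conjugate spherical phase is omitted in the last line, the recovered map is dominated by the parabolic distortion, which is exactly the artifact the non-telecentric compensation pipeline exists to remove.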
