Title: Development of a fast calibration method for image mapping spectrometry

An image mapping spectrometer (IMS) is a snapshot hyperspectral imager that simultaneously captures both the spatial (x,y) and spectral (λ) information of incoming light. The IMS maps a three-dimensional (3D) datacube (x,y,λ) to a two-dimensional (2D) detector array (x,y) for parallel measurement. To reconstruct the original 3D datacube, one must construct a lookup table that connects voxels in the datacube and pixels in the raw image. Previous calibration methods suffer from either low speed or poor image quality. We herein present a slit-scan calibration method that can significantly reduce the calibration time while maintaining high accuracy. Moreover, we quantitatively analyzed the major artifact in the IMS, the striped image, and developed three numerical methods to correct for it.
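As a rough illustration of the lookup-table step described above, the sketch below shows how a calibrated voxel-to-pixel table might be applied to a single raw frame. It is a minimal sketch, not the authors' code: the array shapes, the names, and the way the table is produced (e.g., by the slit-scan calibration) are assumptions made for the example.

```python
import numpy as np

# Minimal sketch of lookup-table datacube reconstruction (illustrative only;
# shapes and the calibration procedure itself are assumptions, not the
# paper's implementation).
def reconstruct_datacube(raw_frame, lut):
    # raw_frame: 2D detector image, shape (H, W)
    # lut:       integer array, shape (NX, NY, NL, 2); lut[x, y, l] holds the
    #            (row, col) detector pixel assigned to datacube voxel (x, y, lambda)
    rows = lut[..., 0]
    cols = lut[..., 1]
    return raw_frame[rows, cols]        # gather pixels into an (NX, NY, NL) cube

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    raw = rng.random((512, 512))                        # stand-in detector frame
    lut = rng.integers(0, 512, size=(64, 64, 16, 2))    # stand-in calibration table
    print(reconstruct_datacube(raw, lut).shape)         # (64, 64, 16)
```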

 
NSF-PAR ID: 10168906
Author(s) / Creator(s):
Publisher / Repository: Optical Society of America
Date Published:
Journal Name: Applied Optics
Volume: 59
Issue: 20
ISSN: 1559-128X; APOPAI
Page Range / eLocation ID: Article No. 6062
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. In this Letter, we present a snapshot hyperspectral light field imaging system using a single camera. By integrating an unfocused light field camera with a snapshot hyperspectral imager, the image mapping spectrometer, we captured a five-dimensional (5D) (x,y,u,v,λ) (x,y, spatial coordinates; u,v, emittance angles; λ, wavelength) datacube in a single camera exposure. The corresponding volumetric image (x,y,z) at each wavelength is then computed through a scale-depth space transform. We demonstrated the snapshot advantage of our system by imaging the spectral-volumetric scenes in real time.

     
  2. Compressed ultrafast photography (CUP) is a computational optical imaging technique that can capture transient dynamics at an unprecedented speed. Currently, the image reconstruction of CUP relies on iterative algorithms, which are time-consuming and often yield nonoptimal image quality. To solve this problem, we develop a deep-learning-based method for CUP reconstruction that substantially improves the image quality and reconstruction speed. A key innovation toward efficient deep learning reconstruction of a large three-dimensional (3D) event datacube (x,y,t) (x,y, spatial coordinates; t, time) is that we decompose the original datacube into massively parallel two-dimensional (2D) imaging subproblems, which are much simpler to solve by a deep neural network. We validated our approach on simulated and experimental data. (A schematic sketch of this slice-wise decomposition appears after this list.)

     
  3. A lens performs an approximately one-to-one mapping from the object to the image plane. This mapping in the image plane is maintained within a depth of field (or referred to as depth of focus, if the object is at infinity). This necessitates refocusing of the lens when the images are separated by distances larger than the depth of field. Such refocusing mechanisms can increase the cost, complexity, and weight of imaging systems. Here we show that by judicious design of a multi-level diffractive lens (MDL) it is possible to drastically enhance the depth of focus by over 4 orders of magnitude. Using such a lens, we are able to maintain focus for objects that are separated by as large a distance as ~6 m in our experiments. Specifically, when illuminated by collimated light at λ = 0.85 µm, the MDL produced a beam, which remained in focus from 5 to 1200 mm. The measured full width at half-maximum of the focused beam varied from 6.6 µm (5 mm away from the MDL) to 524 µm (1200 mm away from the MDL). Since the side lobes were well suppressed and the main lobe was close to the diffraction limit, imaging with a horizontal × vertical field of view of 40° × 30° over the entire focal range was possible. This demonstration opens up a new direction for lens design, where by treating the phase in the focal plane as a free parameter, extreme-depth-of-focus imaging becomes possible.

     
  4. One of the top priorities in observational astronomy is the direct imaging and characterization of extrasolar planets (exoplanets) and planetary systems. Direct images of rocky exoplanets are of particular interest in the search for life beyond the Earth, but they tend to be rather challenging targets since they are orders of magnitude dimmer than their host stars and are separated by small angular distances that are comparable to the classical λ/D diffraction limit, even for the coming generation of 30 m class telescopes. Current and planned efforts for ground-based direct imaging of exoplanets combine high-order adaptive optics (AO) with a stellar coronagraph observing at wavelengths ranging from the visible to the mid-IR. The primary barrier to achieving high contrast with current direct imaging methods is quasi-static speckles, caused largely by non-common path aberrations (NCPAs) in the coronagraph optical train. Recent work has demonstrated that millisecond imaging, which effectively “freezes” the atmosphere’s turbulent phase screens, should allow the wavefront sensor (WFS) telemetry to be used as a probe of the optical system to measure NCPAs. Starting with a realistic model of a telescope with an AO system and a stellar coronagraph, this paper provides simulations of several closely related regression models that take advantage of millisecond telemetry from the WFS and the coronagraph’s science camera. The simplest regression model, called the naïve estimator, does not treat the noise and other sources of information loss in the WFS. Despite its flaws, in one of the simulations presented herein, the naïve estimator provides a useful estimate of an NCPA of ~0.5 radian RMS (≈ λ/13), with an accuracy of ~0.06 radian RMS in 1 min of simulated sky time on a magnitude 8 star. The bias-corrected estimator generalizes the regression model to account for the noise and information loss in the WFS. A simulation of the bias-corrected estimator with 4 min of sky time included an NCPA of ~0.05 radian RMS (≈ λ/130) and an extended exoplanet scene. The joint regression of the bias-corrected estimator simultaneously achieved an NCPA estimate with an accuracy of ~5 × 10⁻³ radian RMS and an estimate of the exoplanet scene that was free of the self-subtraction artifacts typically associated with differential imaging. The 5σ contrast achieved by imaging of the exoplanet scene was ~1.7 × 10⁻⁴ at a distance of 3λ/D from the star and ~2.1 × 10⁻⁵ at 10λ/D. These contrast values are comparable to the very best on-sky results obtained from multi-wavelength observations that employ both angular differential imaging (ADI) and spectral differential imaging (SDI). This comparable performance is achieved even though our simulations are quasi-monochromatic, which makes SDI impossible, and include no diurnal field rotation, which makes ADI impossible. The error covariance matrix of the joint regression shows substantial correlations in the exoplanet and NCPA estimation errors, indicating that exoplanet intensity and NCPA need to be estimated self-consistently to achieve high contrast. (A toy least-squares illustration of the naïve-estimator idea appears after this list.)

     
  5. We report a method to generate angularly polarized vector beams with a topological charge of one by rotating air holes to form two-dimensional photonic crystal (PC) cavities. The mode volume and resonance wavelength of these cavities are tuned from 0.33(λ/n)³ to 12(λ/n)³ and over a wide range of 400 nm, respectively, by controlling the range of fixed air holes near the center of the structure. As a benefit, the half-maximum divergence angles of the vector beam can be widely changed from 90° to ~60°. By adjusting the shift direction of the air holes in the PC cavities, optical vector beams with different far-field morphology are obtained. The scheme provides not only an alternative method to generate optical vector beams, but also an effective strategy to control far-field morphology and polarization, which holds promise for applications such as optical microscopy and micro-manipulation.

     
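Regarding related record 2 above: the decomposition of a 3D (x,y,t) datacube into independent 2D subproblems can be sketched generically as below. This is only an illustration of the slice-wise idea; the actual CUP measurement model and trained network are not given in the abstract, and solve_2d is a hypothetical stand-in.

```python
import numpy as np

# Illustrative sketch only: a 3D event datacube reconstructed as many
# independent 2D subproblems. solve_2d is a placeholder for a per-slice
# reconstruction model (e.g., a trained 2D network), which is assumed here.
def solve_2d(measurement_2d):
    # Placeholder per-slice reconstruction; a trained 2D network would go here.
    return measurement_2d

def reconstruct_cube(measurements):
    # measurements: array of shape (T, H, W), one 2D subproblem per time slice
    slices = [solve_2d(m) for m in measurements]   # trivially parallelizable
    return np.stack(slices, axis=-1)               # reassembled (H, W, T) datacube

if __name__ == "__main__":
    cube = reconstruct_cube(np.zeros((50, 256, 256)))
    print(cube.shape)   # (256, 256, 50)
```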
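Regarding related record 4 above: the flavor of a naïve regression estimate, which pools many frames of telemetry while ignoring noise statistics, can be conveyed with a generic least-squares toy. The linear measurement model and every name below are assumptions for illustration, not the paper's estimator.

```python
import numpy as np

# Toy illustration only: estimate a static coefficient vector (standing in for
# an NCPA) from many noisy telemetry frames via ordinary least squares,
# ignoring noise statistics and information loss. The model y_k = A_k @ ncpa
# + noise is an assumption made for this sketch.
def naive_estimate(A_frames, y_frames):
    A = np.vstack(A_frames)          # stack per-frame design matrices
    y = np.concatenate(y_frames)     # stack per-frame measurements
    est, *_ = np.linalg.lstsq(A, y, rcond=None)
    return est

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    ncpa = rng.normal(size=10)                                  # "true" coefficients
    A_frames = [rng.normal(size=(20, 10)) for _ in range(100)]  # synthetic telemetry
    y_frames = [A @ ncpa + 0.1 * rng.normal(size=20) for A in A_frames]
    print(np.round(naive_estimate(A_frames, y_frames) - ncpa, 3))  # small residuals
```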