In this Letter, we present a snapshot hyperspectral light field imaging system using a single camera. By integrating an unfocused light field camera with a snapshot hyperspectral imager, the image mapping spectrometer, we captured the five-dimensional (5D) (x, y, u, v, λ) information of incoming light.
- NSF-PAR ID:
- 10132588
- Publisher / Repository:
- Optical Society of America
- Date Published:
- Journal Name:
- Optics Letters
- Volume:
- 45
- Issue:
- 3
- ISSN:
- 0146-9592; OPLEDP
- Page Range / eLocation ID:
- Article No. 772
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
An image mapping spectrometer (IMS) is a snapshot hyperspectral imager that simultaneously captures both the spatial (x, y) and spectral (λ) information of incoming light. The IMS maps a three-dimensional (3D) datacube (x, y, λ) to a two-dimensional (2D) detector array (u, v) for parallel measurement. To reconstruct the original 3D datacube, one must construct a lookup table that connects voxels in the datacube to pixels in the raw image. Previous calibration methods suffer from either low speed or poor image quality. We herein present a slit-scan calibration method that can significantly reduce the calibration time while maintaining high accuracy. Moreover, we quantitatively analyzed the major artifact in the IMS, the striped image, and developed three numerical methods to correct for it.
-
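The voxel-to-pixel lookup table described in the IMS abstract above can be sketched in a few lines. The array sizes and the one-to-one `lut` mapping below are hypothetical stand-ins for real calibration data, not the paper's actual calibration method:

```python
import numpy as np

# Hypothetical dimensions for a small IMS datacube (x, y, lambda)
NX, NY, NL = 16, 16, 8          # spatial and spectral sampling
DET_H, DET_W = 64, 64           # 2D detector array

rng = np.random.default_rng(0)

# A calibration lookup table assigns each datacube voxel a detector pixel.
# Here we fabricate a one-to-one mapping purely for illustration.
flat_pix = rng.permutation(DET_H * DET_W)[: NX * NY * NL]
lut = np.stack(np.unravel_index(flat_pix, (DET_H, DET_W)), axis=-1)
lut = lut.reshape(NX, NY, NL, 2)  # voxel (x, y, l) -> pixel (row, col)

def forward(cube, lut):
    """Map a 3D datacube onto the 2D detector (the parallel measurement)."""
    raw = np.zeros((DET_H, DET_W))
    r, c = lut[..., 0], lut[..., 1]
    raw[r, c] = cube
    return raw

def reconstruct(raw, lut):
    """Invert the mapping using the calibrated lookup table."""
    r, c = lut[..., 0], lut[..., 1]
    return raw[r, c]

cube = rng.random((NX, NY, NL))
raw = forward(cube, lut)
recovered = reconstruct(raw, lut)
assert np.allclose(recovered, cube)
```

The point of the calibration step is exactly to measure `lut` for the physical instrument; once it is known, reconstruction is a single gather operation.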
One of the top priorities in observational astronomy is the direct imaging and characterization of extrasolar planets (exoplanets) and planetary systems. Direct images of rocky exoplanets are of particular interest in the search for life beyond the Earth, but they tend to be rather challenging targets since they are orders of magnitude dimmer than their host stars and are separated from them by small angular distances, comparable to the classical diffraction limit even for the coming generation of 30 m class telescopes. Current and planned efforts for ground-based direct imaging of exoplanets combine high-order adaptive optics (AO) with a stellar coronagraph observing at wavelengths ranging from the visible to the mid-IR. The primary barrier to achieving high contrast with current direct imaging methods is quasi-static speckles, caused largely by non-common-path aberrations (NCPAs) in the coronagraph optical train. Recent work has demonstrated that millisecond imaging, which effectively “freezes” the atmosphere’s turbulent phase screens, should allow the wavefront sensor (WFS) telemetry to be used as a probe of the optical system to measure NCPAs. Starting with a realistic model of a telescope with an AO system and a stellar coronagraph, this paper provides simulations of several closely related regression models that take advantage of millisecond telemetry from the WFS and the coronagraph’s science camera. The simplest regression model, called the naïve estimator, does not treat the noise and other sources of information loss in the WFS. Despite its flaws, in one of the simulations presented herein, the naïve estimator provides a useful estimate of an NCPA of radian RMS ( ), with an accuracy of radian RMS, in 1 min of simulated sky time on a magnitude 8 star. The bias-corrected estimator generalizes the regression model to account for the noise and information loss in the WFS. A simulation of the bias-corrected estimator with 4 min of sky time included an NCPA of radian RMS ( ) and an extended exoplanet scene. The joint regression of the bias-corrected estimator simultaneously achieved an NCPA estimate with an accuracy of radian RMS and an estimate of the exoplanet scene that was free of the self-subtraction artifacts typically associated with differential imaging. The contrast achieved by imaging of the exoplanet scene was at a distance of from the star and at .
These contrast values are comparable to the very best on-sky results obtained from multi-wavelength observations that employ both angular differential imaging (ADI) and spectral differential imaging (SDI). This comparable performance is achieved despite the fact that our simulations are quasi-monochromatic, which makes SDI impossible, and have no diurnal field rotation, which makes ADI impossible. The error covariance matrix of the joint regression shows substantial correlations in the exoplanet and NCPA estimation errors, indicating that the exoplanet intensity and the NCPA need to be estimated self-consistently to achieve high contrast.
-
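The naïve estimator described in the exoplanet-imaging abstract above is, at heart, a least-squares regression of many millisecond telemetry frames against unknown aberration coefficients. A minimal sketch, assuming a toy linear measurement model with a random (rather than coronagraph-derived) Jacobian, and ignoring the WFS noise treatment that the bias-corrected estimator adds:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear model: each millisecond frame t yields pixel intensities
# y_t = A_t @ a + noise, where a is the unknown NCPA coefficient vector
# and A_t changes frame to frame with the measured residual wavefront.
# A_t here is random; the paper derives it from a coronagraph model.
n_modes, n_pix, n_frames = 12, 40, 2000
a_true = 0.01 * rng.standard_normal(n_modes)   # small NCPA (radians)

AtA = np.zeros((n_modes, n_modes))
Aty = np.zeros(n_modes)
for _ in range(n_frames):
    A_t = rng.standard_normal((n_pix, n_modes))
    y_t = A_t @ a_true + 0.05 * rng.standard_normal(n_pix)
    AtA += A_t.T @ A_t          # accumulate the normal equations
    Aty += A_t.T @ y_t

a_hat = np.linalg.solve(AtA, Aty)   # "naive" least-squares estimate
rms_err = np.sqrt(np.mean((a_hat - a_true) ** 2))
print(rms_err)  # shrinks as more frames are accumulated
```

The key property the abstract relies on is that the time-varying `A_t` makes the NCPA observable at all: a static measurement matrix could not separate the NCPA from the exoplanet scene.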
An optical parametric oscillator (OPO) is developed and characterized for the simultaneous generation of ultraviolet (UV) and near-UV nanosecond laser pulses for single-shot Rayleigh scattering and planar laser-induced fluorescence (PLIF) imaging of methylidyne (CH) and nitric oxide (NO) in turbulent flames. The OPO is pumped by a multichannel, 8-pulse Nd:YAG laser cluster that produces up to 225 mJ/pulse at 355 nm with a pulse spacing of 100 µs. The pulsed OPO has a conversion efficiency of 9.6% to the signal wavelength of when pumped by the multimode laser. Second harmonic conversion of the signal, with 3.8% efficiency, is used for the electronic excitation of the A-X (1,0) band of NO at , while the residual signal at 430 nm is used for direct excitation of the A-X (0,0) band of the CH radical and for elastic Rayleigh scattering. The selection of the OPO signal wavelength for simultaneous CH and NO PLIF imaging is performed with consideration of the pulse energy, interference from the reactant and product species, and the fluorescence signal intensity. The excitation wavelengths of 430.7 nm and 215.35 nm are studied in a laminar, premixed –air flame. Single-shot CH and NO PLIF and Rayleigh scattering imaging is demonstrated in a turbulent diffusion flame using a high-speed intensified CMOS camera. Analysis of the complementary Rayleigh scattering and CH and NO PLIF images enables identification and quantification of the high-temperature flame layers, the combustion product zones, and the fuel-jet core. Considerations for extension to simultaneous, 10-kHz-rate acquisition are discussed.
-
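The per-pulse energies implied by the efficiencies quoted in the OPO abstract above can be checked with simple arithmetic. Note that the abstract does not state whether the 3.8% second-harmonic efficiency is quoted relative to the pump or to the signal, so both readings are computed:

```python
pump_mj = 225.0          # 355 nm pump energy per pulse (from the abstract)
eta_signal = 0.096       # OPO conversion efficiency to the signal
eta_shg = 0.038          # second-harmonic conversion efficiency

signal_mj = pump_mj * eta_signal          # signal pulse energy, ~21.6 mJ
uv_if_vs_pump = pump_mj * eta_shg         # UV energy if 3.8% is vs. the pump
uv_if_vs_signal = signal_mj * eta_shg     # UV energy if 3.8% is vs. the signal
print(signal_mj, uv_if_vs_pump, uv_if_vs_signal)
```

Either reading leaves on the order of a millijoule or more in the UV, consistent with the abstract's claim that single-shot NO PLIF at 215.35 nm is feasible.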
Electro-optic quantum coherent interfaces map the amplitude and phase of a quantum signal directly to the phase or intensity of a probe beam. At terahertz frequencies, a fundamental challenge is not only to sense such weak signals (due to the weak coupling with a near-infrared probe) but also to resolve them in the time domain. Cavity confinement of both light fields can increase the interaction and achieve strong coupling; using this approach, current realizations are limited to low microwave frequencies. Alternatively, in bulk crystals, electro-optic sampling has been shown to reach quantum-level sensitivity to terahertz waves, yet the coupling strength was extremely weak. Here, we propose an on-chip architecture that concomitantly provides subcycle temporal resolution and extreme sensitivity, sensing terahertz intracavity fields below 20 V/m. We use guided femtosecond pulses in the near-infrared and confinement of the terahertz wave to a volume of in combination with ultraperformant organic molecules ( ), and accomplish a record-high single-photon electro-optic coupling rate of , 10,000 times higher than in recent reports of sensing vacuum field fluctuations in bulk media. Via homodyne detection implemented directly on chip, the interaction results in an intensity modulation of the femtosecond pulses. The single-photon cooperativity is , and the multiphoton cooperativity is at room temperature. We show dynamic range in intensity at 500 ms integration under irradiation with a weak coherent terahertz field. Similar devices could be employed in future measurements of quantum states in the terahertz at the standard quantum limit, or for the entanglement of subsystems on subcycle temporal scales, such as terahertz and near-infrared quantum bits.
-
A lens performs an approximately one-to-one mapping from the object plane to the image plane. This mapping is maintained within a depth of field (or depth of focus, if the object is at infinity), which necessitates refocusing of the lens when images are separated by distances larger than the depth of field. Such refocusing mechanisms can increase the cost, complexity, and weight of imaging systems. Here we show that by judicious design of a multi-level diffractive lens (MDL) it is possible to drastically enhance the depth of focus, by over 4 orders of magnitude. Using such a lens, we are able to maintain focus for objects that are separated by as large a distance as in our experiments. Specifically, when illuminated by collimated light at , the MDL produced a beam that remained in focus from 5 mm to 1200 mm. The measured full width at half-maximum of the focused beam varied from 6.6 µm (5 mm away from the MDL) to 524 µm (1200 mm away from the MDL). Since the side lobes were well suppressed and the main lobe was close to the diffraction limit, imaging with a horizontal × vertical field of view of over the entire focal range was possible. This demonstration opens up a new direction for lens design: by treating the phase in the focal plane as a free parameter, extreme-depth-of-focus imaging becomes possible.
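The claimed 4-orders-of-magnitude enhancement of the depth of focus can be cross-checked with a back-of-the-envelope estimate. The wavelength is missing from this excerpt, so 0.85 µm below is an assumed placeholder, and FWHM ≈ 0.5 λ/NA is only a diffraction-limit rule of thumb:

```python
# Rough order-of-magnitude check of the "over 4 orders of magnitude" claim.
lam_um = 0.85                        # ASSUMED wavelength (not in this excerpt)
fwhm_um = 6.6                        # measured spot near the lens (abstract)
na = 0.5 * lam_um / fwhm_um          # NA implied by a diffraction-limited spot
dof_classical_mm = (lam_um / na**2) * 1e-3   # classical DOF ~ lam / NA^2, in mm
focal_range_mm = 1200.0 - 5.0                # demonstrated in-focus range
enhancement = focal_range_mm / dof_classical_mm
print(round(na, 3), round(dof_classical_mm, 2), round(enhancement))
```

Under these assumptions the classical depth of focus is a fraction of a millimeter, so the demonstrated ~1195 mm in-focus range is indeed a roughly 10³–10⁴-fold enhancement, in line with the abstract's claim.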