Raman microscopy is a powerful analytical technique for materials and life sciences that enables mapping the spatial distribution of the chemical composition of a sample. State-of-the-art Raman microscopes, based on point-scanning frequency-domain detection, have long (∼1 s) pixel dwell times, making it challenging to acquire images of a significant area (e.g., 100×100 µm). Here we present a compact wide-field Raman microscope based on a time-domain Fourier-transform approach, which enables parallel acquisition of the Raman spectra at all pixels of a 2D detector. A common-path birefringent interferometer with exceptional delay stability and reproducibility can rapidly acquire Raman maps (∼30 min for a 250,000-pixel image) with high spatial (<1 µm) and spectral (∼23 cm−1) resolution. Time-domain detection allows us to disentangle fluorescence and Raman signals, which can both be measured separately. We validate the system by Raman imaging of plastic microbeads and demonstrate its multimodal operation by capturing fluorescence and Raman maps of a multilayer WSe2 sample, providing complementary information on the strain and number of layers of the material.
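The time-domain Fourier-transform approach records, at every camera pixel, an interferogram as a function of the interferometer delay; a Fourier transform along the delay axis then yields the spectrum at each pixel in parallel. The following minimal sketch illustrates that retrieval on synthetic data; the delay step, grid size, and single-line spectrum are illustrative assumptions, not parameters from the paper.

```python
# Minimal sketch of time-domain Fourier-transform spectral retrieval.
# Assumptions (not from the paper): a synthetic interferogram cube
# `frames` of shape (n_delay, ny, nx) sampled at a uniform delay step,
# and a single spectral line for illustration.
import numpy as np

c = 2.998e10              # speed of light, cm/s
n_delay, ny, nx = 512, 4, 4
delay_step = 0.5e-15      # assumed delay increment per frame, seconds
delays = np.arange(n_delay) * delay_step

# Synthetic data: each pixel sees a cosine interferogram at an optical
# frequency corresponding to an 18000 cm^-1 photon (illustrative only).
nu = c * 18000.0          # optical frequency, Hz
frames = 1.0 + np.cos(2 * np.pi * nu * delays)[:, None, None] * np.ones((n_delay, ny, nx))

# Fourier transform along the delay axis recovers the spectrum at every
# pixel in parallel -- the core of wide-field FT spectral detection.
spectra = np.abs(np.fft.rfft(frames - frames.mean(axis=0), axis=0))
freqs = np.fft.rfftfreq(n_delay, d=delay_step)   # Hz
wavenumbers = freqs / c                          # cm^-1

peak = wavenumbers[spectra[:, 0, 0].argmax()]
print(f"recovered line near {peak:.0f} cm^-1")
# The spectral resolution is ~1/(c * total delay range) in cm^-1,
# so a longer delay scan gives a finer spectrum.
```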
Spectral imaging techniques extract spectral information using dispersive elements in combination with optical microscopes. For rapid acquisition, multiplexing spectral information along one dimension of imaged pixels has been demonstrated in hyperspectral imaging, as well as in Raman and Brillouin imaging. Full-field spectroscopy, i.e., multiplexing where imaged pixels are collected in 2D simultaneously while spectral analysis is performed sequentially, can increase spectral imaging speed, but so far has been attained at low spectral resolutions. Here, we extend 2D multiplexing to high spectral resolutions of ∼80 MHz (∼0.0001 nm) using high-throughput spectral discrimination based on atomic transitions.
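For context, the quoted frequency and wavelength resolutions are related by Δλ ≈ λ²Δν/c. The short sketch below checks that conversion for an assumed visible probe wavelength; the 532 nm value is a placeholder, not a value stated above.

```python
# Convert a spectral resolution in frequency to wavelength units via
# dlambda ~= lambda^2 * dnu / c. The 532 nm wavelength is an assumed
# placeholder for illustration only.
c = 2.998e8            # speed of light, m/s
wavelength = 532e-9    # assumed probe wavelength, m
dnu = 80e6             # quoted frequency resolution, Hz

dlam = wavelength**2 * dnu / c
print(f"{dlam * 1e9:.6f} nm")   # ~0.000076 nm, i.e. on the order of 0.0001 nm
```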
- Award ID(s): 1942003
- PAR ID: 10392581
- Publisher / Repository: Optical Society of America
- Date Published:
- Journal Name: Optics Express
- Volume: 31
- Issue: 3
- ISSN: 1094-4087; OPEXFF
- Format(s): Medium: X
- Size(s): Article No. 4334
- Sponsoring Org: National Science Foundation
More Like this
Diffractive optics have increasingly caught the attention of the scientific community. Classical diffractive optics are 2D diffractive optical elements (DOEs) and computer-generated holograms (CGHs), which modulate optical waves on a single transverse plane. However, the inherently two-dimensional nature of these devices leaves potential capabilities untapped. Previous work has demonstrated that extending the modulation from planar (2D) to volumetric (3D) enables new functionalities, such as generating space-variant functions, multiplexing in the spatial or spectral domain, or enhancing information capacity. Unfortunately, despite significant progress fueled by recent interest in metasurface diffraction, 3D diffractive optics remains relatively unexplored. Here, we introduce the concept of azimuthal multiplexing. We propose, design, and demonstrate 3D diffractive optics showing this multiplexing effect. In this scheme, multiple pages of information are encoded and can be read out across independent channels by rotating one or more diffractive layers with respect to the others. We implement the concept with multilayer diffractive optical elements. An iterative projection optimization algorithm helps solve the inverse design problem. The experimental realization using photolithographically fabricated multilevel phase layers demonstrates the predicted performance. We discuss the limitations and potential of azimuthal multiplexing 3D diffractive optics.
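As an illustration of the azimuthal-multiplexing readout described above, the sketch below propagates a field through two phase layers, with the second layer rotated by a chosen angle; different rotation angles would address different encoded pages. The layer phases, grid, wavelength, and spacing are illustrative assumptions, and the generic angular-spectrum propagator stands in for the authors' design pipeline.

```python
# Hedged sketch: reading out a two-layer 3D diffractive element when the
# second layer is rotated. All parameters are illustrative assumptions.
import numpy as np
from scipy.ndimage import rotate

N = 256                      # grid size, pixels
dx = 2e-6                    # assumed pixel pitch, m
wavelength = 633e-9          # assumed readout wavelength, m
z = 5e-3                     # assumed layer spacing, m

def angular_spectrum(u, wavelength, dx, z):
    """Propagate field u by distance z with the angular-spectrum method."""
    fx = np.fft.fftfreq(u.shape[0], d=dx)
    FX, FY = np.meshgrid(fx, fx, indexing="ij")
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))   # evanescent terms dropped
    return np.fft.ifft2(np.fft.fft2(u) * np.exp(1j * kz * z))

rng = np.random.default_rng(0)
phase1 = rng.uniform(0, 2 * np.pi, (N, N))   # stand-in for designed layer 1
phase2 = rng.uniform(0, 2 * np.pi, (N, N))   # stand-in for designed layer 2

def readout(angle_deg):
    """Output intensity when layer 2 is rotated by angle_deg."""
    u = np.ones((N, N), dtype=complex)       # plane-wave illumination
    u = u * np.exp(1j * phase1)
    u = angular_spectrum(u, wavelength, dx, z)
    rotated = rotate(phase2, angle_deg, reshape=False, order=1, mode="nearest")
    u = u * np.exp(1j * rotated)
    u = angular_spectrum(u, wavelength, dx, z)
    return np.abs(u) ** 2

# Each rotation angle defines an independent readout channel; the design
# step (e.g. iterative projections) would shape phase1/phase2 so that each
# channel reconstructs a different target page.
channels = {a: readout(a) for a in (0, 90, 180)}
```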
Both 3D imaging and hyperspectral imaging provide important information about a scene, and combining them helps us perceive and understand real-world structures. Previous hyperspectral 3D imaging systems, which typically require a hyperspectral imaging system as the detector, suffer from complicated hardware design, high cost, and long acquisition and reconstruction times. Here, we report a low-cost, high-frame-rate, simple, and compact hyperspectral stripe projector (HSP) system based on a single digital micro-mirror device, capable of producing hyperspectral patterns in which each row of pixels has an independently programmable spectrum. We demonstrate two example applications of the HSP via hyperspectral structured illumination: hyperspectral 3D surface imaging and spectrum-dependent hyperspectral compressive imaging of the volume density of a participating medium. The hyperspectral patterns simultaneously encode the 3D spatial and spectral information of the target, requiring only a grayscale sensor as the detector. The reported HSP and its applications provide a solution for combining structured illumination techniques with hyperspectral imaging in a simple, efficient, and low-cost manner. The work presented here represents a novel structured illumination technique that provides the basis and inspiration for future variations of hardware systems and software encoding schemes.
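To make the encoding concrete, the following sketch models a grayscale-camera measurement under a stripe pattern whose rows carry independently programmable spectra: the recorded intensity at each pixel is the scene's spectral reflectance weighted by the spectrum assigned to the illuminating row and summed over wavelength. The array shapes and the reflectance cube are illustrative assumptions.

```python
# Hedged sketch of the hyperspectral-stripe-projector measurement model.
# Shapes and data are illustrative assumptions, not the authors' values.
import numpy as np

n_rows, n_cols, n_bands = 64, 64, 31      # assumed image size and spectral bands

rng = np.random.default_rng(1)
reflectance = rng.uniform(size=(n_rows, n_cols, n_bands))  # scene R(x, y, lambda)

# Each projector row gets its own programmable spectrum S(row, lambda),
# here a shifted narrow band per row purely for illustration.
row_spectra = np.zeros((n_rows, n_bands))
for r in range(n_rows):
    row_spectra[r, r % n_bands] = 1.0

# A grayscale camera integrates over wavelength:
#   I(x, y) = sum_lambda R(x, y, lambda) * S(row(x), lambda)
measurement = np.einsum("xyb,xb->xy", reflectance, row_spectra)

print(measurement.shape)   # (64, 64): one grayscale frame encodes
                           # row-dependent spectral information
```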
Multidimensional photography can capture optical fields beyond the capability of conventional image sensors that measure only the two-dimensional (2D) spatial distribution of light. By mapping a high-dimensional datacube of incident light onto a 2D image sensor, multidimensional photography resolves the scene along with other information dimensions, such as wavelength and time. However, the application of current multidimensional imagers is fundamentally restricted by their static optical architectures and measurement schemes: the mapping relation between the light datacube voxels and image sensor pixels is fixed. To overcome this limitation, we propose tunable multidimensional photography through active optical mapping. A high-resolution spatial light modulator, referred to as an active optical mapper, permutes and maps the light datacube voxels onto sensor pixels in an arbitrary and programmed manner. The resultant system can readily adapt the acquisition scheme to the scene, thereby maximising the measurement flexibility. Through active optical mapping, we demonstrate our approach in two niche implementations: hyperspectral imaging and ultrafast imaging.
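The core idea, a programmable mapping of datacube voxels onto sensor pixels, can be sketched as a permutation applied to the flattened datacube before readout; reprogramming the spatial light modulator corresponds to choosing a different permutation. The datacube size and the mapping below are illustrative assumptions.

```python
# Hedged sketch: active optical mapping as a programmable permutation of
# datacube voxels onto sensor pixels. Sizes and mapping are assumptions.
import numpy as np

nx, ny, n_lambda = 8, 8, 4          # assumed spatial grid and spectral bands
sensor_px = nx * ny * n_lambda      # one sensor pixel per voxel in this toy case

rng = np.random.default_rng(2)
datacube = rng.uniform(size=(nx, ny, n_lambda))   # scene (x, y, lambda)

# The "active optical mapper" defines which voxel lands on which pixel;
# reprogramming the SLM corresponds to choosing a different permutation.
mapping = rng.permutation(sensor_px)

sensor = np.zeros(sensor_px)
sensor[mapping] = datacube.ravel()                # forward mapping

# Because the mapping is known and invertible here, reconstruction is a
# simple inverse permutation; compressive schemes would instead map several
# voxels to one pixel and recover them computationally.
recovered = sensor[mapping].reshape(nx, ny, n_lambda)
assert np.allclose(recovered, datacube)
```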
Holographic displays and computer-generated holography offer a unique opportunity to improve the optical resolution and depth characteristics of near-eye displays. The thermally modulated Nanophotonic Phased Array (NPA), a new type of holographic display, affords several advantages over other holographic display technologies, including an integrated light source and higher refresh rates. However, the thermal phase modulation of the NPA makes it susceptible to the thermal proximity effect, in which heating one pixel affects the temperature of nearby pixels. Proximity effect correction (PEC) methods have been proposed for 2D Fourier holograms in the far field but not for Fresnel holograms at user-specified depths. Here we extend an existing PEC method for the NPA to Fresnel holograms with phase-only hologram optimization and validate it through computational simulations. Our method is not only effective in correcting the proximity effect for the Fresnel holograms of 2D images at desired depths but can also leverage the fast refresh rate of the NPA to display 3D scenes with time-division multiplexing.
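As a rough illustration of the problem being corrected, the sketch below models the thermal proximity effect as a blur of the commanded phase map and propagates the resulting phase-only hologram to a target depth with a single-FFT Fresnel transform. The Gaussian thermal kernel, grid, wavelength, and depth are illustrative assumptions, not the authors' device model or correction method.

```python
# Hedged sketch: thermal proximity effect as a blur on the commanded phase,
# followed by Fresnel propagation to a chosen depth. All parameters are
# illustrative assumptions, not the NPA's actual thermal model.
import numpy as np
from scipy.ndimage import gaussian_filter

N = 256
dx = 8e-6                  # assumed pixel pitch, m
wavelength = 1.55e-6       # assumed wavelength, m
z = 0.1                    # assumed reconstruction depth, m

rng = np.random.default_rng(3)
target_phase = rng.uniform(0, 2 * np.pi, (N, N))   # stand-in for an optimized hologram

# Thermal crosstalk: heating one pixel raises the temperature (and phase)
# of its neighbours, modelled here as a Gaussian blur of the drive pattern.
actual_phase = gaussian_filter(target_phase, sigma=2.0)

def fresnel_propagate(u, wavelength, dx, z):
    """Single-FFT Fresnel transform of field u over distance z."""
    k = 2 * np.pi / wavelength
    x = (np.arange(N) - N / 2) * dx
    X, Y = np.meshgrid(x, x, indexing="ij")
    chirp = np.exp(1j * k / (2 * z) * (X**2 + Y**2))
    return np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(u * chirp)))

ideal = np.abs(fresnel_propagate(np.exp(1j * target_phase), wavelength, dx, z)) ** 2
blurred = np.abs(fresnel_propagate(np.exp(1j * actual_phase), wavelength, dx, z)) ** 2

# A PEC scheme would pre-compensate the drive pattern so that, after the
# thermal blur, the realized phase reproduces the intended hologram.
error = np.mean((ideal - blurred) ** 2)
print(f"mean squared intensity error without correction: {error:.3e}")
```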