Time-frequency (TF) filtering of analog signals has played a crucial role in the development of radio-frequency communications and is now recognized as an essential capability for both classical and quantum communications in the optical frequency domain. How should optical TF filters best be designed to pass a targeted temporal mode (TM) and to reject background (noise) photons within the TF detection window? The solution for ‘coherent’ TF filtering is known (the quantum pulse gate), whereas the conventional, more common method is implemented by a sequence of incoherent spectral filtering and temporal gating operations. To compare these two methods, we derive a general formalism for two-stage incoherent time-frequency filtering, finding expressions for signal pulse transmission efficiency and for the ability to discriminate TMs, which allows the blocking of unwanted background light. We derive the tradeoff between efficiency and TM discrimination ability, and find a remarkably concise relation between these two quantities and the time-bandwidth product of the combined filters. We apply the formalism to two examples, rectangular filters and Gaussian filters, both of which have known orthogonal-function decompositions. The formalism can be applied to any state of light occupying the input temporal mode, e.g., ‘classical’ coherent-state signals or pulsed single-photon states of light. In contrast to the radio-frequency domain, where coherent detection is standard and coherent matched filtering can be used to reject noise, in the optical domain direct detection is optimal in a number of scenarios where the signal flux is extremely small. Our analysis shows how the insertion loss and signal-to-noise ratio (SNR) change when incoherent optical filters are used to reject background noise, followed by direct detection, e.g., photon counting. We point out implications for classical and quantum optical communications. As an example, we study quantum key distribution, wherein strong rejection of background noise is necessary to maintain a high quality of entanglement, while high signal transmission is needed to ensure a useful key generation rate.
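As a rough numerical illustration of two-stage incoherent filtering (a minimal sketch, not the paper's formalism; the Gaussian widths below are arbitrary assumptions), the following Python snippet sends a Gaussian temporal mode through a Gaussian spectral filter followed by a Gaussian time gate, and estimates the resulting signal transmission efficiency along with a convention-dependent time-bandwidth product of the combined filters.

```python
import numpy as np

# Illustrative sketch: Gaussian temporal mode -> Gaussian spectral filter
# -> Gaussian temporal gate. All widths are hypothetical example values.
N = 2**14
dt = 0.01                                      # time step (arbitrary units)
t = (np.arange(N) - N // 2) * dt
f = np.fft.fftshift(np.fft.fftfreq(N, dt))     # frequency grid (cycles/unit time)

tau_sig  = 1.0   # 1/e field half-width of the signal mode (assumed)
sigma_f  = 0.3   # spectral filter width, field transmission (assumed)
tau_gate = 2.0   # temporal gate width, field transmission (assumed)

# Normalized Gaussian temporal mode and its spectrum
psi_t = np.exp(-t**2 / (2 * tau_sig**2))
psi_t /= np.sqrt(np.sum(np.abs(psi_t)**2) * dt)
psi_f = np.fft.fftshift(np.fft.fft(np.fft.ifftshift(psi_t))) * dt

# Stage 1: spectral filter; Stage 2: temporal gate (both acting on the field)
H = np.exp(-f**2 / (2 * sigma_f**2))
G = np.exp(-t**2 / (2 * tau_gate**2))

filtered_f = H * psi_f
filtered_t = np.fft.fftshift(np.fft.ifft(np.fft.ifftshift(filtered_f))) / dt
gated_t = G * filtered_t

# Signal transmission efficiency: fraction of the pulse energy that survives
eta = np.sum(np.abs(gated_t)**2) * dt
# Time-bandwidth product of the combined filters, up to a shape- and
# convention-dependent constant
tbp = tau_gate * sigma_f

print(f"transmission efficiency ~ {eta:.3f}")
print(f"filter time-bandwidth product (convention-dependent) ~ {tbp:.2f}")
```

Widening the gate or the spectral filter in this toy model raises the efficiency but also raises the time-bandwidth product, i.e., admits more background modes, which is the tradeoff the abstract refers to.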
Photonics provides a promising approach for image processing by spatial filtering, with the advantage of faster speeds and lower power consumption compared to electronic digital solutions. However, traditional optical spatial filters suffer from bulky form factors that limit their portability. Here we present a new approach based on pixel arrays of plasmonic directional image sensors, designed to selectively detect light incident along a small, geometrically tunable set of directions. The resulting imaging systems can function as optical spatial filters without any external filtering elements, leading to extreme size miniaturization. Furthermore, they offer the distinct capability to perform multiple filtering operations at the same time, through the use of sensor arrays partitioned into blocks of adjacent pixels with different angular responses. To establish the image processing capabilities of these devices, we present a rigorous theoretical model of their filter transfer function under both coherent and incoherent illumination. Next, we use the measured angle-resolved responsivity of prototype devices to demonstrate two examples of relevant functionalities: (1) the visualization of otherwise invisible phase objects and (2) spatial differentiation with incoherent light. These results are significant for a multitude of imaging applications ranging from microscopy in biomedicine to object recognition for computer vision.
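As a schematic illustration of the spatial-filtering picture (the |kx|-shaped angular response and all parameters below are assumptions, not the measured responsivity of the prototype devices), the following sketch applies an angular transfer function to the spatial-frequency content of an incoherent intensity image, producing an edge-enhanced output reminiscent of spatial differentiation.

```python
import numpy as np

# Illustrative sketch: model a block of directional pixels as an angular
# transfer function H(kx) acting on the spatial-frequency content of an
# incoherent intensity image. A response proportional to |kx| approximates
# a first-order spatial differentiator along x, which highlights edges.

def directional_filter(image, kx_scale=1.0):
    """Apply a hypothetical |kx|-shaped angular response to an intensity image."""
    ny, nx = image.shape
    kx = np.fft.fftfreq(nx)[None, :]      # spatial frequencies along x
    H = kx_scale * np.abs(kx)             # assumed angular response
    spectrum = np.fft.fft2(image)
    return np.real(np.fft.ifft2(H * spectrum))

# Toy example: a bright square on a dark background. The output is largest
# near the vertical edges, mimicking spatial differentiation with incoherent light.
img = np.zeros((128, 128))
img[48:80, 48:80] = 1.0
edges = directional_filter(img)
print("column of maximum response:", int(np.argmax(np.abs(edges).sum(axis=0))))
```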
- Award ID(s): 1711156
- PAR ID: 10375475
- Publisher / Repository: Optical Society of America
- Journal Name: Optics Express
- Volume: 30
- Issue: 16
- ISSN: 1094-4087; OPEXFF
- Format(s): Medium: X
- Size(s): Article No. 29074
- Sponsoring Org: National Science Foundation
More Like this
Angle-sensitive photodetectors are a promising device technology for many advanced imaging functionalities, including lensless compound-eye vision, lightfield sensing, optical spatial filtering, and phase imaging. Here we demonstrate the use of plasmonic gradient metasurfaces to tailor the angular response of generic planar photodetectors. The resulting devices rely on the phase-matched coupling of light incident at select geometrically tunable angles into guided plasmonic modes, which are then scattered and absorbed in the underlying photodetector active layer. This approach naturally introduces sharp peaks in the angular response, with smaller footprint and reduced guided-mode radiative losses (and therefore improved spatial resolution and sensitivity) compared to analogous devices based on diffractive coupling. More broadly, these results highlight a promising new application space of flat optics, where gradient metasurfaces are integrated within image sensors to enable unconventional capabilities with enhanced system miniaturization and design flexibility.
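To make the phase-matched coupling condition concrete, the sketch below solves k0·sin(θ) + m·2π/Λ = β_spp for the incidence angles at which a first-order gradient momentum couples light into a guided plasmonic mode; the wavelength, mode effective index, and gradient period are assumed example values, not parameters of the reported devices.

```python
import numpy as np

# Illustrative phase-matching estimate with assumed, simplified parameters.
wavelength = 1.55e-6   # free-space wavelength (m), assumed
n_spp = 1.45           # effective index of the guided plasmonic mode, assumed
period = 1.0e-6        # metasurface gradient period (m), assumed

k0 = 2 * np.pi / wavelength
beta = n_spp * k0      # propagation constant of the guided mode

for m in (-1, 1):      # first-order gradient momentum, both signs
    s = (beta - m * 2 * np.pi / period) / k0
    if abs(s) <= 1:
        print(f"m = {m:+d}: coupling peak near {np.degrees(np.arcsin(s)):.1f} deg")
    else:
        print(f"m = {m:+d}: no propagating-wave solution")
```

Because only incidence angles satisfying this condition launch the guided mode, the photocurrent versus angle shows the sharp, geometrically tunable peaks described above.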
Active imaging and structured illumination originated in “bulk” optical systems: free-space beams controlled with lenses, spatial light modulators, gratings, and mirrors to structure the optical diffraction and direct the beams onto the target. Recently, optical phased arrays have been developed with the goal of replacing traditional bulk active imaging systems with integrated optical systems. In this paper, we demonstrate the first array of optical phased arrays forming a composite aperture. This composite aperture is used to implement a Fourier-based structured-illumination imaging system, where moving fringe patterns are projected on a target and a single integrating detector is used to reconstruct the spatial structure of the target from the time variation of the back-scattered light. We experimentally demonstrate proof-of-concept Fourier-basis imaging in 1D using a six-element array of optical phased arrays, which interfere pairwise to sample up to 11 different spatial Fourier components, and reconstruct a 1D delta-function target. This concept addresses a key complexity constraint in scaling up integrated photonic apertures: because the aperture elements interfere pairwise, the number of elements required in the sparse array grows only as the square root of the number of resolvable spots in the image.
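A minimal numerical sketch of the Fourier-basis measurement idea (the grid size, target, and demodulation scheme below are illustrative assumptions, not the experimental parameters): fringes at a set of spatial frequencies are swept across a 1D target, a single integrating detector records the back-scattered power, and each spatial Fourier component is recovered by demodulating that time-varying signal.

```python
import numpy as np

# Illustrative 1D Fourier-basis structured-illumination sketch.
npts = 64
x = np.arange(npts) / npts
target = np.zeros(npts)
target[20] = 1.0                      # delta-function-like target (assumed)

n_components = 11                     # number of sampled spatial frequencies
coeffs = np.zeros(n_components, dtype=complex)

for q in range(n_components):
    phases = np.linspace(0, 2 * np.pi, 32, endpoint=False)   # fringe sweep
    # Single integrating ("bucket") detector: total back-scattered power
    bucket = np.array([np.sum(target * (1 + np.cos(2 * np.pi * q * x - p)))
                       for p in phases])
    # Lock-in-style demodulation at the fringe sweep frequency recovers the
    # q-th spatial Fourier component of the target (up to a constant factor)
    coeffs[q] = np.sum(bucket * np.exp(1j * phases)) / len(phases)

# Inverse Fourier sum over the sampled components reconstructs the target
recon = np.real(sum(c * np.exp(1j * 2 * np.pi * q * x)
                    for q, c in enumerate(coeffs)))
print("reconstructed peak at x index:", int(np.argmax(recon)))
```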
Multidimensional photography can capture optical fields beyond the capability of conventional image sensors that measure only the two-dimensional (2D) spatial distribution of light. By mapping a high-dimensional datacube of incident light onto a 2D image sensor, multidimensional photography resolves the scene along with other information dimensions, such as wavelength and time. However, the application of current multidimensional imagers is fundamentally restricted by their static optical architectures and measurement schemes—the mapping relation between the light datacube voxels and image sensor pixels is fixed. To overcome this limitation, we propose tunable multidimensional photography through active optical mapping. A high-resolution spatial light modulator, referred to as an active optical mapper, permutes and maps the light datacube voxels onto sensor pixels in an arbitrary and programmed manner. The resultant system can readily adapt the acquisition scheme to the scene, thereby maximising the measurement flexibility. Through active optical mapping, we demonstrate our approach in two niche implementations: hyperspectral imaging and ultrafast imaging.
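As a toy illustration of active optical mapping (an assumption-level model only; the datacube dimensions, sensor size, and permutation are hypothetical), the sketch below treats the mapper as a programmable lookup table from datacube voxels to sensor pixels, which can be inverted to recover the datacube.

```python
import numpy as np

# Illustrative sketch: the active optical mapper modeled as a programmable
# assignment of datacube voxels (x, y, wavelength) to 2D sensor pixels.
# Reprogramming the mapping table changes the acquisition scheme.

rng = np.random.default_rng(0)
nx, ny, nl = 8, 8, 4                    # toy spatial and spectral dimensions
sensor_shape = (16, 16)                 # 2D sensor with nx*ny*nl = 256 pixels

datacube = rng.random((nx, ny, nl))     # hypothetical scene datacube

# Mapping table: voxel index -> sensor pixel index. A random permutation
# stands in for whatever pattern is programmed onto the spatial light modulator.
mapping = rng.permutation(datacube.size)

sensor = np.zeros(datacube.size)
sensor[mapping] = datacube.ravel()      # "optical" mapping of voxels onto pixels
sensor = sensor.reshape(sensor_shape)

# Reconstruction: invert the known mapping to recover the full datacube.
recovered = sensor.ravel()[mapping].reshape(datacube.shape)
print("exact recovery:", np.allclose(recovered, datacube))
```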