Afocal telescopes are often used as foreoptics for existing imaging systems to provide application flexibility. To properly combine an afocal telescope with an existing imaging system, the exit pupil of the afocal telescope and the entrance pupil of the imaging system must be coincident. Additionally, the exit pupil of the afocal telescope must be well formed; that is, it must have the correct size and shape to mitigate pupil-matching challenges. This work introduces processes for designing freeform afocal telescopes, with an emphasis on how to analyze and control the exit pupil quality of such systems. The included three-mirror design examples demonstrate the advantages of using freeform surfaces in afocal systems and quantify the tradeoffs required to improve exit pupil quality.
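A minimal paraxial sketch of the pupil-matching requirement described above, assuming a simple Keplerian afocal pair with the aperture stop at the first element (the focal lengths, stop diameter, and function name are illustrative assumptions, not taken from the paper):

```python
# A minimal paraxial sketch (illustrative, not the paper's design code): for a
# Keplerian afocal pair with the aperture stop at the first element, the exit
# pupil is the image of that stop formed by the second element.

def afocal_exit_pupil(f_obj, f_eye, stop_diam):
    """Return (exit-pupil distance behind the 2nd element, exit-pupil diameter,
    angular magnification) for thin elements separated by f_obj + f_eye."""
    d = f_obj + f_eye                        # afocal separation
    s = -d                                   # stop sits d to the left of the 2nd element
    s_prime = 1.0 / (1.0 / f_eye + 1.0 / s)  # thin-lens imaging of the stop
    m_pupil = s_prime / s                    # transverse magnification of the stop
    return s_prime, abs(m_pupil) * stop_diam, -f_obj / f_eye

# Example with assumed values: 200 mm objective, 50 mm eyepiece, 40 mm entrance pupil.
loc, diam, mag = afocal_exit_pupil(200.0, 50.0, 40.0)
print(f"exit pupil: {loc:.1f} mm behind the 2nd element, {diam:.1f} mm diameter, "
      f"angular magnification {mag:.1f}")
# For pupil matching, the downstream camera's entrance pupil should sit at `loc`
# and be no larger than `diam`.
```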
Controlling first-order pupil location
We consider the requirements for first-order pupil location control using the matrix method, for both finite-conjugate systems and afocal systems at infinite imaging conjugates. We show that two-element systems allow only limited pupil location control, whereas with three or more elements the first-order pupil locations can be controlled freely and independently.
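A hedged sketch of the matrix-method bookkeeping the abstract refers to: the exit pupil is found by imaging the aperture stop through the elements that follow it, which reduces to a condition on the cascaded 2x2 ray-transfer matrix. The element powers, spacings, and helper names below are assumptions for illustration, not the paper's values:

```python
# Minimal paraxial matrix-method sketch (assumed illustration): locate the image
# of the aperture stop -- the exit pupil -- by cascading 2x2 ray-transfer
# matrices and solving the imaging condition B + t*D = 0.

import numpy as np

def thin_lens(f):
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

def gap(t):
    return np.array([[1.0, t], [0.0, 1.0]])

def exit_pupil_distance(elements):
    """elements: matrices ordered from the stop plane to the last element.
    Returns the distance from the last element to the exit pupil."""
    M = np.eye(2)
    for E in elements:
        M = E @ M                   # left-multiply in propagation order
    A, B = M[0]
    C, D = M[1]
    return -B / D                   # image of the stop: B + t*D = 0

# Two-element afocal pair (f1 = 200, f2 = 50, stop at element 1): the exit pupil
# location is fixed once the afocal condition and stop position are set.
two = [thin_lens(200.0), gap(250.0), thin_lens(50.0)]
print(exit_pupil_distance(two))     # ~62.5

# A third element adds an extra power/spacing that can be chosen to place the
# pupil locations independently, consistent with the abstract's claim.
```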
- PAR ID: 10518276
- Editor(s): Aikens, David M.; Rehn, Henning; Thibault, Simon; Uhlendorf, Kristina
- Publisher / Repository: SPIE
- Date Published:
- Journal Name: SPIE
- ISBN: 9781510668508
- Page Range / eLocation ID: 51
- Format(s): Medium: X
- Location: Quebec City, Canada
- Sponsoring Org: National Science Foundation
More Like this
- This paper presents the analytical form of the intrinsic aberration coefficients for spherical plane-symmetric optical systems, expressed as a function of first-order system parameters and the paraxial chief- and marginal-ray angles and heights. The derived aberration coefficients fall in the third and fourth groups, involving products of two or three vector products of the pupil and field vectors.
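For context, plane-symmetric aberration theory commonly organizes the wavefront expansion in dot products of the field vector, the pupil vector, and a unit vector lying in the plane of symmetry; the "groups" mentioned above collect terms of increasing total order in these products. A schematic sketch of that general form (the coefficient naming is assumed here, not taken from the paper):

```latex
% Schematic plane-symmetric wavefront expansion (assumed notation for context):
% \vec{i} is a unit vector in the plane of symmetry, \vec{H} the field vector,
% and \vec{\rho} the pupil vector.
W(\vec{i},\vec{H},\vec{\rho}) \;=\; \sum_{a,b,c,d,e}
  W_{a,b,c,d,e}\,
  (\vec{i}\cdot\vec{H})^{a}\,
  (\vec{i}\cdot\vec{\rho})^{b}\,
  (\vec{H}\cdot\vec{H})^{c}\,
  (\vec{H}\cdot\vec{\rho})^{d}\,
  (\vec{\rho}\cdot\vec{\rho})^{e}
```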
- Holographic displays promise to deliver unprecedented display capabilities in augmented reality applications, featuring a wide field of view, a wide color gamut, high spatial resolution, and depth cues, all in a compact form factor. While emerging holographic display approaches have been successful in achieving large étendue and high image quality as seen by a camera, the large étendue also reveals a problem that makes existing displays impractical: the sampling of the holographic field by the eye pupil. Existing methods have not investigated this issue due to the lack of displays with large enough étendue, and, as such, they suffer from severe artifacts with varying eye pupil size and location. We show that the holographic field as sampled by the eye pupil is highly varying for existing display setups, and we propose pupil-aware holography that maximizes the perceptual image quality irrespective of the size, location, and orientation of the eye pupil in a near-eye holographic display. We validate the proposed approach both in simulations and on a prototype holographic display, and we show that our method eliminates severe artifacts and significantly outperforms existing approaches.
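A minimal scalar-diffraction sketch of the pupil-sampling effect described above, assuming the hologram field has already been propagated to the eye-pupil plane and modeling the eye lens as a Fourier transform (array sizes, pupil parameters, and function names are illustrative, not the authors' implementation):

```python
# Illustrative sketch (not the paper's method): a finite eye pupil crops the
# holographic field in the pupil plane, so the retinal image changes with the
# pupil's size and position.

import numpy as np

def retinal_image(field, pupil_radius_px, pupil_center_px):
    """field: complex hologram field at the eye-pupil plane. Returns the
    intensity image formed after the pupil mask and an idealized eye lens
    (modeled as a Fourier transform)."""
    n = field.shape[0]
    y, x = np.mgrid[:n, :n]
    cy, cx = pupil_center_px
    mask = (x - cx) ** 2 + (y - cy) ** 2 <= pupil_radius_px ** 2
    sampled = field * mask                                     # pupil sampling
    img = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(sampled)))
    return np.abs(img) ** 2

# Example: the same field viewed through a small, decentered pupil vs. a large,
# centered one yields markedly different images -- the variation that a
# pupil-aware optimization is meant to suppress.
rng = np.random.default_rng(0)
field = rng.standard_normal((256, 256)) + 1j * rng.standard_normal((256, 256))
small = retinal_image(field, 20, (128, 160))
large = retinal_image(field, 80, (128, 128))
```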
- The direct imaging of an Earth-like exoplanet will require sub-nanometric wave-front control across large light-collecting apertures to reject host starlight and detect the faint planetary signal. Current adaptive optics systems, which use wave-front sensors that reimage the telescope pupil, face two challenges that prevent this level of control: non-common-path aberrations, caused by differences between the sensing and science arms of the instrument; and petaling modes, discontinuous phase aberrations caused by pupil fragmentation, especially relevant for the upcoming 30 m class telescopes. Such aberrations drastically impact the capabilities of high-contrast instruments. To address these issues, we can add a second-stage wave-front sensor to the science focal plane. One promising architecture uses the photonic lantern (PL): a waveguide that efficiently couples aberrated light into single-mode fibers (SMFs). In turn, SMF-confined light can be stably injected into high-resolution spectrographs, enabling direct exoplanet characterization and precision radial velocity measurements; simultaneously, the PL can be used for focal-plane wave-front sensing. We present a real-time experimental demonstration of the PL wave-front sensor on the Subaru/SCExAO testbed. Our system is stable out to around ±400 nm of low-order Zernike wave-front error and can correct petaling modes. When injecting ∼30 nm rms of low-order time-varying error, we achieve ∼10× rejection at 1 s timescales; further refinements to the control law and lantern fabrication process should make sub-nanometric wave-front control possible. In the future, novel sensors like the PL wave-front sensor may prove to be critical in resolving the wave-front control challenges posed by exoplanet direct imaging.
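A minimal sketch of the kind of linear, interaction-matrix reconstructor often paired with focal-plane sensors such as the photonic lantern, assuming the SMF output intensities respond approximately linearly to small Zernike perturbations (the calibration amplitude, matrix shapes, and the `poke` callable are illustrative assumptions, not the authors' control law):

```python
# Illustrative linear wave-front reconstructor for a photonic-lantern-style
# sensor (the paper's calibration and control law may differ).

import numpy as np

def calibrate(poke, n_modes, amp=1e-8):
    """Build an interaction matrix by poking each controlled Zernike mode and
    recording the change in the lantern's single-mode-fiber output intensities.
    poke(coeffs) -> intensity vector of SMF outputs."""
    ref = poke(np.zeros(n_modes))
    cols = []
    for k in range(n_modes):
        c = np.zeros(n_modes)
        c[k] = amp
        cols.append((poke(c) - ref) / amp)   # response slope for mode k
    return np.column_stack(cols), ref

def reconstruct(intensities, interaction, ref):
    """Least-squares estimate of the Zernike coefficients from SMF intensities,
    suitable for feeding a simple integrator-style control loop."""
    return np.linalg.pinv(interaction) @ (intensities - ref)
```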
- Eye tracking has already made its way to current commercial wearable display devices and is becoming increasingly important for virtual and augmented reality applications. However, existing model-based eye tracking solutions are not capable of conducting very accurate gaze angle measurements and may not be sufficient to solve challenging display problems such as pupil steering or eyebox expansion. In this paper, we argue that accurate detection and localization of the pupil in 3D space is a necessary intermediate step in model-based eye tracking. Existing methods and datasets either ignore evaluating the accuracy of 3D pupil localization or evaluate it only on synthetic data. To this end, we capture the first 3D pupil-gaze measurement dataset using a high-precision setup with head stabilization and release it as the first benchmark dataset for evaluating both 3D pupil localization and gaze tracking methods. Furthermore, we utilize an advanced eye model to replace the commonly used oversimplified eye model. Leveraging the eye model, we propose a novel 3D pupil localization method with a deep learning-based corneal refraction correction. We demonstrate that our method outperforms the state-of-the-art works by reducing the 3D pupil localization error by 47.5% and the gaze estimation error by 18.7%. Our dataset and code can be found here: link.
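A minimal sketch of the geometric ingredient behind a corneal refraction correction: refracting a camera-to-pupil ray at the corneal surface with the vector form of Snell's law (the refractive indices and example ray are illustrative assumptions; the paper's learning-based correction and full eye model are not reproduced here):

```python
# Illustrative refraction step for model-based 3D pupil localization: the camera
# sees the pupil through the cornea, so rays must be refracted before
# back-projection (values assumed, not the paper's eye model).

import numpy as np

def refract(d, n, n1, n2):
    """Vector Snell's law. d: ray direction, n: surface normal pointing toward
    the incident medium. Returns the refracted unit direction, or None for
    total internal reflection."""
    d = d / np.linalg.norm(d)
    n = n / np.linalg.norm(n)
    eta = n1 / n2
    cos_i = -np.dot(n, d)
    sin2_t = eta ** 2 * (1.0 - cos_i ** 2)
    if sin2_t > 1.0:
        return None
    cos_t = np.sqrt(1.0 - sin2_t)
    return eta * d + (eta * cos_i - cos_t) * n

# Example: a camera ray entering the cornea (air -> cornea, n ~ 1.0 -> ~1.376)
# bends toward the normal, so the apparent pupil is displaced from the true 3D
# pupil -- the displacement the correction is meant to undo.
t = refract(np.array([0.0, 0.2, -1.0]), np.array([0.0, 0.0, 1.0]), 1.0, 1.376)
```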
