Near-eye display systems for augmented reality (AR) aim to seamlessly merge virtual content with the user’s view of the real world. A substantial limitation of current systems is that they only present virtual content over a limited portion of the user’s natural field of view (FOV). This limitation reduces the immersion and utility of these systems. Thus, it is essential to quantify FOV coverage in AR systems and understand how to maximize it. It is straightforward to determine the FOV coverage for monocular AR systems based on the system architecture. However, stereoscopic AR systems that present 3D virtual content create a more complicated scenario because the two eyes’ views do not always completely overlap. The introduction of partial binocular overlap in stereoscopic systems can potentially expand the perceived horizontal FOV coverage, but it can also introduce perceptual nonuniformity artifacts. In this article, we first review the principles of binocular FOV overlap for natural vision and for stereoscopic display systems. We report the results of a set of perceptual studies that examine how different amounts and types of horizontal binocular overlap in stereoscopic AR systems influence the perception of nonuniformity across the FOV. We then describe how to quantify the horizontal FOV in stereoscopic AR when taking 3D content into account. We show that all stereoscopic AR systems result in a variable horizontal FOV coverage and variable amounts of binocular overlap depending on fixation distance. Taken together, these results provide a framework for optimizing perceived FOV coverage and minimizing perceptual artifacts in stereoscopic AR systems for different use cases.
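The fixation-dependent overlap described above can be illustrated with a simple geometric sketch. The model below is not from the article; it assumes a simplified setup in which each eye's display frustum is fixed relative to the eye and rotates with it as the eyes converge on a fixation point. All parameter names and the rotation model are illustrative assumptions.

```python
import math

def horizontal_fov_coverage(nasal_deg, temporal_deg, ipd_m, fixation_m):
    """Estimate total horizontal FOV and binocular overlap (in degrees)
    for a stereoscopic near-eye display, assuming each eye's frustum
    rotates with the eye as it converges on a fixation point.
    This is a simplified illustrative model, not the article's method.
    """
    # Vergence half-angle: each eye rotates inward (toward the nose)
    # by this amount to fixate at the given distance.
    vergence = math.degrees(math.atan2(ipd_m / 2.0, fixation_m))
    # Angular extent of each eye's display in head-centered coordinates
    # (0 deg = straight ahead; positive = rightward).
    left_eye = (-temporal_deg + vergence, nasal_deg + vergence)
    right_eye = (-nasal_deg - vergence, temporal_deg - vergence)
    total = max(left_eye[1], right_eye[1]) - min(left_eye[0], right_eye[0])
    overlap = max(0.0, min(left_eye[1], right_eye[1])
                  - max(left_eye[0], right_eye[0]))
    return total, overlap
```

For example, with a hypothetical 40-degree-wide per-eye display (20 degrees nasal, 20 degrees temporal) and a 64 mm interpupillary distance, this model gives full 40-degree overlap at distant fixation, but a wider total FOV with reduced overlap at near fixation, consistent with the variable coverage the abstract describes.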
Optogenetics has emerged as an exciting tool for manipulating neural activity, which, in turn, can modulate behavior in live organisms. However, detecting the response to the optical stimulation requires electrophysiology with physical contact or fluorescent imaging at target locations, which is often limited by photobleaching and phototoxicity. In this paper, we show that phase imaging can report the intracellular transport induced by optogenetic stimulation. We developed a multimodal instrument that can both stimulate cells with subcellular spatial resolution and detect optical pathlength (OPL) changes with nanometer-scale sensitivity. We found that OPL fluctuations following stimulation are consistent with active organelle transport. Furthermore, the results indicate a broadening in the transport velocity distribution, which is significantly greater in stimulated cells than in optogenetically inactive cells. This label-free, contactless measurement of optogenetic response is likely to provide an enabling approach for neuroscience.