

Search for: All records

Creators/Authors contains: "Wang, Minqi"


  1. Many extended reality (XR) devices present different views to the left and right eyes. Unwanted colorimetric differences between these views can cause perceptual artifacts that degrade binocular image quality. We present an image‐computable model designed to predict the appearance of binocular views with colorimetric differences in XR displays. The model is fitted to data from a recent perceptual study in which people provided multidimensional responses about the appearance of stimuli simulating an optical see‐through augmented reality device with interocular intensity differences. This work can be used to create preliminary assessments of binocular artifact appearance and inform XR display design. 
    Free, publicly-accessible full text available June 1, 2025
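The abstract's "preliminary assessments of binocular artifact appearance" start from measuring how the two eyes' views differ. As a minimal precursor sketch (not the paper's fitted model; the images and values here are made up), a per-pixel interocular color-difference map flags regions where colorimetric mismatch is largest:

```python
import numpy as np

def interocular_difference_map(left_rgb, right_rgb):
    """Per-pixel Euclidean RGB difference between the two eyes' views.
    A crude first-pass metric: a zero map means colorimetrically
    identical views; larger values mark regions where binocular
    artifacts (e.g., luster or rivalry) are more likely."""
    return np.linalg.norm(left_rgb.astype(float) - right_rgb.astype(float), axis=-1)

# Hypothetical example: the right eye is rendered 20% dimmer than the left
left = np.full((4, 4, 3), 200.0)
right = left * 0.8
diff = interocular_difference_map(left, right)
print(diff.mean())  # 40 per channel -> sqrt(3) * 40 ~= 69.3
```

A perceptual model like the one described would weight such raw differences by their visibility; this sketch only exposes the raw input signal.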
  2. Augmented reality (AR) devices seek to create compelling visual experiences that merge virtual imagery with the natural world. These devices often rely on wearable near-eye display systems that can optically overlay digital images to the left and right eyes of the user separately. Ideally, the two eyes should be shown images with minimal radiometric differences (e.g., the same overall luminance, contrast, and color in both eyes), but achieving this binocular equality can be challenging in wearable systems with stringent demands on weight and size. Basic vision research has shown that a spectrum of potentially detrimental perceptual effects can be elicited by imagery with radiometric differences between the eyes, but it is not clear whether and how these findings apply to the experience of modern AR devices. In this work, we first develop a testing paradigm for assessing multiple aspects of visual appearance at once, and characterize five key perceptual factors when participants viewed stimuli with interocular contrast differences. In a second experiment, we simulate optical see-through AR imagery using conventional desktop LCD monitors and use the same paradigm to evaluate the multi-faceted perceptual implications when the AR display luminance differs between the two eyes. We also include simulations of monocular AR systems (i.e., systems in which only one eye sees the displayed image). Our results suggest that interocular contrast differences can drive several potentially detrimental perceptual effects in binocular AR systems, such as binocular luster, rivalry, and spurious depth differences. In addition, monocular AR displays tend to have more artifacts than binocular displays with a large contrast difference in the two eyes. A better understanding of the range and likelihood of these perceptual phenomena can help inform design choices that support high-quality user experiences in AR. 
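The interocular contrast differences studied above are easy to quantify. A minimal sketch (not the paper's stimulus pipeline; the grating and the 40% scaling are invented for illustration) using Michelson contrast per eye:

```python
import numpy as np

def michelson_contrast(image):
    """Michelson contrast of a grayscale image with values in [0, 1]."""
    lo, hi = image.min(), image.max()
    return (hi - lo) / (hi + lo) if (hi + lo) > 0 else 0.0

def interocular_contrast_ratio(left, right):
    """Ratio of the lower-contrast eye's contrast to the higher one's.
    1.0 means binocularly equal; smaller values mean a larger
    interocular contrast difference."""
    cl, cr = michelson_contrast(left), michelson_contrast(right)
    lo, hi = min(cl, cr), max(cl, cr)
    return lo / hi if hi > 0 else 1.0

# Example: a grating at full contrast in one eye, 40% contrast in the other
x = np.linspace(0, 4 * np.pi, 256)
grating = 0.5 + 0.5 * np.sin(x)          # contrast ~1.0
left_eye = grating
right_eye = 0.5 + 0.4 * (grating - 0.5)  # contrast scaled to ~0.4
print(interocular_contrast_ratio(left_eye, right_eye))  # ~0.4
```

A monocular AR system is the limiting case of this ratio going to zero, which is consistent with the abstract's finding that monocular displays produced the most artifacts.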
  3. SIGNIFICANCE

    Motion perception is an essential part of visual function. Understanding how people with low vision perceive motion can therefore inform rehabilitation strategies and assistive technology. Our study introduces the notion of Bayesian biases in motion perception and suggests that some people with low vision are susceptible to these systematic misperceptions.

    PURPOSE

    We aimed to develop a paradigm that can efficiently characterize motion percepts in people with low vision and compare their responses with well-known misperceptions made by people with typical vision when targets are hard to see.

    METHODS

    We recruited a small cohort of individuals with reduced acuity and contrast sensitivity (n = 5) as well as a comparison cohort with typical vision (n = 5) to complete a psychophysical study. Study participants were asked to judge the motion direction of a tilted rhombus that was either high or low contrast. In a series of trials, the rhombus oscillated vertically, horizontally, or diagonally. Participants indicated the perceived motion direction using a number wheel with 12 possible directions, and statistical tests were used to examine response biases.

    RESULTS

    All participants with typical vision showed systematic misperceptions well predicted by a Bayesian inference model. Specifically, their perception of vertical or horizontal motion was biased toward directions orthogonal to the long axis of the rhombus. They had larger biases for hard-to-see (low contrast) stimuli. Two participants with low vision had a similar bias, but with no difference between high- and low-contrast stimuli. The other participants with low vision were unbiased in their percepts or biased in the opposite direction.

    CONCLUSIONS

    Our results suggest that some people with low vision may misperceive motion in a systematic way similar to people with typical vision. However, we observed large individual differences. Future work will aim to uncover reasons for such differences and identify aspects of vision that predict susceptibility.

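The Bayesian biases described above can be illustrated with a toy one-dimensional Gaussian model (a sketch, not the study's actual model of rhombus motion; all directions and noise values are hypothetical). When contrast drops, the sensory likelihood widens, so the posterior estimate is pulled harder toward the prior direction, producing a larger bias:

```python
def map_direction(measured_deg, likelihood_sd, prior_deg, prior_sd):
    """Posterior mean for a Gaussian likelihood times a Gaussian prior.
    likelihood_sd grows as contrast drops, so low-contrast stimuli
    are pulled harder toward the prior direction."""
    w = prior_sd**2 / (prior_sd**2 + likelihood_sd**2)  # weight on the measurement
    return w * measured_deg + (1 - w) * prior_deg

true_dir = 90.0   # vertical motion (hypothetical encoding, in degrees)
prior = 45.0      # e.g., a direction orthogonal to the rhombus's long axis

high_contrast = map_direction(true_dir, likelihood_sd=5.0, prior_deg=prior, prior_sd=20.0)
low_contrast = map_direction(true_dir, likelihood_sd=15.0, prior_deg=prior, prior_sd=20.0)
print(high_contrast, low_contrast)  # the low-contrast estimate sits closer to the prior
```

The unbiased or oppositely biased low-vision participants in the study are the interesting deviation: their percepts are not captured by simply widening the likelihood in a model like this one.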
  4. Near-eye display systems for augmented reality (AR) aim to seamlessly merge virtual content with the user’s view of the real world. A substantial limitation of current systems is that they only present virtual content over a limited portion of the user’s natural field of view (FOV). This limitation reduces the immersion and utility of these systems. Thus, it is essential to quantify FOV coverage in AR systems and understand how to maximize it. It is straightforward to determine the FOV coverage for monocular AR systems based on the system architecture. However, stereoscopic AR systems that present 3D virtual content create a more complicated scenario because the two eyes’ views do not always completely overlap. The introduction of partial binocular overlap in stereoscopic systems can potentially expand the perceived horizontal FOV coverage, but it can also introduce perceptual nonuniformity artifacts. In this article, we first review the principles of binocular FOV overlap for natural vision and for stereoscopic display systems. We report the results of a set of perceptual studies that examine how different amounts and types of horizontal binocular overlap in stereoscopic AR systems influence the perception of nonuniformity across the FOV. We then describe how to quantify the horizontal FOV in stereoscopic AR when taking 3D content into account. We show that all stereoscopic AR systems result in variable horizontal FOV coverage and variable amounts of binocular overlap depending on fixation distance. Taken together, these results provide a framework for optimizing perceived FOV coverage and minimizing perceptual artifacts in stereoscopic AR systems for different use cases.
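The fixation-distance dependence of binocular overlap can be sketched with simple trigonometry. This toy model is a simplification assumed here, not the article's framework: it collapses both eye positions to a single origin (reasonable for distant content) and keeps only the vergence rotation of each eye's display frustum; the 40-degree per-eye FOV and 63 mm interpupillary distance are hypothetical device parameters:

```python
import math

def stereo_fov(ipd_m, half_fov_deg, fixation_dist_m):
    """Total horizontal FOV and binocular overlap (degrees) when the two
    eyes converge on a fixation point. Each eye's frustum rotates inward
    by the vergence half-angle, widening the total FOV and shrinking the
    overlap region."""
    vergence = math.degrees(math.atan((ipd_m / 2) / fixation_dist_m))
    total = 2 * half_fov_deg + 2 * vergence
    overlap = max(0.0, 2 * half_fov_deg - 2 * vergence)
    return total, overlap

# Near, intermediate, and far fixation distances
for d in (0.3, 1.0, 10.0):
    total, overlap = stereo_fov(0.063, 20.0, d)
    print(f"fixation {d:>4} m: total {total:.1f} deg, overlap {overlap:.1f} deg")
```

Even this crude model reproduces the qualitative result stated in the abstract: both total coverage and binocular overlap vary with fixation distance, with near fixation trading overlap for total FOV.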
  5. Optogenetics has emerged as an exciting tool for manipulating neural activity, which, in turn, can modulate behavior in live organisms. However, detecting the response to the optical stimulation requires electrophysiology with physical contact or fluorescent imaging at target locations, which is often limited by photobleaching and phototoxicity. In this paper, we show that phase imaging can report the intracellular transport induced by optogenetic stimulation. We developed a multimodal instrument that can both stimulate cells with subcellular spatial resolution and detect optical pathlength (OPL) changes with nanometer-scale sensitivity. We found that OPL fluctuations following stimulation are consistent with active organelle transport. Furthermore, the results indicate a broadening of the transport velocity distribution that is significantly greater in stimulated cells than in optogenetically inactive cells. It is likely that this label-free, contactless measurement of optogenetic response will provide an enabling approach to neuroscience.

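The velocity-distribution comparison can be illustrated on synthetic trajectories. This is a sketch of the general idea (comparing the spread of frame-to-frame velocities), not the paper's OPL analysis pipeline; the diffusion and drift parameters below are invented. Active, directed transport adds trajectory-specific drift on top of diffusion, which broadens the pooled velocity distribution:

```python
import numpy as np

def velocity_distribution_width(positions_um, dt_s):
    """Standard deviation of frame-to-frame velocities (um/s), pooled over
    a set of tracked trajectories. A wider distribution is consistent
    with more active (motor-driven) transport."""
    velocities = np.diff(positions_um, axis=1) / dt_s
    return velocities.std()

rng = np.random.default_rng(0)
dt = 0.1  # seconds between frames
n_tracks, n_frames = 50, 200

# Diffusion only: independent Gaussian steps
diffusive = np.cumsum(rng.normal(0, 0.05, (n_tracks, n_frames)), axis=1)

# Diffusion plus directed transport: each trajectory gets its own drift
drifts = rng.normal(0, 0.3, (n_tracks, 1)) * dt
active = np.cumsum(rng.normal(0, 0.05, (n_tracks, n_frames)) + drifts, axis=1)

print(velocity_distribution_width(diffusive, dt))  # narrower
print(velocity_distribution_width(active, dt))     # broader
```

In the paper's setting, positions are not tracked directly; transport is inferred from OPL fluctuations, but the statistical signature (a broadened velocity distribution under stimulation) is the same kind of comparison sketched here.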