

Search for: All records

Creators/Authors contains: "Chen Y."

Note: Clicking a Digital Object Identifier (DOI) link will take you to an external site maintained by the publisher. Some full-text articles may not be available free of charge during the embargo period.

Some links on this page may take you to non-federal websites. Their policies may differ from those of this site.

  1. Spatial navigation involves the use of various cues. This study examined how cue conflict influences navigation by contrasting landmarks and optic flow. Participants estimated spatial distances under different levels of cue conflict: minimal conflict, large conflict, and large conflict with explicit awareness of landmark instability. Whereas increased cue conflict alone had little behavioral impact, adding explicit awareness reduced reliance on landmarks and impaired the precision of spatial localization based on them. To understand the underlying mechanisms, we tested two cognitive models: a Bayesian causal inference (BCI) model and a non-Bayesian sensory disparity model. The BCI model provided a better fit to the data, revealing two independent mechanisms for reduced landmark reliance: increased sensory noise for unstable landmarks and lower weighting of unstable landmarks when landmarks and optic flow were judged to originate from different causes. Surprisingly, increased cue conflict did not decrease the prior belief in a common cause, even when explicit awareness of landmark instability was imposed. Additionally, cue weighting in the same-cause judgment was determined by bottom-up sensory reliability, while in the different-cause judgment, it correlated with participants’ subjective evaluation of cue quality, suggesting a top-down metacognitive influence. The BCI model further identified key factors contributing to suboptimal cue combination in minimal cue conflicts, including the prior belief in a common cause and prior knowledge of the target location. Together, these findings provide critical insights into how navigators resolve conflicting spatial cues and highlight the utility of the BCI model in dissecting cue interaction mechanisms in navigation. 
    Free, publicly-accessible full text available May 9, 2026
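The Bayesian causal inference (BCI) account in the abstract above can be made concrete with a small numerical sketch. The formulas follow the standard BCI model for two Gaussian cues, here labeled landmark and optic flow: the model computes the posterior probability that both cues arose from a common cause, then model-averages the fused (reliability-weighted) estimate and the single-cue estimate. All variable names, noise variances, and prior settings below are illustrative assumptions, not parameters fitted in the study.

```python
import math

def gauss(x, mu, var):
    """Gaussian density with mean mu and variance var."""
    return math.exp(-0.5 * (x - mu) ** 2 / var) / math.sqrt(2 * math.pi * var)

def bci_estimate(x_lm, x_of, var_lm, var_of, mu_p, var_p, p_common):
    """Return (posterior P(common cause), model-averaged location estimate).

    x_lm, x_of   : landmark and optic-flow measurements (illustrative)
    var_lm, var_of: sensory noise variances for each cue
    mu_p, var_p  : Gaussian prior over the target location
    p_common     : prior belief that the cues share a common cause
    """
    # Likelihood of both measurements under one shared source,
    # integrating the source location out against the prior.
    denom = var_lm * var_of + var_lm * var_p + var_of * var_p
    like_c1 = math.exp(-0.5 * ((x_lm - x_of) ** 2 * var_p
                               + (x_lm - mu_p) ** 2 * var_of
                               + (x_of - mu_p) ** 2 * var_lm) / denom) \
              / (2 * math.pi * math.sqrt(denom))
    # Likelihood under independent sources: each cue explained separately.
    like_c2 = gauss(x_lm, mu_p, var_lm + var_p) * gauss(x_of, mu_p, var_of + var_p)
    # Posterior probability that the cues share a common cause.
    post_c1 = (like_c1 * p_common
               / (like_c1 * p_common + like_c2 * (1 - p_common)))
    # Reliability-weighted fusion (common cause) vs. landmark-only estimate.
    fused = ((x_lm / var_lm + x_of / var_of + mu_p / var_p)
             / (1 / var_lm + 1 / var_of + 1 / var_p))
    seg_lm = (x_lm / var_lm + mu_p / var_p) / (1 / var_lm + 1 / var_p)
    # Model averaging: weight the two candidate estimates by the causal posterior.
    return post_c1, post_c1 * fused + (1 - post_c1) * seg_lm
```

With these (made-up) settings, a small cue conflict yields a high common-cause posterior and near-optimal fusion, while a large conflict drives the posterior down and the estimate toward the single-cue answer — the qualitative pattern the model framework predicts, separate from the study's empirical finding that explicit awareness, not conflict size alone, changed behavior.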
  2. Free, publicly-accessible full text available January 10, 2026
  3. Free, publicly-accessible full text available December 1, 2025
In this work, we introduce SEESys, the first system to provide online pose error estimation for Simultaneous Localization and Mapping (SLAM). Unlike prior offline error estimation approaches, the SEESys framework efficiently collects real-time system features and delivers accurate pose error magnitude estimates with low latency. This enables real-time quality-of-service information for downstream applications. To achieve this goal, we develop a SLAM system run-time status monitor (RTS monitor) that performs feature collection with minimal overhead, along with a multi-modality attention-based Deep SLAM Error Estimator (DeepSEE) for error estimation. We train and evaluate SEESys using both public SLAM benchmarks and a diverse set of synthetic datasets, achieving a pose error estimation RMSE of 0.235 cm, 15.8% lower than the baseline. Additionally, we conduct a case study showcasing SEESys in a real-world scenario, where it is applied to a real-time audio error advisory system for human operators of a SLAM-enabled device. The results demonstrate that SEESys provides error estimates with an average end-to-end latency of 37.3 ms, and the audio error advisory reduces pose tracking error by 25%.
    Free, publicly-accessible full text available November 8, 2025
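The accuracy figure reported above is a root-mean-square error over pose-error-magnitude estimates, compared against a baseline estimator. A minimal sketch of how such an RMSE and the relative reduction are computed (the function names and sample numbers are illustrative, not from the paper):

```python
import math

def rmse(estimates, ground_truth):
    """Root-mean-square error between estimated and true pose-error magnitudes."""
    sq_errors = [(e - g) ** 2 for e, g in zip(estimates, ground_truth)]
    return math.sqrt(sum(sq_errors) / len(sq_errors))

def relative_reduction(ours, baseline):
    """Fractional improvement of our RMSE over a baseline RMSE."""
    return (baseline - ours) / baseline
```

For example, an RMSE of 0.235 cm against a hypothetical baseline RMSE of about 0.279 cm corresponds to roughly the 15.8% reduction reported.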
  5. Free, publicly-accessible full text available December 1, 2025
  6. Free, publicly-accessible full text available July 27, 2025
  7. Free, publicly-accessible full text available September 30, 2025
  8. Free, publicly-accessible full text available July 15, 2025