Title: MoiréBoard: A Stable, Accurate and Low-cost Camera Tracking Method
Camera tracking is an essential building block in a myriad of HCI applications. For example, commercial VR devices are equipped with dedicated hardware, such as laser-emitting beacon stations, to enable accurate tracking of VR headsets; however, this hardware remains costly. Low-cost alternatives such as IMU sensors and visual markers exist, but they suffer from large tracking errors. In this work, we bring high accuracy and low cost together in MoiréBoard, a new 3-DOF camera position tracking method that leverages a seemingly irrelevant visual phenomenon, the moiré effect. Based on a systematic analysis of the moiré effect under camera projection, MoiréBoard requires neither power nor camera calibration. It can be made easily and at low cost (e.g., through 3D printing) and is ready to use with any stock mobile device with a camera. Its tracking algorithm is computationally efficient and runs at a high frame rate. Despite its simplicity, it tracks devices with an accuracy comparable to state-of-the-art commercial VR tracking systems.
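The abstract leaves the tracking math to the paper itself, but the leverage that makes the moiré effect useful for tracking is easy to illustrate: overlaying two gratings of slightly different pitch produces beat fringes that shift many times faster than the gratings themselves, so a camera can resolve displacements far below its native resolution. The NumPy sketch below is a toy illustration of that amplification only, not MoiréBoard's algorithm; the sinusoidal gratings, the pitch values, and the FFT phase readout are all assumptions chosen for clarity.

    import numpy as np

    # Two sinusoidal gratings with slightly different pitches; their product
    # contains a low-frequency "beat" component: the moire fringes.
    P1, P2 = 1.00, 1.05                  # grating pitches (arbitrary units, assumed)
    P_MOIRE = P1 * P2 / abs(P2 - P1)     # moire period = 21.0
    GAIN = P1 / abs(P2 - P1)             # fringe shift per unit grating shift = 20x

    N = 4096
    x = np.arange(N) * (10 * P_MOIRE / N)   # exactly 10 moire periods -> no FFT leakage

    def moire_intensity(shift):
        """Overlay (multiply) the two gratings, the second displaced by `shift`."""
        return (1 + np.cos(2 * np.pi * x / P1)) * (1 + np.cos(2 * np.pi * (x - shift) / P2))

    def beat_phase(signal):
        """Phase of the moire component, read off the FFT bin at 1/P_MOIRE."""
        spectrum = np.fft.rfft(signal - signal.mean())
        freqs = np.fft.rfftfreq(N, d=x[1] - x[0])
        k = np.argmin(np.abs(freqs - 1.0 / P_MOIRE))
        return np.angle(spectrum[k])

    true_shift = 0.03                    # a displacement far below one grating pitch
    dphi = beat_phase(moire_intensity(true_shift)) - beat_phase(moire_intensity(0.0))
    estimate = dphi / (2 * np.pi) * P2   # the beat phase encodes the tiny grating shift
    print(f"true shift {true_shift:.4f}, recovered {estimate:.4f}, fringe gain {GAIN:.0f}x")

With these pitches, a 0.03-unit physical shift moves the fringes by 0.6 units, a 20× optical gain; a moiré-based tracker can exploit exactly this kind of amplification to reach high accuracy from an ordinary camera image.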
Award ID(s): 1910839
PAR ID: 10414110
Author(s) / Creator(s):
Date Published:
Journal Name: The 34th Annual ACM Symposium on User Interface Software and Technology
Page Range / eLocation ID: 881 to 893
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
1. Eye tracking has become an essential human-machine interaction modality for providing immersive experiences in numerous virtual and augmented reality (VR/AR) applications that demand high throughput (e.g., 240 FPS), a small form factor, and enhanced visual privacy. However, existing eye tracking systems are still limited by (1) their large form factor, largely due to the bulky lens-based cameras they adopt; (2) the high communication cost between the camera and the backend processor; and (3) potential visual-privacy concerns, prohibiting their more extensive applications. To this end, we propose, develop, and validate a lensless FlatCam-based eye tracking algorithm and accelerator co-design framework, dubbed EyeCoD, that enables eye tracking systems with a much smaller form factor and boosted system efficiency without sacrificing tracking accuracy, paving the way for next-generation eye tracking solutions. On the system level, we advocate the use of lensless FlatCams instead of lens-based cameras to facilitate the small form factor needed in mobile eye tracking systems, which also leaves room for a dedicated sensing-processor co-design that reduces the required camera-processor communication latency. On the algorithm level, EyeCoD integrates a predict-then-focus pipeline that first predicts the region of interest (ROI) via segmentation and then focuses only on the ROI to estimate gaze directions, greatly reducing redundant computations and data movements. On the hardware level, we further develop a dedicated accelerator that (1) integrates a novel workload orchestration between the aforementioned segmentation and gaze estimation models, (2) leverages intra-channel reuse opportunities for depth-wise layers, (3) utilizes input feature-wise partitioning to save activation memory size, and (4) develops a sequential-write-parallel-read input buffer to alleviate the bandwidth requirement of the activation global buffer. On-silicon measurements and extensive experiments validate that EyeCoD consistently reduces both communication and computation costs, leading to overall system speedups of 10.95×, 3.21×, and 12.85× over general computing platforms (CPUs and GPUs) and the prior-art eye tracking processor CIS-GEP, respectively, while maintaining tracking accuracy. Code is available at https://github.com/RICE-EIC/EyeCoD.
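The predict-then-focus idea is the part of EyeCoD that translates most directly into code: a cheap first stage localizes a region of interest so that the expensive second stage only ever touches a small crop. The sketch below mimics that two-stage structure with deliberately crude NumPy stand-ins (a darkest-region "segmentation" and a centroid-based "gaze estimate"); EyeCoD's actual stages are learned models running on its dedicated accelerator, and every name here is invented for the example.

    import numpy as np

    def predict_roi(frame, roi_size=32):
        """Stage 1 stand-in: locate the darkest pixel (a crude pupil proxy)
        and return a square region of interest around it."""
        h, w = frame.shape
        y, x = np.unravel_index(np.argmin(frame), frame.shape)
        half = roi_size // 2
        y0 = int(np.clip(y - half, 0, h - roi_size))
        x0 = int(np.clip(x - half, 0, w - roi_size))
        return frame[y0:y0 + roi_size, x0:x0 + roi_size], (y0, x0)

    def estimate_gaze(roi, origin, frame_shape):
        """Stage 2 stand-in: map the dark-weighted centroid of the ROI to a
        normalized (x, y) position in [-1, 1], a toy proxy for gaze angles."""
        ys, xs = np.mgrid[0:roi.shape[0], 0:roi.shape[1]]
        weights = roi.max() - roi                 # darker pixels weigh more
        cy = origin[0] + (ys * weights).sum() / weights.sum()
        cx = origin[1] + (xs * weights).sum() / weights.sum()
        return 2 * cx / frame_shape[1] - 1, 2 * cy / frame_shape[0] - 1

    frame = np.full((240, 320), 200.0)
    frame[100:120, 180:200] = 10.0                # synthetic dark "pupil"
    roi, origin = predict_roi(frame)
    print("toy gaze estimate:", estimate_gaze(roi, origin, frame.shape))
    print("stage-2 pixels:", roi.size, "of", frame.size)

Even in this toy version, the second stage processes 1,024 pixels instead of 76,800; restricting the heavy model to the ROI is the source of the computation and data-movement savings the abstract describes.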
2. J. Y. C. Chen (Ed.)
Controlling and standardizing experiments is imperative for quantitative research methods. With the increasing availability of low-cost eye-tracking devices, gaze data are an important user input for quantitative analysis in many social science research areas, especially in combination with virtual reality (VR) and augmented reality (AR) technologies. This poses new challenges in providing a common, default interface for gaze data. This paper proposes GazeXR, a general eye-tracking system that interfaces with two eye-tracking devices and creates a hardware-independent virtual environment. We apply GazeXR to an in-class teaching experience analysis use case, using external eye-tracking hardware to collect gaze data for gaze-track analysis.
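The hardware-independent interface GazeXR argues for is, structurally, an adapter pattern: analysis code talks to one abstract gaze provider, and each eye tracker sits behind a thin device-specific adapter. GazeXR itself is not described at the code level here, so the Python sketch below only illustrates that general pattern; the GazeSample fields, adapter classes, and returned values are all invented for the example.

    from abc import ABC, abstractmethod
    from dataclasses import dataclass

    @dataclass
    class GazeSample:
        timestamp: float                 # seconds since stream start
        direction: tuple                 # normalized (x, y, z) gaze ray
        confidence: float                # tracker-reported quality in [0, 1]

    class GazeProvider(ABC):
        """The hardware-independent interface; one adapter per device."""
        @abstractmethod
        def poll(self) -> GazeSample: ...

    class VendorAAdapter(GazeProvider):  # hypothetical device A
        def poll(self) -> GazeSample:
            # A real adapter would call the vendor SDK; we fabricate a sample.
            return GazeSample(0.0, (0.0, 0.0, 1.0), 0.99)

    class VendorBAdapter(GazeProvider):  # hypothetical device B
        def poll(self) -> GazeSample:
            return GazeSample(0.0, (0.1, -0.05, 0.99), 0.95)

    def analyze(provider: GazeProvider, n_samples: int):
        """Analysis code sees only the common interface, never the device."""
        return [provider.poll() for _ in range(n_samples)]

    for device in (VendorAAdapter(), VendorBAdapter()):
        print(type(device).__name__, analyze(device, 2)[0])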
3. Eye-tracking is an essential tool in many fields, yet existing solutions are often of limited use for customized applications because of cost or lack of flexibility. We present OpenIris, an adaptable and user-friendly open-source framework for video-based eye-tracking. OpenIris is developed in C# with a modular design that allows further extension and customization, through plugins, for different hardware systems, tracking, and calibration pipelines. It can be controlled remotely via a network interface from other devices or programs. Eye movements can be recorded online from a camera stream or extracted offline by post-processing recorded videos. Example plugins have been developed to track eye motion in 3-D, including torsion. The currently implemented binocular pupil tracking pipelines can achieve frame rates of more than 500 Hz. With the OpenIris framework, we aim to fill a gap in the research tools available for high-precision, high-speed eye-tracking, especially in environments that require custom solutions not well served by commercial eye-trackers.
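OpenIris realizes its extensibility through C# plugins; the registry pattern underneath such designs is worth seeing in miniature. The Python sketch below shows only the general shape (pipelines registered under a name with the framework and selected at run time), not OpenIris's actual plugin API; the class names and result dictionaries are placeholders.

    # A minimal plugin registry, assumed for illustration only.
    PIPELINES = {}

    def register(name):
        """Class decorator that records a tracking pipeline under `name`."""
        def wrap(cls):
            PIPELINES[name] = cls
            return cls
        return wrap

    @register("pupil-2d")
    class Pupil2D:
        def process(self, frame):
            return {"pupil_center": (0, 0)}     # placeholder result

    @register("torsion-3d")
    class Torsion3D:
        def process(self, frame):
            return {"torsion_deg": 0.0}         # placeholder result

    def run_pipeline(name, frame):
        """The framework instantiates whichever plugin the user selected."""
        return PIPELINES[name]().process(frame)

    print(run_pipeline("pupil-2d", frame=None))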
4. We present a first-of-its-kind ultra-compact intelligent camera system, dubbed i-FlatCam, comprising a lensless camera and a computational (Comp.) chip. It highlights (1) a predict-then-focus eye tracking pipeline for boosted efficiency without compromising accuracy, (2) a unified compression scheme for single-chip processing and improved frames per second (FPS), and (3) a dedicated intra-channel reuse design for depth-wise convolutional layers (DW-CONV) to increase utilization. i-FlatCam demonstrates the first eye tracking pipeline with a lensless camera and achieves 3.16 degrees of accuracy, 253 FPS, 91.49 µJ/frame, and a 6.7 mm × 8.9 mm × 1.2 mm camera form factor, paving the way for next-generation augmented reality (AR) and virtual reality (VR) devices.
5. Simulators are an essential tool for behavioural and interaction research on driving, given the safety, cost, and experimental-control issues of on-road driving experiments. The most advanced simulators use expensive 360-degree projection systems to ensure visual fidelity, a full field of view, and immersion; however, similar visual fidelity can be achieved affordably with a virtual reality (VR) based visual interface. We present DReyeVR, an open-source VR-based driving simulator platform designed with behavioural and interaction research priorities in mind. DReyeVR (read "driver") is built on Unreal Engine and the CARLA autonomous vehicle simulator, and offers features such as eye tracking, a functional driving heads-up display (HUD) and vehicle audio, custom definable routes and traffic scenarios, experimental logging, replay capabilities, and compatibility with ROS. We describe the hardware required to deploy this simulator for under 5,000 USD, far less than commercially available simulators. Finally, we describe how DReyeVR can be leveraged to answer an interaction research question in an example scenario. DReyeVR is open source at https://github.com/HARPLab/DReyeVR