

Title: Developing an Augmented Reality-Based Interactive Learning System with Real-Time Location and Motion Tracking
This study aims to develop an interactive learning solution for engineering education by combining augmented reality (AR), Near-Field Electromagnetic Ranging (NFER), and motion-capture technologies. We built an instructional system that integrates AR devices and real-time positioning sensors to improve learners' interactive experience in an immersive learning environment, while the motion, eye-tracking, and location-tracking data collected from learners enable instructors to understand their learning patterns. To test the usability of the system, two AR-based lectures of different difficulty levels were developed (Lecture 1 - Easy vs. Lecture 2 - Hard), and System Usability Scale (SUS) responses were collected from thirty participants. We did not observe a significant usability difference between Lecture 1 and Lecture 2. Through the experiment, we demonstrated the robustness of this AR learning system and its unique promise in integrating AR teaching with other technologies.
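The SUS scores mentioned above are conventionally computed with the standard ten-item scoring rule (odd-numbered items contribute response − 1, even-numbered items contribute 5 − response, summed and scaled by 2.5 to a 0-100 range). A minimal sketch in Python; the function name and the example responses are illustrative, not the study's data:

```python
def sus_score(responses):
    """Compute the 0-100 System Usability Scale score for one participant.

    responses: list of 10 integers in 1..5, in questionnaire order.
    Odd-numbered items (index 0, 2, ...) are positively worded and
    contribute (response - 1); even-numbered items contribute (5 - response).
    """
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(responses))
    return total * 2.5

# Example: a fairly positive participant
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 2]))  # -> 80.0
```

A mean score around 68 is commonly treated as the benchmark for average usability, so comparing each lecture's mean SUS score against the other (and against that benchmark) is the usual analysis.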
Award ID(s):
2202108
PAR ID:
10513854
Author(s) / Creator(s):
Publisher / Repository:
Springer Nature Switzerland
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. In this study, we explore the impact of incorporating a virtual instructor with realistic lip-syncing into an augmented reality (AR) learning environment. The study focuses on whether this enhancement can reduce students' mental workload and improve system usability and performance in AR learning. The research stems from previous feedback indicating that a virtual instructor without facial movements was perceived as "creepy" and "distracting." The updated virtual instructor includes facial animations, such as blinking and synchronized lip movements, especially during lecture explanations. The study aims to determine whether there are significant differences in mental workload and usability between the AR systems with and without the enhanced virtual instructor. The study found significant differences in usability scores for some questionnaire items; however, there was no significant difference in mental workload between the two systems.
  2. This paper presents EARFace, a system that shows the feasibility of tracking facial landmarks for 3D facial reconstruction using in-ear acoustic sensors embedded within smart earphones. This enables a number of applications in facial expression tracking, user interfaces, AR/VR, affective computing, accessibility, and more. While conventional vision-based solutions break down under poor lighting and occlusions, and also raise privacy concerns, earphone platforms are robust to ambient conditions while being privacy-preserving. In contrast to prior work on earable platforms that performs outer-ear sensing for facial motion tracking, EARFace shows the feasibility of completely in-ear sensing with a natural earphone form factor, thus enhancing wearing comfort. The core intuition exploited by EARFace is that the shape of the ear canal changes with the movement of facial muscles during facial motion. EARFace tracks these changes in ear-canal shape by measuring the ultrasonic channel frequency response (CFR) of the inner ear, ultimately tracking facial motion. A transformer-based machine learning (ML) model is designed to exploit spectral and temporal relationships in the ultrasonic CFR data to predict the user's facial landmarks with an accuracy of 1.83 mm. Using these predicted landmarks, a 3D graphical model of the face that replicates the user's precise facial motion is then reconstructed. Domain adaptation is further performed by adapting layer weights using group-wise, differential learning rates, which decreases the training overhead of EARFace. The transformer-based ML model runs on smartphone devices with a processing latency of 13 ms and a low overall power-consumption profile. Finally, usability studies indicate higher comfort levels when wearing EARFace's earphone platform compared with alternative form factors.
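The CFR measurement at the heart of the abstract above can be illustrated in its simplest frequency-domain form: divide the spectrum of the received signal by the spectrum of the known transmitted probe. This is only a sketch under idealized assumptions (a known probe signal and a noiseless, attenuation-only channel); the function name and synthetic signals are not from the paper, whose pipeline is considerably more involved:

```python
import numpy as np

def channel_frequency_response(tx, rx, eps=1e-12):
    """Estimate the channel frequency response (CFR) as the per-bin
    ratio of the received spectrum to the transmitted spectrum."""
    TX = np.fft.rfft(tx)
    RX = np.fft.rfft(rx)
    return RX / (TX + eps)  # eps guards against division by a zero bin

# Toy check: a channel that only attenuates by half yields |CFR| ~= 0.5.
rng = np.random.default_rng(0)
tx = rng.standard_normal(2048)   # stand-in for an ultrasonic probe signal
rx = 0.5 * tx                    # idealized attenuation-only channel
cfr = channel_frequency_response(tx, rx)
print(np.abs(cfr).mean())        # ~0.5
```

In a real system the CFR would change over time as facial muscles deform the ear canal, and those per-frame CFR vectors are what a downstream model (a transformer, in the paper) consumes.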
  3. With the growing need for augmented reality (AR) technology, understanding and optimizing study behaviors in AR learning environments has become crucial. However, one major drawback of AR learning is the absence of effective feedback mechanisms for students. To overcome this challenge, we introduced metacognitive monitoring feedback. Additionally, we created a location-based AR learning environment utilizing a real-time indoor tracking system to further enhance student learning. This study focuses on the positive impact of metacognitive monitoring feedback in a location-based AR learning environment. Our hypothesis posits that regularly providing students with feedback on their metacognitive monitoring within this new AR learning system positively influences their metacognitive awareness. The study's findings confirm that frequent exposure to such feedback significantly enhances the Metacognitive Awareness Inventory (MAI) scores. Participants who received continuous feedback demonstrated a significant increase in MAI scores compared to those who received feedback only once after the lecture. This improvement is achieved by influencing student calibration and directly enhancing their metacognitive awareness. 
  4. Personalized learning, which customizes content and instructional sequences to account for differences in ability, experience, and sociocultural backgrounds, holds great promise for transforming education. This transformation is increasingly driven by significant advancements in Artificial Intelligence (AI). AI enables detailed analysis and reporting of learners' performance data, paving the way for the development of intelligent adaptive learning systems that offer personalized feedback aligned with each learner’s unique needs and progress. In parallel, immersive technologies are playing a pivotal role in enhancing educational experiences. Technologies such as Virtual Reality (VR) and Augmented Reality (AR) create engaging, interactive environments that deepen learners' understanding and retention of complex concepts. Dr. Vassigh's presentation explores the integration of AI and VR in education, illustrated through a case study from an ongoing project. The talk will highlight the refinement of learning processes through these technologies and demonstrate how they can impact learner engagement and performance. 
  5. Augmented Reality revolutionises education by enhancing learning with interactive, immersive experiences. However, the impact of long-term AR use, particularly in terms of physical demand, within educational environments remains poorly understood. This study investigates the relationship between AR engagement and physical demand, utilising motion capture technology, NASA Task Load Index, and HoloLens eye-tracking to quantify user posture, engagement, and perceived workload. We hypothesise that prolonged AR interaction results in a change in slouching scores, indicating increased fatigue. The results show a strong correlation between the slouching score and the NASA-TLX physical demand score. Our study lays the groundwork for incorporating predictive modelling to develop proactive physical demand measures. 
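The reported relationship between the slouching score and the NASA-TLX physical-demand score is an ordinary bivariate correlation; a minimal sketch with invented, illustrative values (not the study's data):

```python
import numpy as np

# Hypothetical paired observations: one slouching score and one NASA-TLX
# physical-demand rating per participant (synthetic values for illustration).
slouch = np.array([0.2, 0.4, 0.5, 0.7, 0.8, 0.9])
tlx_physical = np.array([5, 30, 40, 55, 70, 85])

# Pearson correlation coefficient between the two measures.
r = np.corrcoef(slouch, tlx_physical)[0, 1]
print(f"Pearson r = {r:.2f}")
```

With real data one would also report a p-value (e.g. via `scipy.stats.pearsonr`) and check the linearity assumption before interpreting the coefficient.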