

Title: Mobile Augmented Reality in the Backyard: Families’ Outdoor Spaces as Sites of Exploration about Pollinators
From the first iteration of a design-based research study with 16 families, we investigated at-home intergenerational exploration of pollinators and plants. The team developed a mobile augmented reality app focused on plant-pollinator interactions. We investigated how AR elements influence families’ learning in their backyards. This analysis informs the design of site-independent mobile augmented reality apps that support families’ collaborative learning in outdoor, home-based settings.
Award ID(s):
1811424
NSF-PAR ID:
10272357
Editor(s):
de Vries, Erica; Hod, Yotam; Ahn, June
Journal Name:
15th International Conference of the Learning Sciences (ICLS) Proceedings
ISSN:
1573-4552
Page Range / eLocation ID:
721-724
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract

    We investigated how families experienced immersion as they collaboratively made sense of geologic time and geoscience processes during a place-based, learning-on-the-move (LOTM) experience mediated by a mobile augmented reality (MAR) app. Our team developed an MAR app, Time Explorers, that focused on how rock-water interactions shaped Appalachia over millions of years. Data were collected at the Children’s Garden at the Arboretum at Penn State. Data sources were videos of app usage, point-of-view camera recordings with audio capturing family conversations, and interviews from 17 families (51 people). The analytical technique was interaction analysis, in which episodes of family sense-making were identified and developed into qualitative vignettes focused on how immersion did or did not support learning about geoscience and geologic time. We analyzed how design elements supported sensory, actional, narrative, and social immersion through photo-taking, discussion prompts, and augmented reality visualizations. Findings showed that sensory and social immersion supported sense-making conversations and observational inquiry, while narrative and actional immersion supported deep family engagement with the geoscience content. At many micro-sites of learning, families engaged in multiple immersive processes where conversations, observational inquiry, and deep engagement with the geoscience came together during LOTM. This analysis contributes to the CSCL literature on theory related to LOTM in outdoor informal settings, while also providing design conjectures in an immersive, family-centered, place-based LOTM framework.

     
  2. From a design-based research study with 31 families, we share the design conjectures that guided the first two iterations of research. The team developed a mobile augmented reality app focused on water-rock interactions to make earth sciences appealing to rural families. We iterated on one design element, the augmented reality visualizations, to understand how these AR elements influence families’ learning behavior in a children’s garden cave as well as their resulting geosciences knowledge. This analysis is an example of how design conjecture maps can be used to support research and development of mobile computer-supported collaborative learning opportunities for families in outdoor, informal learning settings. 
  3. Chinn, C.; Tan, E.; Chan, C.; Kali, Y. (Eds.)
    From a design-based research study investigating rural families’ science learning with mobile devices, we share findings related to the intergenerational exploration of geological time concepts at a children’s garden at a university arboretum. The team developed a mobile augmented reality app, Time Explorers, focused on how millions of years of rock-water interactions shaped Appalachia. Data were video recordings of app usage and interviews from 17 families (51 people); videos were transcribed, coded, and developed into qualitative case studies. We present results related to design elements that supported sensory engagement (e.g., observation, touch) through AR visualizations related to geological history. This analysis contributes to the literature on informal learning environments, theory related to learning-on-the-move, and the role of sensory engagement with AR experiences in outdoor learning.
  4. Real-time on-device continual learning is needed for new applications such as home robots, user personalization on smartphones, and augmented/virtual reality headsets. However, this setting poses unique challenges: embedded devices have limited memory and compute capacity, and conventional machine learning models suffer from catastrophic forgetting when updated on non-stationary data streams. While several online continual learning models have been developed, their effectiveness for embedded applications has not been rigorously studied. In this paper, we first identify criteria that online continual learners must meet to effectively perform real-time, on-device learning. We then study the efficacy of several online continual learning methods when used with mobile neural networks. We measure their performance, memory usage, compute requirements, and ability to generalize to out-of-domain inputs.
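    To make the setting concrete, here is a minimal, hypothetical PyTorch sketch of one widely used online continual-learning baseline: experience replay with reservoir sampling. It is not the paper's benchmark code; the fixed-size buffer simply illustrates how memory stays bounded on an embedded device while each stream example triggers a single real-time update.

```python
# Hypothetical sketch (not the paper's benchmark code) of online continual
# learning with reservoir-sampled experience replay. The buffer's fixed
# capacity is what keeps memory bounded on embedded hardware.
import random
import torch
import torch.nn.functional as F

class ReplayBuffer:
    """Fixed-capacity buffer filled by reservoir sampling over the stream."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = []   # list of (x, y) tensors
        self.seen = 0    # total stream examples observed so far

    def add(self, x, y):
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append((x, y))
        else:
            # Reservoir sampling: keep each seen example with equal probability.
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.data[j] = (x, y)

    def sample(self, k):
        batch = random.sample(self.data, min(k, len(self.data)))
        xs, ys = zip(*batch)
        return torch.stack(xs), torch.stack(ys)

def online_update(model, optimizer, buffer, x, y, replay_k=8):
    """One real-time step: learn from the new example plus a small replay batch."""
    xs, ys = x.unsqueeze(0), y.unsqueeze(0)
    if buffer.data:
        rx, ry = buffer.sample(replay_k)
        xs, ys = torch.cat([xs, rx]), torch.cat([ys, ry])
    optimizer.zero_grad()
    loss = F.cross_entropy(model(xs), ys)
    loss.backward()
    optimizer.step()
    buffer.add(x, y)   # store the new example after updating
    return loss.item()
```

    In this style of method, the criteria the paper raises map onto concrete knobs: buffer capacity bounds memory, and the per-example replay batch size bounds compute per update.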
    more » « less
  5. An accurate understanding of omnidirectional environment lighting is crucial for high-quality virtual object rendering in mobile augmented reality (AR). In particular, to support reflective rendering, existing methods either leverage deep learning models to estimate lighting or use physical light probes to capture it, typically representing the result as an environment map. However, these methods often fail to provide visually coherent details or require additional setup. For example, the commercial framework ARKit uses a convolutional neural network that can generate realistic environment maps; however, the corresponding reflective rendering might not match the physical environment. In this work, we present the design and implementation of a lighting reconstruction framework called LITAR that enables realistic and visually coherent rendering. LITAR addresses several challenges of supplying lighting information for mobile AR. First, to address spatial variance, LITAR uses two-field lighting reconstruction, dividing the task into spatial-variance-aware near-field reconstruction and direction-aware far-field reconstruction. The resulting environment map allows reflective rendering with correct color tones. Second, LITAR uses two noise-tolerant data capturing policies to ensure data quality: guided bootstrapped movement and motion-based automatic capturing. Third, to handle the mismatch between mobile computation capability and the high computational demands of lighting reconstruction, LITAR employs two novel real-time environment map rendering techniques, multi-resolution projection and anchor extrapolation. These two techniques remove the need for time-consuming mesh reconstruction while maintaining visual quality. Lastly, LITAR provides several knobs that let mobile AR application developers make quality-performance trade-offs in lighting reconstruction. We evaluated the performance of LITAR using a small-scale testbed experiment and a controlled simulation. Our testbed-based evaluation shows that LITAR achieves more visually coherent rendering effects than ARKit. Our design of multi-resolution projection significantly reduces point cloud projection time from about 3 seconds to 14.6 milliseconds. Our simulation shows that LITAR achieves up to 44.1% higher PSNR on average than Xihe, a recent system, on two complex objects with physically-based materials.
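    The abstract names LITAR's rendering techniques without detailing them. Below is a hypothetical NumPy sketch of the general idea behind multi-resolution projection: splatting a colored point cloud into equirectangular environment maps at several resolutions, then filling holes in the fine map from coarser levels so that no mesh reconstruction is needed. All function names and details here are illustrative assumptions, not LITAR's published implementation.

```python
# Hypothetical sketch of multi-resolution point cloud projection for
# environment map reconstruction. Not LITAR's actual code; it illustrates
# why coarse-to-fine splatting can replace mesh reconstruction.
import numpy as np

def project_points(points, colors, height):
    """Splat 3D points (N,3) with RGB colors (N,3) into an HxWx3 env map."""
    width = 2 * height
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.linalg.norm(points, axis=1) + 1e-9
    theta = np.arctan2(x, z)                       # azimuth in [-pi, pi]
    phi = np.arcsin(np.clip(y / r, -1.0, 1.0))     # elevation in [-pi/2, pi/2]
    u = ((theta / (2 * np.pi) + 0.5) * width).astype(int) % width
    v = ((0.5 - phi / np.pi) * height).astype(int).clip(0, height - 1)
    env = np.zeros((height, width, 3))
    hit = np.zeros((height, width), dtype=bool)
    env[v, u] = colors                             # last write wins (no z-buffer)
    hit[v, u] = True
    return env, hit

def multi_resolution_projection(points, colors, heights=(16, 64, 256)):
    """Render coarse-to-fine; fill holes in finer maps from coarser ones."""
    base, _ = project_points(points, colors, heights[0])
    for h in heights[1:]:
        env, hit = project_points(points, colors, h)
        # Nearest-neighbor upsample of the coarser map to fill misses.
        scale = h // base.shape[0]
        up = np.repeat(np.repeat(base, scale, axis=0), scale, axis=1)
        env[~hit] = up[~hit]
        base = env
    return base
```

    The design intuition is that a sparse point cloud leaves holes when splatted at high resolution; borrowing values from coarser levels fills those holes at negligible cost, which is consistent with the projection-time reduction the abstract reports.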