Augmented reality (AR) integrates 3D virtual objects into the physical world in real time, while virtual reality (VR) immerses users in an interactive 3D virtual environment. The rapid development of AR and VR technologies has reshaped how people interact with the physical world. This presentation will outline results from two unique AR coastal engineering projects and one Web-based VR coastal engineering project, motivating the next stage in the development of an augmented reality package for coastal students, engineers, and planners.
The Effects of Laboratory Environment Type on Intermittent Sound Localization
Virtual reality (VR) and augmented reality (AR) are gaining commercial popularity. 3D sound guidelines for AR and VR are derived from psychoacoustic experiments performed in contrived, sterile laboratory settings. Often, these settings are expensive, inaccessible, and unattainable for researchers. The feasibility of conducting psychoacoustic experiments outside the laboratory remains unclear. To investigate, we explore 3D sound localization experiments in-lab (IL) and out-of-the-lab (OL). The IL condition was conducted as a traditional psychoacoustic experiment in a soundproof booth. The OL condition occurred in a quiet environment of the participants' choosing, using commercial-grade headphones. Localization performance did not vary significantly between OL and IL participants, with larger variation observed in the IL condition. Participants needed significantly more time to complete the experiment IL than OL. The results suggest that conducting headphone-based psychoacoustic experiments outside the laboratory is feasible when completion time is not a critical constraint.
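The headline comparisons above (similar localization accuracy, but significantly longer IL completion times) come down to two-sample tests on unpaired groups. A minimal sketch of such a comparison using Welch's t statistic, with entirely invented completion times rather than the study's data:

```python
from math import sqrt
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    n1, n2 = len(a), len(b)
    v1, v2 = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / sqrt(v1 / n1 + v2 / n2)

# Hypothetical completion times in minutes (not the study's data).
in_lab     = [42, 45, 39, 47, 44, 41]
out_of_lab = [31, 35, 29, 33, 30, 34]

t = welch_t(in_lab, out_of_lab)
print(f"Welch's t = {t:.2f}")  # positive t: IL times exceed OL times
```

In practice one would look up (or compute) the p-value for this statistic; Welch's version is appropriate here because the IL condition showed larger variance than OL.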
- Award ID(s): 1845324
- PAR ID: 10439428
- Date Published:
- Journal Name: Proceedings of the Human Factors and Ergonomics Society Annual Meeting
- Volume: 66
- Issue: 1
- ISSN: 2169-5067
- Page Range / eLocation ID: 1235 to 1239
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
Recent innovations in virtual and mixed-reality (VR/MR) technologies have enabled innovative hands-on training applications in high-risk/high-value fields such as medicine, flight, and worker safety. Here, we present a detailed description of a novel VR/MR tactile user interactions/interface (TUI) hardware and software development framework that enables the rapid and cost-effective no-code development, optimization, and distribution of fully authentic hands-on VR/MR laboratory training experiences in the physical and life sciences. We applied our framework to the development and optimization of an introductory pipette calibration activity that is often carried out in real chemistry and biochemistry labs. Our approach provides users with nuanced real-time feedback on both their psychomotor skills during data acquisition and their attention to detail when conducting data analysis procedures. The cost-effectiveness of our approach relative to traditional face-to-face science labs improves access to quality hands-on science lab experiences. Importantly, the no-code nature of this Hands-On Virtual-Reality (HOVR) Lab platform enables faculty to iteratively optimize VR/MR experiences to meet their students' targeted needs without costly software development cycles. Our platform also accommodates TUIs using either standard virtual-reality controllers (VR TUI mode) or fully functional hand-held physical lab tools (MR TUI mode). In the latter case, physical lab tools are strategically retrofitted with optical tracking markers to enable tactile, experimental, and analytical authenticity in scientific experimentation. Preliminary user study data highlight the strengths and weaknesses of our generalized approach regarding affective and cognitive student learning outcomes.
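The pipette calibration activity described above ultimately asks students to analyze replicate weighings gravimetrically. A rough sketch of that data-analysis step; the density constant, tolerance-free metrics, and readings are illustrative, not the platform's actual grading rules:

```python
WATER_DENSITY = 0.9982  # g/mL, approximate value near 20 degrees C

def calibration_report(masses_g, nominal_ml):
    """Return (mean volume, % accuracy error, % CV) for replicate weighings."""
    volumes = [m / WATER_DENSITY for m in masses_g]
    mean_v = sum(volumes) / len(volumes)
    # Systematic error: how far the mean dispensed volume sits from nominal.
    accuracy_pct = 100 * (mean_v - nominal_ml) / nominal_ml
    # Random error: sample standard deviation expressed as a coefficient of variation.
    sd = (sum((v - mean_v) ** 2 for v in volumes) / (len(volumes) - 1)) ** 0.5
    cv_pct = 100 * sd / mean_v
    return mean_v, accuracy_pct, cv_pct

# Ten replicate weighings of a nominal 1.000 mL dispense (hypothetical data).
masses = [0.996, 0.998, 0.997, 0.999, 0.995, 0.998, 0.997, 0.996, 0.998, 0.997]
mean_v, acc, cv = calibration_report(masses, 1.000)
print(f"mean {mean_v:.4f} mL, accuracy {acc:+.2f}%, CV {cv:.2f}%")
```

Feedback of the kind the abstract describes could then compare the accuracy and CV figures against per-volume tolerances.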
-
Extended reality (XR) technologies, such as virtual reality (VR) and augmented reality (AR), provide users, their avatars, and embodied agents a shared platform to collaborate in a spatial context. Although traditional face-to-face communication is limited by users' proximity, meaning that another human's non-verbal embodied cues become more difficult to perceive the farther one is away from that person, researchers and practitioners have started to look into ways to accentuate or amplify such embodied cues and signals to counteract the effects of distance with XR technologies. In this article, we describe and evaluate the Big Head technique, in which a human's head in VR/AR is scaled up relative to their distance from the observer as a mechanism for enhancing the visibility of non-verbal facial cues, such as facial expressions or eye gaze. To better understand and explore this technique, we present two complementary human-subject experiments in this article. In our first experiment, we conducted a VR study with a head-mounted display to understand the impact of increased or decreased head scales on participants' ability to perceive facial expressions as well as their sense of comfort and feeling of "uncanniness" over distances of up to 10 m. We explored two different scaling methods and compared perceptual thresholds and user preferences. Our second experiment was performed in an outdoor AR environment with an optical see-through head-mounted display. Participants were asked to estimate facial expressions and eye gaze, and identify a virtual human over large distances of 30, 60, and 90 m. In both experiments, our results show significant differences in minimum, maximum, and ideal head scales for different distances and tasks related to perceiving faces, facial expressions, and eye gaze, and we also found that participants were more comfortable with slightly bigger heads at larger distances. We discuss our findings with respect to the technologies used, along with implications and guidelines for practical applications that aim to leverage XR-enhanced facial cues.
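As a rough illustration of the Big Head idea, here is a minimal distance-dependent head-scaling function. The onset distance, per-metre gain, and clamp are invented parameters for the sketch, not either of the scaling methods the paper actually evaluates:

```python
def big_head_scale(distance_m, onset_m=2.0, gain=0.15, max_scale=3.0):
    """Return a head-scale multiplier for a virtual human seen at distance_m.

    Hypothetical curve: natural size up close, then linear growth with
    distance, clamped so the head never exceeds max_scale times natural size.
    """
    if distance_m <= onset_m:
        return 1.0                      # natural size within the onset distance
    scale = 1.0 + gain * (distance_m - onset_m)
    return min(scale, max_scale)        # clamp to avoid extreme sizes

print(big_head_scale(1.0))   # natural size up close
print(big_head_scale(10.0))  # moderately enlarged at 10 m
print(big_head_scale(90.0))  # clamped at far AR distances
```

A renderer would apply this multiplier to the head bone or head mesh each frame; the paper's finding that comfort depends on distance suggests the gain and clamp would need per-task tuning.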
-
Augmented reality (AR) has the potential to fundamentally transform science education by making learning of abstract science ideas tangible and engaging. However, little is known about how students interact with AR technologies and how these interactions may affect learning performance in science laboratories. This study examined high school students' navigation patterns and science learning with a mobile AR technology, developed by the research team, in laboratory settings. The AR technology allows students to conduct hands-on laboratory experiments and interactively explore various science phenomena covering biology, chemistry, and physics concepts. In this study, seventy ninth-grade students carried out science laboratory experiments in pairs to learn thermodynamics. Our cluster analysis identified two groups of students, which differed significantly in navigation length and breadth. The two groups demonstrated unique navigation patterns that revealed students' various ways of observing, describing, exploring, and evaluating science phenomena. These navigation patterns were associated with learning performance as measured by scores on lab reports. The results suggest the need to provide access to multiple representations and different types of interactions with these representations to support effective science learning, as well as to design representations and connections between representations that cultivate scientific reasoning skills and a nuanced understanding of scientific processes.
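The cluster analysis described above separates students by navigation length and breadth. A toy sketch of such a two-cluster split using a few Lloyd (k-means) iterations; the feature values and seeds are invented for illustration and do not reproduce the study's actual analysis:

```python
def assign(points, centers):
    """Assign each (length, breadth) point to the index of its nearest center."""
    def d2(p, c):
        return (p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2
    return [min(range(len(centers)), key=lambda i: d2(p, centers[i])) for p in points]

def update(points, labels, k):
    """Recompute each center as the mean of its assigned points.

    Assumes every cluster keeps at least one member, which holds for this toy data.
    """
    centers = []
    for i in range(k):
        members = [p for p, lab in zip(points, labels) if lab == i]
        centers.append(tuple(sum(coord) / len(members) for coord in zip(*members)))
    return centers

# (navigation length, navigation breadth) per student pair -- hypothetical values.
points = [(12, 3), (14, 4), (11, 3), (40, 9), (38, 8), (42, 10)]
centers = [points[0], points[3]]          # seed with one point from each apparent group
for _ in range(5):                        # a few Lloyd iterations suffice here
    labels = assign(points, centers)
    centers = update(points, labels, 2)

print(labels)  # [0, 0, 0, 1, 1, 1]: short/narrow vs long/broad navigators
```

Real analyses would standardize the features and validate the choice of two clusters (e.g., with silhouette scores) rather than fixing k in advance.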
-
Augmented reality (AR) is a powerful visualization tool for supporting the learning of scientific concepts across learners of various ages. AR can make information that is otherwise invisible visible in the physical world in real time. In this study, we examine a subset of data from a larger study (N=120), in which participant pairs interacted with an augmented sound-producing speaker. We explored the learning behaviors of eight pairs of learners (N=16) who participated in an unstructured physics activity under two conditions: with or without AR. Comparing behaviors between the two experimental conditions, we found that AR affected learning in four ways: participants in the AR condition (1) learned more about visual concepts (e.g., magnetic field structures) but less about nonvisual content (e.g., the relationship between electricity and physical movement); (2) stopped exploring the system sooner than non-AR participants; (3) used fewer aids in exploration and teaching; and (4) spent less time teaching their collaborators. We discuss implications of these results for designing collaborative learning activities with augmented reality.