Title: Crime Data Visualization Using Virtual Reality and Augmented Reality
Award ID(s):
2319752, 2321574, 2321539, 2118285
PAR ID:
10544802
Author(s) / Creator(s):
Publisher / Repository:
IEEE
Date Published:
ISBN:
979-8-3503-6151-3
Page Range / eLocation ID:
646 to 651
Format(s):
Medium: X
Location:
Las Vegas, NV, USA
Sponsoring Org:
National Science Foundation
More Like this
  1. Ghandeharizadeh S. (Ed.)
    This paper provides an overview of different forms of reality, comparing and contrasting them with one another. It argues that the definition of the term "reality" is ambiguous, which motivates examining its elements from a technology standpoint, e.g., biological, 3D printed, and Flying Light Speck illuminations.
  2. As the development of extended reality technologies brings us closer to what some call the metaverse, it is valuable to investigate how our perception of color translates from physical, reflective objects to emissive and transparent virtual renderings. Colorimetry quantifies color stimuli and color differences, and color appearance models account for adaptation and illuminance level. However, these tools do not extend satisfactorily to the novel viewing experiences of extended reality. Ongoing research aims to understand the perception of layered virtual stimuli in optical see-through augmented reality with the goal of improving or extending color appearance models. This will help ensure robust, predictable color reproduction in extended reality experiences.
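    The item above notes that standard colorimetric difference metrics were developed for reflective surfaces and do not extend cleanly to optical see-through AR. For reference, the sketch below computes the classical CIE76 color difference (ΔE*ab), the simplest such metric; the sample CIELAB values are purely illustrative and are not data from the cited work.

    ```python
    import math

    def delta_e_cie76(lab1, lab2):
        """CIE76 color difference: Euclidean distance between two CIELAB triples."""
        dL = lab1[0] - lab2[0]
        da = lab1[1] - lab2[1]
        db = lab1[2] - lab2[2]
        return math.sqrt(dL * dL + da * da + db * db)

    # Illustrative (made-up) values: a physical reflective patch vs. an emissive AR rendering.
    physical_patch = (52.0, 41.5, 28.3)   # (L*, a*, b*)
    ar_rendering   = (55.1, 38.9, 31.0)
    print(f"Delta E*ab = {delta_e_cie76(physical_patch, ar_rendering):.2f}")
    ```

    A ΔE*ab near 2.3 is often cited as a just-noticeable difference for reflective samples; the research summarized above questions whether such thresholds carry over to layered, emissive, and transparent stimuli.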
  3. As applications for virtual reality (VR) and augmented reality (AR) technology increase, it will be important to understand how users perceive their action capabilities in virtual environments. Feedback about actions may help to calibrate perception for action opportunities (affordances) so that action judgments in VR and AR mirror actors’ real abilities. Previous work indicates that walking through a virtual doorway while wielding an object can calibrate the perception of one’s passability through feedback from collisions. In the current study, we aimed to replicate this calibration through feedback using a different paradigm in VR while also testing whether this calibration transfers to AR. Participants held a pole at 45° and made passability judgments in AR (pretest phase). Then, they made passability judgments in VR and received feedback on those judgments by walking through a virtual doorway while holding the pole (calibration phase). Participants then returned to AR to make posttest passability judgments. Results indicate that feedback calibrated participants’ judgments in VR. Moreover, this calibration transferred to the AR environment. In other words, after experiencing feedback in VR, passability judgments in VR and in AR became closer to an actor’s actual ability, which could make training applications in these technologies more effective.
  4. Augmented Reality (AR) experiences tightly associate virtual content with environmental entities. However, the dissimilarity of different environments limits adaptive AR content behavior under large-scale deployment. We propose ScalAR, an integrated workflow enabling designers to author semantically adaptive AR experiences in Virtual Reality (VR). First, potential AR consumers collect local scenes with a semantic understanding technique. ScalAR then synthesizes numerous similar scenes. In VR, a designer authors the AR content’s semantic associations and validates the design while immersed in the provided scenes. We adopt a decision-tree-based algorithm that fits the designer’s demonstrations as a semantic adaptation model for deploying the authored AR experience in a physical scene. We further showcase two application scenarios authored with ScalAR and conduct a two-session user study in which the quantitative results demonstrate the accuracy of the AR content rendering and the qualitative results show the usability of ScalAR.
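    The ScalAR abstract above describes fitting a designer's VR demonstrations with a decision-tree model that maps scene semantics to AR content decisions. The paper's actual feature encoding is not given here, so the following is a hypothetical, minimal sketch of that general idea using scikit-learn; the column layout, class labels, and all numbers are assumptions for illustration only.

    ```python
    from sklearn.tree import DecisionTreeClassifier

    # Hypothetical encoding of designer demonstrations: one row per candidate anchor
    # in a synthesized VR scene. Columns (assumed, not from the paper):
    # [surface_type (0=floor, 1=table, 2=wall), surface_area_m2, height_m, near_seating (0/1)]
    demonstrations = [
        [1, 0.6, 0.75, 1],   # tabletop near seating
        [1, 0.4, 0.72, 0],
        [0, 4.0, 0.00, 1],   # open floor area
        [2, 2.5, 1.60, 0],   # wall segment at eye height
        [2, 1.2, 1.50, 1],
    ]
    # Designer's decision for each anchor: 0 = no content, 1 = place widget, 2 = place poster
    decisions = [1, 1, 0, 2, 2]

    # Fit a shallow decision tree as the semantic adaptation model.
    model = DecisionTreeClassifier(max_depth=3, random_state=0)
    model.fit(demonstrations, decisions)

    # Deployment: classify anchors detected in a new physical scene the same way.
    new_scene_anchors = [[1, 0.5, 0.74, 1], [2, 3.0, 1.55, 0]]
    print(model.predict(new_scene_anchors))   # -> [1 2] for these toy demonstrations
    ```

    A shallow tree suits this role because a handful of demonstrations already yields an interpretable rule (e.g., "wall segments at eye height get posters") that can be applied to anchors detected in an unseen physical scene.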
  5. This project addresses the urgent need for inclusive and scalable robotics training in architecture, engineering, and construction (AEC) through the integration of artificial intelligence (AI) and extended reality (XR) technologies. In collaboration with three Minority Serving Institutions (Florida International University, Arizona State University, and University of Hawai‘i at Mānoa), we developed and tested immersive, adaptive learning environments that personalize robotics education for diverse student populations. These efforts include a VR-based curriculum for industrial robotics, an AR curriculum for environmental sensing technologies, and an overarching Robotics Academy framework that promotes open knowledge exchange and workforce connectivity. By combining real-time performance analytics, natural language processing, and biometric inputs, our systems support individualized learning paths and help mitigate algorithmic bias. This research advances equitable access to robotics education and provides a replicable model for technology-driven workforce development in the AEC sector. Ongoing evaluation demonstrates improved learner engagement, accessibility, and cross-platform skill transferability. 