

Title: Real-Time Data Analytics of COVID Pandemic Using Virtual Reality
Visualizing data effectively is critical for the discovery process in the age of big data. We are exploring the use of immersive virtual reality platforms for scientific visualization of COVID-19 pandemic data. We are interested in finding ways to better understand, perceive, and interact with multidimensional data in the fields of cognition technology and human-computer interaction. Immersive visualization leads to a better understanding and perception of relationships in the data. This paper presents a tool for immersive data visualization built on the Unity development platform. The tool is capable of visualizing real-time COVID-19 pandemic data for the fifty states in the USA. Immersion provides a better understanding of the data than traditional desktop visualization tools and leads to more human-centric situational awareness insights. This research effort aims to identify how graphical objects such as charts and bar graphs, depicted in virtual reality tools developed in accordance with an analyst’s mental model, can enhance an analyst’s situation awareness. Our results also suggest that users feel more satisfied when using immersive virtual reality data visualization tools, demonstrating the potential of immersive data analytics.
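The abstract describes a Unity-based tool driven by a real-time feed of per-state COVID-19 data. The paper's actual feed, schema, and C# code are not given; the sketch below (illustrative Python rather than Unity C#, with a hypothetical payload and hypothetical function names) shows the kind of preprocessing such a tool needs before per-state bar heights can be rendered:

```python
import json

# Hypothetical per-state payload; the paper's actual feed and schema are not specified.
SAMPLE_FEED = json.dumps([
    {"state": "AL", "positive": 499819, "death": 10148},
    {"state": "AK", "positive": 56886, "death": 305},
])

def parse_state_records(raw):
    """Parse a JSON feed of per-state records into (state, cases, deaths)
    tuples, sorted by case count so a bar chart can rank states."""
    records = json.loads(raw)
    rows = [(r["state"], r["positive"], r["death"]) for r in records]
    return sorted(rows, key=lambda t: t[1], reverse=True)

def normalize_heights(rows, max_height=1.0):
    """Scale case counts to bar heights in [0, max_height]; in a Unity scene
    this value would drive the Y scale of each bar object."""
    peak = max(cases for _, cases, _ in rows)
    return {state: max_height * cases / peak for state, cases, _ in rows}
```

In the actual tool the normalized heights would be applied to 3D bar objects each frame as the feed updates.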
Award ID(s):
2026412 2032344 1923986
NSF-PAR ID:
10286142
Author(s) / Creator(s):
; ;
Editor(s):
Chen, J.Y.C.; Fragomeni, G.
Date Published:
Journal Name:
Lecture Notes in Computer Science
Volume:
12770
ISSN:
0302-9743
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    Situational awareness provides the decision-making capability to identify, process, and comprehend big data. In our approach, situational awareness is achieved by integrating and analyzing multiple aspects of the data using stacked bar graphs and geographic representations. We provide a data visualization tool that presents COVID-19 pandemic data on top of geographical information. The combination of geospatial and temporal data provides the information needed to conduct situational analysis of the COVID-19 pandemic. Through interactivity, geographical maps can be viewed from different perspectives and offer insight into the dynamic aspects of the COVID-19 pandemic for the fifty states in the USA. We have overlaid dynamic information on top of a geographical representation in a way that is intuitive for decision making. We describe how modeling and simulation of data increase situational awareness, especially when coupled with immersive virtual reality interaction. This paper presents an immersive virtual reality (VR) environment and a mobile environment for data visualization using the Oculus Rift head-mounted display and smartphones. This work combines neural network predictions with human-centric situational awareness and data analytics to provide accurate, timely, and scientific strategies for combating and mitigating the spread of the coronavirus pandemic. Testing and evaluation of the data visualization tool have been done with a real-time feed of the COVID-19 pandemic data set in immersive, non-immersive, and mobile environments.
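The stacked bar graphs described above layer several data series into a single bar per state. As a minimal sketch of that layout step only (the paper does not publish its code; the function name and category labels here are hypothetical), each category's segment is stacked on the cumulative total of the segments below it:

```python
def stacked_segments(series):
    """Given per-category values for one state, e.g.
    {"active": 120, "recovered": 300, "deaths": 15},
    return (category, bottom, top) segments stacked bottom-up,
    as a stacked bar chart (2D or a stacked 3D bar in Unity) draws them."""
    segments, offset = [], 0
    for category, value in series.items():
        segments.append((category, offset, offset + value))
        offset += value
    return segments
```

Each (bottom, top) pair maps directly to the vertical extent of one colored segment of the state's bar.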
  2. Active shooter events cannot reasonably be anticipated; however, they occur more often than we think, and there is a critical need for an effective emergency preparedness plan that can increase the likelihood of saving lives and reducing casualties in the event of an active shooting incident. A major concern has been the lack of tools that allow for modeling and simulation of human behavior during emergency response training. Over the past few decades, virtual reality-based training for emergency response and decision making has been recognized as a novel alternative for disaster preparedness. This paper presents an immersive virtual reality (VR) training module for active shooter events in a building emergency response. Two immersive active shooter modules were developed: an occupant module and a security personnel module. Using an Oculus headset, the training module supports course-of-action analysis, visualization, and situational awareness for active shooter events. The immersive environment is implemented in Unity 3D, where the user has the option to enter the environment as security personnel or as an occupant of the building. The immersive VR training module offers a unique platform for training in emergency response and decision making. The platform allows for collecting data on different what-if scenarios in response to active shooter events that impact the actions of security personnel and occupants in a building. The data collected can be used to educate security personnel on how to reduce response times. Moreover, security personnel can be trained to respond to a variety of emergencies safely and securely without ever being exposed to real-world dangers.
  3.
    Real-time data visualization can enhance decision making and empower teams with human-centric situational awareness insights. Decision making relies on data that arrives at overwhelming velocity and volume, such that one cannot comprehend it without some layer of abstraction. This research effort aims to demonstrate real-time data visualization of the COVID-19 pandemic for the fifty states in the USA. Our proposed data visualization tool includes both conceptual and data-driven information. The visualization includes stacked bar graphs and geographic representations of the data, and offers situational awareness of the COVID-19 pandemic. This paper describes the development and testing of the data visualization tool using the Unity gaming engine. Testing has been done with a real-time feed of the COVID-19 data set in immersive, non-immersive, and mobile environments.
  4. This poster presents the use of Augmented Reality (AR) and Virtual Reality (VR) to tackle four of the “14 Grand Challenges for Engineering in the 21st Century” identified by the National Academy of Engineering. AR and VR are technologies of the present and the future. AR creates a composite view by adding digital content to a real-world view, often using the camera of a smartphone, while VR creates an immersive view in which the user’s view is cut off from the real world. The 14 challenges identify areas of science and technology that are achievable and sustainable to help people and the planet prosper. The four challenges tackled with AR/VR applications in this poster are: enhance virtual reality, advance personalized learning, provide access to clean water, and make solar energy affordable. The solar system VR application is aimed at tackling two of the engineering challenges: (1) enhance virtual reality and (2) advance personalized learning. The VR application assists the user in visualizing and understanding our solar system through a VR headset. It includes an immersive 360-degree view of our solar system in which the user can use controllers to interact with information related to celestial bodies and to teleport to different points in space for a closer look at the planets and the Sun. The user has six degrees of freedom. The AR application for water tackles the engineering challenge “provide access to clean water.” It shows information on drinking water accessibility and the eco-friendly use of bottles over plastic cups within the department buildings at Auburn University. The user of the application has an augmented view of drinking water information on a smartphone: every time the user points the smartphone camera toward a building, the application renders a composite view with the drinking water information associated with that building.
The Sun path visualization AR application tackles the engineering challenge “make solar energy affordable.” The application helps the user visualize the Sun’s path at a selected time and location. The Sun’s path is augmented in the camera view of the device when the user points the camera toward the sky. The application provides information on the Sun’s altitude and azimuth, as well as sunrise and sunset data for a selected day. This information can aid the user in effective solar panel placement. Using AR and VR technology to tackle these challenges enhances the user experience: the information from these applications is better curated and easily visualized, and thus readily understandable by the end user. Therefore, the use of AR and VR technology to tackle these types of engineering challenges looks promising.
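The Sun-path application reports the Sun's altitude and azimuth for a given time and location. The poster does not state which solar-position algorithm the app uses; as an illustration only, the sketch below implements a common textbook approximation (a cosine fit for solar declination plus the hour angle), which is accurate to roughly a degree:

```python
import math

def solar_position(lat_deg, day_of_year, solar_hour):
    """Approximate solar altitude and azimuth in degrees.
    Simplified model: cosine-fit declination and hour angle from local
    solar time; not necessarily the algorithm used by the actual app."""
    lat = math.radians(lat_deg)
    # Approximate solar declination for the given day of the year.
    decl = math.radians(
        -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10))))
    # Hour angle: 15 degrees per hour away from solar noon.
    ha = math.radians(15.0 * (solar_hour - 12.0))
    sin_alt = (math.sin(lat) * math.sin(decl)
               + math.cos(lat) * math.cos(decl) * math.cos(ha))
    alt = math.asin(max(-1.0, min(1.0, sin_alt)))
    # Azimuth measured clockwise from north; clamp guards rounding error.
    cos_az = ((math.sin(decl) - math.sin(alt) * math.sin(lat))
              / (math.cos(alt) * math.cos(lat)))
    az = math.acos(max(-1.0, min(1.0, cos_az)))
    if solar_hour > 12.0:  # afternoon: Sun is west of the meridian
        az = 2 * math.pi - az
    return math.degrees(alt), math.degrees(az)
```

At the equator near the equinox the Sun passes almost overhead at solar noon, and at high northern latitudes in late December it stays low, which gives a quick sanity check on the model.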
  5. Dini, Petre (Ed.)
    The National Academy of Engineering’s “Fourteen Grand Challenges for Engineering in the Twenty-First Century” identifies challenges in science and technology that are both feasible and sustainable to help people and the planet prosper. Four of these challenges are: advance personalized learning, enhance virtual reality, make solar energy affordable, and provide access to clean water. In this work, the authors discuss the development of applications using immersive technologies, such as Virtual Reality (VR) and Augmented Reality (AR), and their significance in addressing these four challenges. The Drinking Water AR mobile application helps users easily locate drinking water sources on the Auburn University (AU) campus, thus providing easy access to clean water. The Sun Path mobile application helps users visualize the Sun’s path at any given time and location. Students study the Sun’s path in various fields but often have a hard time visualizing and conceptualizing it, so the application can help. Similarly, the application could assist users with efficient solar panel placement: architects often study the Sun’s path to evaluate solar panel placement at a particular location, and effective placement helps optimize the efficiency of solar energy use. The Solar System Oculus Quest VR application enables users to view all eight planets and the Sun in the solar system. Planets are simulated to mimic their position, scale, and rotation relative to the Sun. Using the Oculus Quest controllers, rendered as human hands in the scene, users can teleport within the world view and get closer to each planet and the Sun for a better view of the objects and the text associated with them. As a result, tailored learning is aided and virtual reality is enhanced. In a camp held virtually due to COVID-19, K-12 students were introduced to the concept and usability of the applications. A Likert-scale metric was used to assess the efficacy of application usage. The data show that participants of this camp benefited from an immersive learning experience that allowed for simulation with the inclusion of VR and AR.
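The Solar System application places planets at positions and scales relative to the Sun. The actual scene parameters are not published; the sketch below (hypothetical scale factor and function name, illustrative Python rather than Unity C#) shows one way to compute a planet's scene coordinates on a circular orbit around a Sun at the origin, using a y-up convention as in Unity:

```python
import math

# Mean orbital radii in astronomical units (inner planets shown for brevity);
# the scene scale of 5 metres per AU is a hypothetical choice.
ORBIT_AU = {"Mercury": 0.39, "Venus": 0.72, "Earth": 1.0, "Mars": 1.52}

def scene_position(planet, angle_deg, metres_per_au=5.0):
    """Place a planet on a circular orbit around a Sun at the origin,
    returning (x, y, z) scene coordinates with y as the up axis."""
    r = ORBIT_AU[planet] * metres_per_au
    a = math.radians(angle_deg)
    return (r * math.cos(a), 0.0, r * math.sin(a))
```

Advancing `angle_deg` each frame would animate the orbit; in the real app each planet's scale and rotation would be set analogously from tabulated values.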