
Title: Augmented Reality: Telehealth Demonstration Application
Augmented reality (AR) is a technology that will change the way we work and live. The Microsoft HoloLens, an AR headset, renders interactive virtual components into real-world space and displays them in front of the user’s eyes, so the data needed to complete a real-world task is always available. These properties make the HoloLens well suited to healthcare settings. Moving to a hands-free environment offers benefits such as logging data in sterile environments without the repeated re-sterilization required after touching paper or a tablet. This project developed an AR application that presents a care plan, established by the patient’s doctor, so the patient can complete daily tasks without a health care worker’s supervision. The application displays the medications the patient needs to take, the daily tasks to complete, and the health data to record, and it allows the physician to retrieve useful patient information regularly without scheduled physicals. The project sets a baseline, providing future developers at the University of New Hampshire with documentation, research, and a sample application to assist in the design and construction of more complex applications.
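The abstract's care plan tracker (medications to take, tasks to complete, health data to record) can be pictured as a small data model. The sketch below is purely illustrative — the class and field names (`CarePlan`, `MedicationEntry`, `HealthReading`, `pending_medications`) are hypothetical and not taken from the project's actual codebase, which would be a Unity/C# HoloLens application:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class MedicationEntry:
    name: str
    dose: str
    taken: bool = False          # patient checks this off in the AR view

@dataclass
class HealthReading:
    metric: str                  # e.g. "blood pressure"
    value: str
    recorded_on: date

@dataclass
class CarePlan:
    """Daily care plan assigned by the physician (hypothetical model)."""
    patient_id: str
    medications: List[MedicationEntry] = field(default_factory=list)
    tasks: List[str] = field(default_factory=list)
    readings: List[HealthReading] = field(default_factory=list)

    def pending_medications(self) -> List[str]:
        # Medications still to be taken today, shown to the patient.
        return [m.name for m in self.medications if not m.taken]

    def record_reading(self, metric: str, value: str) -> None:
        # Health data logged by the patient, later retrieved by the physician.
        self.readings.append(HealthReading(metric, value, date.today()))
```

A structure like this would let the headset render the pending list while the physician queries the accumulated `readings` between visits.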
Award ID(s):
1659377
NSF-PAR ID:
10225096
Journal Name:
Practice and Experience in Advanced Research Computing (PEARC ’20)
Page Range or eLocation-ID:
452 to 455
Sponsoring Org:
National Science Foundation
More Like this
  1. In a seminal article on augmented reality (AR) [7], Ron Azuma defines AR as a variation of virtual reality (VR), which completely immerses a user inside a synthetic environment. Azuma says, “In contrast, AR allows the user to see the real world, with virtual objects superimposed upon or composited with the real world” [7] (emphasis added). Typically, a user wears a tracked stereoscopic head-mounted display (HMD) or holds a smartphone, showing the real world through optical or video means, with superimposed graphics that provide the appearance of virtual content related to and registered with the real world. While AR has been around since the 1960s [72], it is experiencing a renaissance of development and consumer interest. With exciting products from Microsoft (HoloLens), Metavision (Meta 2), and others; Apple’s AR developer kit (ARKit); and well-funded startups like Magic Leap [54], the future looks even brighter: AR technologies can be expected to be absorbed into our daily lives and to strongly influence our society in the foreseeable future.
  2. This poster presents the use of Augmented Reality (AR) and Virtual Reality (VR) to tackle four of the “14 Grand Challenges for Engineering in the 21st Century” identified by the National Academy of Engineering. AR and VR are technologies of the present and the future. AR creates a composite view by adding digital content to a real-world view, often through a smartphone camera, while VR creates an immersive view in which the user’s view is cut off from the real world. The 14 challenges identify areas of science and technology that are achievable and sustainable to assist people and the planet to prosper. The four challenges tackled with AR/VR applications in this poster are: enhance virtual reality, advance personalized learning, provide access to clean water, and make solar energy affordable. The solar system VR application is aimed at two of the engineering challenges: (1) enhance virtual reality and (2) advance personalized learning. The application assists the user in visualizing and understanding our solar system through a VR headset. It includes an immersive 360-degree view of our solar system in which the user can use controllers to interact with information about celestial bodies and teleport to different points in space for a closer look at the planets and the Sun. The user has six degrees of freedom. The AR water application tackles the engineering challenge “provide access to clean water”. It shows information on drinking-water accessibility and the eco-friendly use of bottles over plastic cups within the department buildings at Auburn University. The user of the application has an augmented view of drinking-water information on a smartphone: every time the user points the smartphone camera at a building, the application renders a composite view with the drinking-water information associated with that building.
The sun path visualization AR application tackles the engineering challenge “make solar energy affordable”. The application helps the user visualize the sun path at a selected time and location; the path is augmented in the device’s camera view when the user points the camera at the sky. The application provides the sun’s altitude and azimuth, along with sunrise and sunset data for a selected day. This information can aid the user in effective solar panel placement. Using AR and VR technology to tackle these challenges enhances the user experience: the information from these applications is better curated and easily visualized, and thus readily understandable by the end user. Therefore, the use of AR and VR to tackle these types of engineering challenges looks promising.
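The sun altitude mentioned above comes from standard solar geometry. As a rough, hedged sketch — using a simple declination approximation that is adequate for visualization but not for precise ephemeris work, and a hypothetical function name — the altitude at a given latitude, day of year, and local solar hour could be computed like this:

```python
import math

def sun_altitude_deg(lat_deg: float, day_of_year: int, solar_hour: float) -> float:
    """Approximate solar altitude in degrees above the horizon.

    Uses a simple cosine approximation for solar declination;
    sufficient for a visualization app, not for survey-grade work.
    """
    # Approximate solar declination: peaks at +/-23.44 deg at the solstices.
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    # Hour angle: 0 at solar noon, 15 degrees per hour before/after.
    hour_angle = 15.0 * (solar_hour - 12.0)
    lat, d, h = map(math.radians, (lat_deg, decl, hour_angle))
    # Standard altitude formula from the spherical astronomy triangle.
    alt = math.asin(math.sin(lat) * math.sin(d) +
                    math.cos(lat) * math.cos(d) * math.cos(h))
    return math.degrees(alt)
```

At the equator around the March equinox this gives a near-zenith sun at solar noon, which is the kind of altitude/azimuth data the AR overlay would render against the sky.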
  3. Smart bracelets able to interpret the wearer's emotional state and communicate it to a remote decision-support facility will have broad applications in healthcare, elder care, the military, and other fields. While existing commercial embedded devices, such as the Apple Watch, have health-monitoring sensors, such devices cannot sufficiently support a real-time health-monitoring system with battery-efficient remote data delivery. Ongoing R&D is developing solutions capable of monitoring multiple psycho-physiological signals. Possible hardware configurations include wrist-worn devices and sensors across an augmented reality headset (e.g., HoloLens 2). The device should carry an array of sensors of psycho-physiological signals, including a galvanic skin response sensor, a motion sensor, a skin temperature sensor, and a heart rate sensor. Output from these sensors can be intelligently fused to monitor the wearer's affective state and to determine specific trigger events. To enable real-time remote monitoring applications, the device needs to be low-power, allowing persistent monitoring while prolonging usage before recharging. Many applications require specialized sensor arrays, e.g., a galvanic skin response sensor. An application-flexible device would allow adding and removing sensors and would provide a choice of communication modules (e.g., Bluetooth 5.0 Low Energy vs. ZigBee). Appropriate configurations of the device would support applications in military health monitoring, drug-addiction mitigation, autistic trigger monitoring, and augmented reality exploration.
A configuration example is: motion sensors (3-axis accelerometers, gyroscopes, and magnetometers to track steps, falls, and energy usage), a heart-rate sensor (e.g., an optical heart-rate sensor with a single monitoring zone using photoplethysmography (PPG)), at least Bluetooth 5.0 (though a different communication device may be needed depending on the use case), and flash memory to temporarily store data when the device is not remotely communicating. The wearables field has greatly advanced in sensor quality; the fusion of multi-sensor data is the current frontier.
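The "intelligent fusion" of galvanic skin response, heart rate, and skin temperature described above can be illustrated with a toy weighted-deviation score. This is only a sketch of the idea: the baselines, weights, and threshold below are invented placeholders (a deployed system would calibrate them per wearer and likely use a trained model rather than fixed weights):

```python
from dataclasses import dataclass

@dataclass
class SensorSample:
    heart_rate_bpm: float
    gsr_microsiemens: float       # galvanic skin response
    skin_temp_c: float

def stress_score(s: SensorSample,
                 baseline_hr: float = 70.0,
                 baseline_gsr: float = 2.0) -> float:
    """Fuse normalized deviations from baseline into one score.

    Weights and baselines are illustrative placeholders only.
    """
    hr_dev = max(0.0, (s.heart_rate_bpm - baseline_hr) / baseline_hr)
    gsr_dev = max(0.0, (s.gsr_microsiemens - baseline_gsr) / baseline_gsr)
    # Peripheral skin temperature tends to drop under acute stress.
    temp_dev = max(0.0, (34.0 - s.skin_temp_c) / 34.0)
    return 0.4 * hr_dev + 0.4 * gsr_dev + 0.2 * temp_dev

def trigger_event(s: SensorSample, threshold: float = 0.5) -> bool:
    # Fire a remote alert when the fused score crosses the threshold.
    return stress_score(s) >= threshold
```

On the device, a loop like this would run locally, buffering samples to flash and transmitting only trigger events over the low-energy radio to conserve battery.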
  4. Augmented reality (AR) technologies, such as Microsoft’s HoloLens head-mounted display and AR-enabled car windshields, are rapidly emerging. AR applications provide users with immersive virtual experiences by capturing input from a user’s surroundings and overlaying virtual output on the user’s perception of the real world. These applications enable users to interact with and perceive virtual content in fundamentally new ways. However, the immersive nature of AR applications raises serious security and privacy concerns. Prior work has focused primarily on input privacy risks stemming from applications with unrestricted access to sensor data, while the risks associated with malicious or buggy AR output remain largely unexplored. For example, an AR windshield application could intentionally or accidentally obscure oncoming vehicles or the safety-critical output of other AR applications. In this work, we address the fundamental challenge of securing AR output in the face of malicious or buggy applications. We design, prototype, and evaluate Arya, an AR platform that controls application output according to policies specified in a constrained yet expressive policy framework. In doing so, we identify and overcome numerous challenges in securing AR output.
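Arya's actual policy framework is defined in the paper; purely to illustrate the flavor of output policies, here is a hypothetical sketch in which the platform refuses to render any hologram that would overlap a safety-critical screen region (all names and the drop-on-violation behavior are assumptions for this example, not Arya's design):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def overlaps(self, other: "Rect") -> bool:
        # Axis-aligned rectangle intersection test.
        return not (self.x + self.w <= other.x or other.x + other.w <= self.x or
                    self.y + self.h <= other.y or other.y + other.h <= self.y)

@dataclass
class OutputPolicy:
    """Forbid drawing over designated safety-critical regions."""
    protected_regions: List[Rect]

    def allows(self, hologram: Rect) -> bool:
        return not any(hologram.overlaps(r) for r in self.protected_regions)

def enforce(policy: OutputPolicy, requested: List[Rect]) -> List[Rect]:
    # The platform, not the app, decides what reaches the display:
    # holograms violating the policy are dropped rather than rendered.
    return [h for h in requested if policy.allows(h)]
```

The key architectural point the paper makes is that this check sits in the trusted platform layer, so a malicious or buggy application cannot bypass it.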
  5.
    Mobile Augmented Reality (AR) provides immersive experiences by aligning virtual content (holograms) with a view of the real world. When a user places a hologram, it is usually expected that, like a real object, it remains in the same place. However, positional errors frequently occur due to inaccurate environment mapping and device localization, determined to a large extent by the properties of natural visual features in the scene. In this demonstration we present SceneIt, the first visual environment rating system for mobile AR based on predictions of hologram positional error magnitude. SceneIt allows users to determine whether virtual content placed in their environment will drift noticeably out of position, without requiring them to place that content. It shows that the severity of positional error for a given visual environment is predictable, and that this prediction can be calculated with sufficiently high accuracy and low latency to be useful in mobile AR applications.
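SceneIt's actual predictor is trained on measured drift; as a loose, hedged illustration of why "properties of natural visual features" predict stability, the toy heuristic below rates an environment from nothing but the count and spatial spread of detected features (function name, thresholds, and the rating scale are all invented for this sketch):

```python
import math
from typing import List, Tuple

def rate_environment(features: List[Tuple[float, float]],
                     min_count: int = 50,
                     min_spread: float = 0.25) -> str:
    """Toy visual-environment rating for hologram stability.

    `features` are (x, y) positions of detected visual features in
    normalized image coordinates. Few features, or features bunched
    in one spot, give the tracker little to anchor to.
    """
    if len(features) < min_count:
        return "poor"
    xs = [f[0] for f in features]
    ys = [f[1] for f in features]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    # Root of summed per-axis variance: how spread out the features are.
    spread = math.sqrt(sum((x - mx) ** 2 for x in xs) / len(xs) +
                       sum((y - my) ** 2 for y in ys) / len(ys))
    return "good" if spread >= min_spread else "fair"
```

A real system would replace this heuristic with a model fit to observed positional error, but the interface — rate the scene before the user places anything — matches what the demonstration describes.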