Augmented reality (AR) is a technology that integrates 3D virtual objects into the physical world in real time, while virtual reality (VR) is a technology that immerses users in an interactive 3D virtual environment. The rapid development of AR and VR technologies has reshaped how people interact with the physical world. This presentation will outline the results from two AR coastal engineering projects and one Web-based VR project, motivating the next stage in the development of the augmented reality package for coastal students, engineers, and planners.
Warehouse Augmented Reality Program (WARP): A Web Tool for Warehouse Design and Operation Education
In this paper, we introduce the Warehouse Augmented Reality Program (WARP), its functionality, practicality, and potential use cases in education. We build this application on the backbone of WebXR. Using this application programming interface (API), we create an interactive web tool that displays a life-sized warehouse in augmented reality (AR) in front of users and can be viewed on a smartphone or a tablet. AR is a technology that displays virtual objects in the real world on a digital device’s screen, allowing users to interact with virtual objects and locations while moving about a real-world environment. This tool can enhance warehousing education by making it more immersive and interactive. In addition, the tool can make warehousing operations more efficient and warehouse design less costly. We highlight how our tool can benefit both education and industry. We demonstrate how this tool can be integrated into a problem-based learning (PBL) assignment about warehouse layout design and order picking. The PBL activity involves comparing two different warehouse layouts (fishbone and traditional) by completing a set of order-picking tasks in AR warehouse environments. The task is to perform single-item picking over thirty orders and compare the average order-picking time per layout. We then use the results of these human-subject experiments to validate the realism of the warehouse layouts generated by the tool, comparing the empirical completion times with analytical results from the literature. We also administer a system usability scale (SUS) survey and collect feedback from industry experts.
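As a rough illustration of the WebXR entry point such a tool builds on, the sketch below shows how an "immersive-ar" session is typically requested and driven from the browser. This is a minimal sketch, not WARP's actual code; the rendering helper `drawWarehouse` is a hypothetical placeholder.

```typescript
// Minimal sketch of starting a WebXR "immersive-ar" session of the kind a
// WebXR-based tool like WARP would rely on. drawWarehouse is hypothetical.

async function startArSession(canvas: HTMLCanvasElement): Promise<void> {
  const xr = (navigator as any).xr;
  if (!xr || !(await xr.isSessionSupported('immersive-ar'))) {
    console.warn('Immersive AR is not supported on this device/browser.');
    return;
  }

  // Request an AR session; hit-test lets the app anchor the life-sized
  // warehouse model to a real surface the user points the phone or tablet at.
  const session = await xr.requestSession('immersive-ar', {
    requiredFeatures: ['hit-test'],
  });

  const gl = canvas.getContext('webgl', { xrCompatible: true })!;
  session.updateRenderState({
    baseLayer: new (window as any).XRWebGLLayer(session, gl),
  });
  const refSpace = await session.requestReferenceSpace('local');

  // Per-frame loop: read the viewer pose and draw the virtual warehouse.
  const onFrame = (_time: number, frame: any) => {
    const pose = frame.getViewerPose(refSpace);
    if (pose) {
      // drawWarehouse(gl, pose);  // hypothetical rendering helper
    }
    session.requestAnimationFrame(onFrame);
  };
  session.requestAnimationFrame(onFrame);
}
```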
- Award ID(s):
- 2000599
- PAR ID:
- 10528757
- Publisher / Repository:
- American Society for Engineering Education (ASEE)
- Date Published:
- Subject(s) / Keyword(s):
- Augmented reality; Distribution center; Web-based educational tool; Problem-based learning
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
Freehand gesture is an essential input modality for modern Augmented Reality (AR) user experiences. However, developing AR applications with customized hand interactions remains a challenge for end-users. Therefore, we propose GesturAR, an end-to-end authoring tool that enables users to create in-situ freehand AR applications through embodied demonstration and visual programming. During authoring, users can intuitively demonstrate customized gesture inputs while referring to the spatial and temporal context. Based on a taxonomy of gestures in AR, we propose a hand interaction model that maps gesture inputs to reactions of the AR content. Thus, users can author comprehensive freehand applications using trigger-action visual programming and instantly experience the results in AR. Further, we demonstrate multiple application scenarios enabled by GesturAR, such as interactive virtual objects, robots, and avatars; room-level interactive AR spaces; and embodied AR presentations. Finally, we evaluate the performance and usability of GesturAR through a user study.
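The trigger-action mapping described in that abstract can be pictured with a small sketch: a recognized hand gesture (trigger) bound to a reaction on an AR object. This is an illustrative reconstruction, not GesturAR's code; all type and function names are hypothetical.

```typescript
// Illustrative trigger-action mapping (not GesturAR's actual code): a
// recognized hand gesture (trigger) is bound to a reaction on an AR object.

interface ArObject { id: string; visible: boolean; }

type GestureTrigger = { gesture: string; target?: string }; // e.g. "grab" on "robot"
type ArReaction = (obj: ArObject) => void;                  // e.g. toggle, move, animate

class TriggerActionRuntime {
  private rules: Array<{ trigger: GestureTrigger; reaction: ArReaction }> = [];
  private objects = new Map<string, ArObject>();

  addObject(obj: ArObject): void { this.objects.set(obj.id, obj); }

  // Authoring step: bind a demonstrated gesture to a reaction on AR content.
  addRule(trigger: GestureTrigger, reaction: ArReaction): void {
    this.rules.push({ trigger, reaction });
  }

  // Runtime step: when the recognizer reports a gesture, fire matching rules.
  onGestureRecognized(gesture: string, targetId?: string): void {
    for (const { trigger, reaction } of this.rules) {
      const targetMatches = !trigger.target || trigger.target === targetId;
      const obj = targetId ? this.objects.get(targetId) : undefined;
      if (trigger.gesture === gesture && targetMatches && obj) reaction(obj);
    }
  }
}

// Usage: a "grab" gesture demonstrated on the virtual robot toggles its visibility.
const runtime = new TriggerActionRuntime();
runtime.addObject({ id: 'robot', visible: true });
runtime.addRule({ gesture: 'grab', target: 'robot' }, (o) => { o.visible = !o.visible; });
runtime.onGestureRecognized('grab', 'robot');
```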
Though virtual reality (VR) has reached a certain level of maturity in recent years, the general public, especially the population of the blind and visually impaired (BVI), still cannot enjoy the benefits provided by VR. Current VR accessibility applications have been developed either for expensive head-mounted displays or with extra accessories and mechanisms, which are either not accessible or inconvenient for BVI individuals. In this paper, we present a mobile VR app that enables BVI users to access a virtual environment on an iPhone in order to build their skills of perceiving and recognizing the virtual environment and the virtual objects in it. The app uses the iPhone on a selfie stick to simulate a long cane in VR and applies Augmented Reality (AR) techniques to track the iPhone’s real-time poses in an empty space of the real world, which are then synchronized to the long cane in the VR environment. Because of this use of mixed reality (the integration of VR and AR), we call it the Mixed Reality Cane (MR Cane); it provides BVI users with auditory and vibrotactile feedback whenever the virtual cane comes in contact with objects in VR. Thus, the MR Cane allows BVI individuals to interact with virtual objects and identify the approximate sizes and locations of objects in the virtual environment. We performed preliminary user studies with blindfolded participants to investigate the effectiveness of the proposed mobile approach, and the results indicate that the MR Cane could be effective in helping BVI individuals understand interaction with virtual objects and explore 3D virtual environments. The MR Cane concept can be extended to new applications in navigation, training, and entertainment for BVI individuals without significant additional effort.
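A minimal sketch of the loop that abstract describes, under simplified assumptions: the phone's AR-tracked pose is projected to a virtual cane tip, and contact with a virtual object triggers feedback. This is not the MR Cane source (the real app runs on iOS with ARKit-style tracking); the pose model, geometry, and names are illustrative.

```typescript
// Conceptual sketch (not the MR Cane implementation): the phone's tracked
// pose drives a virtual cane tip; contact with a virtual object triggers
// feedback. Pose model, geometry, and names are simplified assumptions.

interface Pose { position: [number, number, number]; headingY: number; } // heading in radians
interface VirtualObject { id: string; center: [number, number, number]; radius: number; }

// Project the cane tip forward from the phone pose along its heading.
function caneTip(pose: Pose, caneLength = 1.2): [number, number, number] {
  const [x, y, z] = pose.position;
  return [x + Math.sin(pose.headingY) * caneLength, y, z + Math.cos(pose.headingY) * caneLength];
}

// Return the first virtual object the cane tip is touching, if any.
function findContact(tip: [number, number, number], objects: VirtualObject[]): VirtualObject | null {
  for (const obj of objects) {
    const d = Math.hypot(tip[0] - obj.center[0], tip[1] - obj.center[1], tip[2] - obj.center[2]);
    if (d <= obj.radius) return obj;
  }
  return null;
}

// Called every time the AR tracker reports a new phone pose.
function onTrackedPose(pose: Pose, scene: VirtualObject[]): void {
  const hit = findContact(caneTip(pose), scene);
  if (hit) {
    console.log(`cane contact with ${hit.id}`); // stand-in for audio + vibration feedback
  }
}
```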
Augmented reality (AR) is emerging as the next ubiquitous wearable technology and is expected to significantly transform various industries in the near future. There has been tremendous investment in developing AR eyeglasses in recent years, including about $45 billion invested by Meta since 2021. Despite such efforts, existing displays are very bulky in form factor, and there has not yet been a socially acceptable eyeglasses-style AR display. Such wearable display eyeglasses promise to unlock enormous potential in diverse applications such as medicine, education, navigation, and many more; but until eyeglass-style AR glasses are realized, those possibilities remain only a dream. My research addresses this problem and makes progress “towards everyday-use augmented reality eyeglasses” through computational imaging, displays, and perception. My dissertation (Chakravarthula, 2021) made advances in three key and seemingly distinct areas: first, digital holography and advanced algorithms for compact, high-quality, true 3-D holographic displays; second, hardware and software for robust and comprehensive 3-D eye tracking via Purkinje images; and third, automatic focus-adjusting AR display eyeglasses for well-focused virtual and real imagery, toward potentially achieving 20/20 vision for users of all ages.
Augmented reality (AR) is an efficient way to deliver spatial information and has great potential for training workers. However, AR is still not widely used for such scenarios because of the technical skills and expertise required to create interactive AR content. We developed ProcessAR, an AR-based system for developing 2D/3D content that captures subject matter experts’ (SMEs’) environment-object interactions in situ. The design space for ProcessAR was identified from formative interviews with AR programming experts and SMEs, alongside a comparative design study with SMEs and novice users. To enable smooth workflows, ProcessAR locates and identifies different tools/objects in the workspace through computer vision when the author looks at them. We explored additional features such as embedding 2D videos with detected objects and user-adaptive triggers. A final user evaluation comparing ProcessAR with a baseline AR authoring environment showed that, according to our qualitative questionnaire, users preferred ProcessAR.
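The gaze-driven detection flow described above can be sketched roughly as follows: when the author dwells on a region, a detector labels the tool there, and any content authored for that label (such as an embedded 2D video) is surfaced. The detector, threshold, and names are stand-ins, not ProcessAR's implementation.

```typescript
// Rough sketch of a gaze-driven detection flow (not ProcessAR's code): a
// detector labels the tool under the author's gaze, and content attached to
// that label (e.g. an embedded 2D video) is returned for display.

interface Detection { label: string; confidence: number; }

// Any object detector can be plugged in here; it receives the image patch
// around the author's gaze point and returns a label, or null if unsure.
type Detector = (gazePatch: Uint8Array) => Promise<Detection | null>;

class AuthoredContent {
  private videosByLabel = new Map<string, string>(); // tool label -> video URL

  // Authoring: attach a 2D video to a detected tool/object label.
  attachVideo(label: string, url: string): void { this.videosByLabel.set(label, url); }

  // Runtime: on gaze dwell, detect the tool and return any attached video.
  async onGazeDwell(gazePatch: Uint8Array, detect: Detector): Promise<string | undefined> {
    const det = await detect(gazePatch);
    return det && det.confidence > 0.8 ? this.videosByLabel.get(det.label) : undefined;
  }
}
```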