Title: Construction Worker-Drone Safety Training in a 360 Virtual Reality Environment: A Pilot Study
Integrating drones into construction sites can introduce new risks to workers who already operate in hazardous environments. Consequently, several recent studies have investigated the safety challenges and solutions associated with this technology integration in construction. However, there is a knowledge gap in how to effectively communicate such safety challenges to construction professionals and students who may work alongside drones on job sites. In this study, a 360-degree virtual reality (VR) environment was created as a training platform to communicate the safety challenges of worker-drone interactions on construction job sites. This pilot study assesses the learning effectiveness and user experience of the developed 360 VR worker-drone safety training, which provides an immersive, device-agnostic learning experience. The results indicate that such 360 VR learning material can significantly increase users' safety knowledge while delivering an acceptable user experience on most of the assessment criteria. The outcomes of this study will serve as a valuable resource for improving future worker-drone safety training materials.
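As a concrete illustration of the learning-effectiveness assessment described above, the sketch below runs a paired t-test on hypothetical pre- and post-training safety-quiz scores. The scores, the quiz scale, and the use of scipy are illustrative assumptions, not details reported in the study.

```python
# Minimal sketch: paired pre/post comparison of safety-knowledge quiz scores.
# All numbers are hypothetical; the study's actual instrument and statistics may differ.
from scipy import stats

pre_scores  = [4, 5, 6, 5, 3, 7, 6, 5, 4, 6]   # before the 360 VR training
post_scores = [7, 8, 8, 7, 6, 9, 8, 7, 7, 8]   # after the 360 VR training

t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)
mean_gain = sum(b - a for a, b in zip(pre_scores, post_scores)) / len(pre_scores)

print(f"mean knowledge gain: {mean_gain:.2f} points")
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")
```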
Award ID(s):
2024656
NSF-PAR ID:
10345722
Author(s) / Creator(s):
; ; ;
Date Published:
Journal Name:
EPiC Series in Built Environment
Volume:
3
ISSN:
2632-881X
Page Range / eLocation ID:
19 - 28
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Unoccupied Aerial Vehicles (UAVs), or drones, with their high spatial resolution, temporal flexibility, and capacity for repeat photogrammetry, afford a significant advancement over other remote sensing approaches for coastal mapping, habitat monitoring, and environmental management. However, geographical drone mapping and in situ fieldwork often come with a steep learning curve requiring a background in drone operations, Geographic Information Systems (GIS), remote sensing, and related analytical techniques. Such a learning curve can be an obstacle to field implementation for researchers, community organizations, and citizen scientists wishing to incorporate introductory drone operations into their work. In this study, we develop a comprehensive drone training program for research partners and community members to use cost-effective, consumer-quality drones for introductory drone mapping of coastal seagrass monitoring sites along the west coast of North America. As a first step toward a longer-term Public Participation GIS process in the study area, the training program includes lessons for beginner drone users on flying drones, autonomous route planning and mapping (a minimal route-planning sketch appears after this list), field safety, GIS analysis, image correction and processing, and Federal Aviation Administration (FAA) certification and regulations. Training our research partners and students, most of whom are novice users, is the first step in a larger effort to increase participation in a broader seagrass monitoring project in our case study. While our training program originated in the United States, we discuss how our experiences can help research partners and communities around the globe become more confident in introductory drone operations for basic science. In particular, our work targets novice users without a strong background in geographic research or remote sensing. Such training provides technical guidance on implementing a drone mapping program for coastal research and synthesizes our approaches into broad guidance for using drones in support of a developing Public Participation GIS process.
  2. The dissemination of information is a basic element of group cohesion. In honey bees (Apis mellifera Linnaeus 1758), as in other social insects, the principal method for colony-wide information exchange is communication via pheromones. This medium of communication allows multiple individuals to conduct tasks critical to colony survival. Social signaling also establishes conflict at the level of the individual, who must trade off between attending to the immediate environment and attending to the social demand. In this study, we examined this conflict by presenting a social stress signal (isopentyl acetate, IPA) to highly social worker honey bees and to less social male drone honey bees during aversive training. We used IPA exposure methods that lowered workers' performance in appetitive learning. Exposure to IPA did not affect the performance of drones and had a dose-specific effect on worker response, with positive effects diminishing at higher IPA doses. The IPA effects are specific: a non-social cue, the odor cineole, improved learning performance in drones, whereas the social homing signal geraniol had no discernible effect on drone or worker performance. We conclude that social signals do generate conflict and that the response to them depends on the signal's relevance to the individual as well as on the context. We discuss the effect of social signals on learning in relation to both their social role and their potential evolutionary history.
  3. Engineering education aims to create a learning environment capable of developing vital engineering skill sets, preparing students to enter the workforce and succeed as future leaders. With rapid technological advancement, new engineering challenges continuously emerge, impeding the development of these skills. This insufficiency in skill development has been reflected in declining student GPAs and in industry reports of graduates' unsatisfactory performance. From a pedagogical perspective, this problem is closely tied to traditional learning methods, which, when adopted alone, are inadequate for engaging students and improving their learning experience. Accordingly, educators have incorporated new learning methodologies to address this problem and enhance the students' learning experience. However, many of the currently adopted teaching methods still fail to expose students to practical examples and are ineffective for engineering students, who tend to be active learners and prefer to use a variety of senses. To address this, our research team proposes integrating virtual reality (VR) technology into the laboratory work of engineering technology courses to improve students' learning experience and engagement. VR, an immersive medium, was adopted to develop an interactive teaching module on hydraulic gripper designs in a VR construction-like environment. The module aims to expose engineering technology students to real-life applications by providing a more visceral experience than screen-based media through fully computer-simulated environments in which everything is digitized. This work presents the development and implementation of the VR construction lab module and the corresponding gripper designs. The virtual gripper models are developed using the Oculus Virtual Reality (OVR) Metrics Tool for Unity, a SteamVR overlay utility created to make visualizing the desktop in a VR setting simple and intuitive. The execution of the module comprises building the VR environment, designing and importing the gripper models, and creating a user-interface VR environment for visualizing and interacting with the model (gripper assembly/mechanism testing). Beyond visualization, manipulation, and interaction, the developed VR system allows for additional features such as displaying technical information, guiding students through the assembly process, and other specialized options. Thus, the developed interactive VR module will serve as an adaptable platform that can be readily extended with future add-ons to address new educational opportunities.
  4. Underwater robots, including Remotely Operated Vehicles (ROVs) and Autonomous Underwater Vehicles (AUVs), are currently used to support underwater missions that are either impossible or too risky for manned systems. In recent years, academia and the robotics industry have paved the way for tackling the technical challenges of ROV/AUV operations. The level of intelligence of ROVs/AUVs has increased dramatically because of recent advances in low-power embedded computing devices and machine intelligence (e.g., AI). Nonetheless, minimizing human intervention in precise underwater operations remains extremely challenging because of the inherent uncertainties of underwater environments. Proximity operations, especially those requiring precise manipulation, are still carried out by ROV systems fully controlled by a human pilot. A workplace-ready and worker-friendly ROV interface that properly simplifies operator control and increases confidence in remote operation is the central challenge for the wide adoption of ROVs.

    This paper examines recent advances in virtual telepresence technologies as a solution for lowering the barriers to human-in-the-loop ROV teleoperation. Virtual telepresence refers to Virtual Reality (VR)-related technologies that help a user feel as if they were in a hazardous situation without being present at the actual location. We present a pilot system that uses a VR-based sensory simulator to convert ROV sensor data into human-perceivable sensations such as haptics (a minimal sensor-to-haptics sketch appears after this list). Building on a cloud server for real-time VR rendering, a less-trained operator could operate a remote ROV thousands of miles away without losing minimum situational awareness. The system is expected to enable intensive human engagement in ROV teleoperation, augmenting the operator's ability to maneuver and navigate the ROV in unknown and less-explored subsea regions and worksites. This paper also discusses the opportunities and challenges of this technology for ad hoc training, workforce preparation, and safety in the future maritime industry. We expect that lessons learned from our work can help democratize human presence in future subsea engineering work by accommodating human needs and limitations to lower the entry barrier.
  5. Drones are increasingly used during routine bridge inspections to improve data consistency, work efficiency, inspector safety, and cost effectiveness. Most drones, however, are operated manually within a visual line of sight and are thus unable to inspect long-span bridges that are not completely visible to operators. This paper envisions aerial nondestructive evaluation (aNDE) for elevated structures such as bridges, buildings, dams, nuclear power plants, and tunnels. To enable aerial nondestructive testing (aNDT), a human-robot system will be created to integrate haptic sensing and dexterous manipulation into a drone or a structural crawler in augmented/virtual reality (AR/VR) for beyond-visual-line-of-sight (BVLOS) inspection of bridges. Some of the technical challenges and potential solutions associated with aNDT&E will be presented. Example applications of the advanced technologies will be demonstrated on simulated bridge decks with stipulated conditions. The developed human-robot system can transform current on-site inspection into future tele-inspection, minimizing the impact on traffic passing over the bridges. The automated tele-inspection can save as much as 75% in time and 95% in cost.
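Related to the autonomous route planning and mapping lessons in item 1 above, the following is a minimal sketch of how a beginner-level, lawn-mower-style survey grid over a rectangular coastal site could be generated. The local coordinate frame, site dimensions, line spacing, and altitude are illustrative assumptions rather than values from that training program; real flight plans would normally be built in a dedicated mission-planning application.

```python
# Minimal sketch: generate lawn-mower (serpentine) survey waypoints over a
# rectangular site, in metres from one corner. All dimensions are hypothetical.
def survey_waypoints(width_m, height_m, line_spacing_m, altitude_m):
    waypoints = []
    y = 0.0
    left_to_right = True
    while y <= height_m:
        # Fly one full transect, alternating direction on each pass.
        xs = (0.0, width_m) if left_to_right else (width_m, 0.0)
        for x in xs:
            waypoints.append((x, y, altitude_m))
        left_to_right = not left_to_right
        y += line_spacing_m
    return waypoints

# Example: a 120 m x 80 m seagrass site, 20 m between flight lines, 40 m altitude.
for wp in survey_waypoints(120, 80, 20, 40):
    print(wp)
```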
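Related to the VR-based sensory simulator in item 4 above, the sketch below shows one simple way ROV sensor data might be converted into a haptic cue: mapping a range reading to a normalized vibration intensity that grows as an obstacle gets closer. The thresholds and the linear ramp are hypothetical design choices, not the pilot system's actual mapping.

```python
# Minimal sketch: map an ROV range reading (metres to the nearest obstacle)
# to a normalized haptic vibration intensity in [0, 1].
# The near/far thresholds and linear ramp are hypothetical design choices.
def haptic_intensity(distance_m, near_m=0.5, far_m=5.0):
    if distance_m <= near_m:   # very close: full-strength cue
        return 1.0
    if distance_m >= far_m:    # far away: no cue
        return 0.0
    # Ramp the cue up linearly as the obstacle approaches.
    return (far_m - distance_m) / (far_m - near_m)

# Example: a stream of sonar readings as the ROV closes in on a structure.
for reading in (6.0, 4.0, 2.5, 1.0, 0.4):
    print(f"{reading:4.1f} m -> intensity {haptic_intensity(reading):.2f}")
```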