Digital technology has evolved towards a new way of processing information: web searches, social platforms, internet forums, and video games have largely replaced reading books and writing essays. Trainers and educators currently face the challenge of providing natural training and learning environments for digital natives. In addition to physical spaces, effective training and education require virtual spaces. Digital twins enable trainees to interact with real hardware in virtual training environments, and interactive real-world elements are essential in the training of robot operators. A natural environment supports an engaging learning experience for the trainee while retaining sufficient professional substance. We present examples of how virtual environments utilizing digital twins and extended reality can be applied to enable natural and effective robotic training scenarios. The scenarios are validated using cross-platform client devices for extended reality implementations and safety training applications.
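As a loose illustration of the digital-twin idea described above (not code from the paper), a twin can mirror a real robot's reported joint state into a virtual scene that the trainee interacts with. Everything here — the `RobotTwin` class, the joint count, the pose dictionary — is a hypothetical sketch:

```python
from dataclasses import dataclass, field

@dataclass
class JointState:
    # Joint angles (radians) as reported by the real robot controller.
    angles: list

@dataclass
class RobotTwin:
    """Hypothetical digital twin: mirrors hardware state into a virtual scene."""
    state: JointState = field(default_factory=lambda: JointState(angles=[0.0] * 6))

    def sync_from_hardware(self, reported_angles):
        # In a real system these values would arrive over robot middleware;
        # here we simply copy the reported angles into the twin.
        self.state.angles = list(reported_angles)

    def render_pose(self):
        # Stand-in for pushing the mirrored pose to an XR renderer.
        return {"joint_%d" % i: a for i, a in enumerate(self.state.angles)}

twin = RobotTwin()
twin.sync_from_hardware([0.1, -0.5, 1.2, 0.0, 0.3, -0.1])
pose = twin.render_pose()
```

In a deployed trainer the sync step would run continuously, so the virtual robot the trainee sees always reflects the real hardware.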
This content will become publicly available on December 30, 2025
Fostering Computational Thinking Through Virtual Reality to Enhance Human-Robot Collaboration: A Technological-Pedagogical Framework
Virtual reality, a well-established educational technology, offers unique affordances such as immersion, interactivity, visualization, and co-presence, with significant potential to enhance learning experiences and outcomes. Computational thinking, a vital skill in science, technology, engineering, and mathematics, is essential for effective human-robot collaboration, enabling efficient problem-solving and decision-making in future work environments. Virtual reality provides a cost-effective, safe alternative to physical interaction with robots, reducing equipment risks and addressing the limitations of physical training. This study examines how virtual reality's affordances support computational thinking development, presenting a forward-looking training scenario and an assessment rubric for evaluation. The proposed framework and design strategies offer technological and pedagogical guidance for creating virtual reality environments that foster computational thinking in human-robot collaboration contexts.
- Award ID(s): 2222890
- PAR ID: 10576071
- Publisher / Repository: IGI Global
- Date Published:
- Journal Name: International Journal of Innovative Teaching and Learning in Higher Education
- Volume: 5
- Issue: 1
- ISSN: 2644-1624
- Page Range / eLocation ID: 1 to 17
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- The reality of COVID-19 public health concerns and the increasing demand for distance education have forced educators to move to online delivery of their courses. In construction education in particular, the majority of physical, location-based educational activities (e.g., labs, site visits, or field trips) have been canceled during the pandemic, resulting in reduced student engagement, learning motivation, and cognitive achievement. Virtual Social Spaces (VSS), with innovative interaction affordances and immersive experiences, are well positioned to supplement current online construction education. This paper discusses the potential of VSS for construction education, focusing on the common applications of VSS, its communication and collaboration affordances, and the design principles of this technology based on 15 popular VSS platforms. Overall, VSS applications are mainly found in education, entertainment, and socializing. The main communication and collaboration affordances of VSS include avatars, multi-user support, asynchronous commenting, synchronous chat, and visual sharing. These technical features illustrate the potential of VSS for improving the quality of online construction education, eliminating the challenges associated with the geographical dispersion of students, and addressing students' lack of engagement.
- Gonzalez, D. (Ed.) Today's research on human-robot teaming requires the ability to test artificial intelligence (AI) algorithms for perception and decision-making in complex real-world environments. Field experiments, also referred to as experiments "in the wild," do not provide the level of detailed ground truth necessary for thorough performance comparisons and validation. Experiments on pre-recorded real-world data sets are also significantly limited in their usefulness because they do not allow researchers to test the effectiveness of active robot perception, control, or decision strategies in the loop. Additionally, research on large human-robot teams requires tests and experiments that are too costly even for industry and may result in considerable time losses when experiments go awry. The novel Real-Time Human Autonomous Systems Collaborations (RealTHASC) facility at Cornell University interfaces real and virtual robots and humans with photorealistic simulated environments by implementing new concepts for the seamless integration of wearable sensors, motion capture, physics-based simulations, robot hardware, and virtual reality (VR). The result is an extended reality (XR) testbed in which real robots and humans in the laboratory can experience virtual worlds, inclusive of virtual agents, through real-time visual feedback and interaction. VR body tracking by DeepMotion is employed in conjunction with the OptiTrack motion capture system to transfer every human subject and robot in the real physical laboratory space into a synthetic virtual environment, constructing corresponding human/robot avatars that not only mimic the behaviors of the real agents but also experience the virtual world through virtual sensors and transmit the sensor data back to the real human/robot agent, all in real time. New cross-domain synthetic environments are created in RealTHASC using Unreal Engine™, bridging the simulation-to-reality gap and allowing for the inclusion of underwater, ground, and aerial autonomous vehicles, each equipped with a multi-modal sensor suite. The experimental capabilities offered by RealTHASC are demonstrated through three case studies showcasing mixed real/virtual human/robot interactions in diverse domains, leveraging and complementing the benefits of experimentation in simulation and in the real world.
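The real-to-virtual transfer loop that such a testbed performs — mapping tracked positions into the synthetic world and returning virtual sensor readings to the real agent — can be sketched in miniature. This is a hypothetical stand-in, not the actual RealTHASC pipeline: the fixed-offset calibration and the `virtual_range_sensor` function are illustrative assumptions:

```python
import math

def track_to_avatar(marker_xyz, lab_to_world_offset=(0.0, 0.0, 0.0)):
    """Map a motion-capture marker position (lab frame) into the synthetic
    world frame by a fixed translation — a stand-in for the calibration a
    real mocap-to-engine pipeline would compute."""
    return tuple(m + o for m, o in zip(marker_xyz, lab_to_world_offset))

def virtual_range_sensor(avatar_pos, obstacle_pos):
    """Hypothetical virtual sensor: Euclidean distance from the avatar to a
    simulated obstacle, which would be fed back to the real agent."""
    return math.dist(avatar_pos, obstacle_pos)

# A tracked human at (1, 0, 2) in the lab, with the lab origin placed
# 10 m along x in the synthetic world.
avatar = track_to_avatar((1.0, 0.0, 2.0), lab_to_world_offset=(10.0, 0.0, 0.0))
reading = virtual_range_sensor(avatar, (14.0, 4.0, 2.0))
```

A real system replaces the fixed offset with a full rigid-body calibration and streams many such readings per frame, but the round trip — real pose in, virtual percept out — has this shape.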
- Fluid power systems can be expensive and difficult to access, making it challenging to provide hands-on training. This work discusses the incorporation of Mixed Reality (MR) technology in fluid power applications to provide a virtual training environment that simulates the behavior of fluid power systems, allowing users to receive immediate feedback on system performance. Mixed reality is a digital technology that integrates a virtual environment with the real world by utilizing real-world sensor data and computer models. It allows running simulations that examine the complexity of highly coupled systems, producing new digital environments where physical and digital elements interact in real time. With these features, MR technology can be a practical training tool for running virtual simulations that mimic real-life industry settings, extending the user's physical surroundings with a virtual training environment and thus preparing the next generation of fluid power engineers and specialists. Throughout this work, we present the development and capabilities of a digitized virtual copy of a hydraulic excavator's arm in an MR environment as a proof of concept. The MR arm module is developed and deployed using Microsoft's Mixed Reality Toolkit (MRTK) for Unity through the HoloLens 2 MR headset. The MR development involves generating virtual copies of the mechanical and hydraulic subsystems, conducting the virtual assembly, and creating a user interface in the MR environment to visualize and interact with the model. The developed MR module enables visualizing the excavator's internal structure, conducting the virtual assembly, and running virtual simulations, all of which assist in training future fluid power operators. It is an effective training tool that helps train junior engineers and technicians while cutting down on cost and time.
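The immediate feedback such a virtual trainer gives rests on standard fluid power relations: a cylinder's extension force is pressure times piston area (F = pA) and its rod speed is flow rate over area (v = Q/A). The sketch below applies those textbook formulas; the bore, pressure, and flow values are illustrative and are not taken from the excavator module:

```python
import math

def cylinder_force(pressure_pa, bore_m):
    """Extension force of a hydraulic cylinder on the cap side:
    F = p * A, with A the piston face area."""
    area = math.pi * (bore_m / 2.0) ** 2
    return pressure_pa * area

def rod_velocity(flow_m3s, bore_m):
    """Extension speed of the rod: v = Q / A."""
    area = math.pi * (bore_m / 2.0) ** 2
    return flow_m3s / area

# Illustrative inputs: 10 MPa supply pressure and 1 L/s flow
# acting on a 100 mm bore piston.
force_n = cylinder_force(10e6, 0.1)      # ≈ 78.5 kN
speed_ms = rod_velocity(1e-3, 0.1)       # ≈ 0.127 m/s
```

In a training environment, recomputing these quantities as the user changes valve settings is what lets the simulated arm respond with believable forces and speeds.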
- Social robots hold the potential to be an effective and appropriate technology for reducing stress and improving the mental health of adolescents. To understand the effect of adolescent-to-robot disclosure on momentary stress, we conducted an exploratory, mixed-methods study with sixty-nine US adolescents (ages 14–21) in school settings. We compared a generic, minimalist robot interaction across three different robot embodiments: physical, digital computer screen, and immersive virtual reality. We found that participants' momentary stress levels significantly decreased across multiple interactions over time. The physical and virtual reality embodiments were most effective for stress reduction. In addition, our qualitative findings provide unique insights into the types of stressors adolescents shared with the social robots, as well as their experiences with the different interaction embodiments.
