

Title: Sensory manipulation as a countermeasure to robot teleoperation delays: system and evidence
Abstract

In the realm of robotics and automation, robot teleoperation, which enables human–machine interaction in distant or hazardous settings, has surged in significance. A persistent issue in this domain is the delay between command issuance and action execution, which degrades operator situational awareness and performance and increases cognitive load. These delays, particularly in long-distance operations, are difficult to mitigate even with the most advanced computing. Current solutions mainly revolve around machine-based adjustments to combat these delays; however, a notable gap remains in harnessing human perception for an enhanced subjective teleoperation experience. This paper introduces a novel approach of sensory manipulation for induced human adaptation in delayed teleoperation. Drawing from motor learning and rehabilitation principles, it is posited that strategic sensory manipulation, via altered sensory stimuli, can mitigate the subjective feeling of these delays. The focus is not on introducing new skills or adapting to novel conditions; rather, the approach leverages prior motor coordination experience in the context of delays, with the objective of reducing the need for extensive training or sophisticated automation designs. A human-centered experiment involving 41 participants examined the effects of modified haptic cues in teleoperation with delays. These cues were generated by high-fidelity physics engines using parameters from robot-end sensors or physics-engine simulations. The results showed several benefits, notably a considerable reduction in task time and improved user perception of visual delays. Real-time haptic feedback, termed the anchoring method, emerged as a significant contributor to these benefits, yielding reduced cognitive load, greater self-confidence, and less frustration. Beyond the prevalent methods of automation design and training, this research underscores induced human adaptation as a pivotal avenue in robot teleoperation, enhancing teleoperation efficacy through rapid human adaptation and offering insights beyond optimizing robotic systems for delay compensation.
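To make the anchoring idea concrete, the following is a minimal sketch, not the authors' implementation: the operator's haptic cue is rendered immediately from a local physics model of a virtual contact surface, while the visual view of the remote robot is shown through a buffer-induced delay. The spring-damper contact model, the parameter values, and all function names are illustrative assumptions.

```python
# Minimal sketch of the "anchoring" idea: the operator receives an immediate,
# locally rendered haptic cue while the visual view of the remote robot lags.
# The spring-damper contact model and all constants are illustrative
# assumptions, not the paper's implementation.
from collections import deque

K_CONTACT = 400.0   # virtual surface stiffness (N/m), assumed
B_CONTACT = 5.0     # virtual damping (N*s/m), assumed
SURFACE_Z = 0.0     # height of the virtual contact surface (m)

def local_haptic_force(tool_z: float, tool_vz: float) -> float:
    """Render a contact force from the local physics model (no delay)."""
    penetration = SURFACE_Z - tool_z
    if penetration <= 0.0:
        return 0.0
    return K_CONTACT * penetration - B_CONTACT * tool_vz

def teleop_step(visual_buffer: deque, remote_frame, tool_z: float,
                tool_vz: float, delay_steps: int):
    """One control tick: haptics come from the current local state, while the
    operator's display shows a frame roughly `delay_steps` ticks old."""
    visual_buffer.append(remote_frame)          # newest frame from the robot
    if len(visual_buffer) > delay_steps + 1:
        visual_buffer.popleft()                 # keep a fixed-length delay line
    delayed_frame = visual_buffer[0]            # what the operator actually sees
    haptic_force = local_haptic_force(tool_z, tool_vz)   # undelayed cue
    return delayed_frame, haptic_force
```

In this sketch the haptic channel is deliberately decoupled from the delayed visual channel, which is the property the anchoring method relies on.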

 
NSF-PAR ID: 10491907
Publisher / Repository: Nature Publishing Group
Journal Name: Scientific Reports
Volume: 14
Issue: 1
ISSN: 2045-2322
Sponsoring Org: National Science Foundation
More Like this
  1. With the advancement of automation and robotic technologies, teleoperation has been leveraged as a promising solution for protecting human workers in hazardous construction environments. Since human operators and construction sites are separated by distance, teleoperation requires a seamless human-machine interface, an intermediate medium, for communication between humans and machines on construction sites. Several types of teleoperation interfaces, including conventional joysticks, haptic devices, graphic user interfaces, auditory interfaces, and tactile interfaces, have been developed to control and command construction robots remotely. The ultimate goal of human-machine interfaces for remote operation is to provide intuitive sensory channels that convey and receive sufficient information while reducing the cognitive and physical load on human operators. Previously developed interfaces have pursued these goals, but each still has challenges that should be assessed to enhance future teleoperation applications in construction workplaces. This paper examines different human-machine interfaces for excavator teleoperation in terms of their on-site usability and intuitiveness. The capabilities of the interfaces for excavator teleoperation are evaluated based on their limitations and requirements. The outcome is expected to provide a better understanding of teleoperation interfaces for excavators and to guide future directions for addressing the underlying challenges.
  2. Objective: Haptic perception is an important component of bidirectional human-machine interactions that allow users to better interact with their environment. Artificial haptic sensation along an individual’s hand can be evoked via noninvasive electrical nerve stimulation; however, continuous stimulation can result in adaptation of sensory perception over time. In this study, we sought to quantify the adaptation profile via the change in perceived sensation intensity over time. Approach: Noninvasive stimulation of the peripheral nerve bundles evoked haptic perception using a 2x5 electrode grid placed along the medial side of the upper arm near the median and ulnar nerves. An electrode pair that evoked haptic sensation along the forearm and hand was selected. During a 110-s trial of continuous stimulation, a constant stimulus amplitude just below the motor threshold was delivered. Each subject was instructed to press on a force transducer, producing a force amplitude matched to the perceived intensity of the haptic sensation. Main Findings: A decay in matched force (i.e., perceived sensation intensity) was observed in all 7 subjects, with variations across subjects in both the rate and the onset of the decay. Significance: The preliminary findings establish the sensory adaptation profile of peripheral nerve stimulation. Accounting for these subject-specific adaptation profiles can allow for more stable communication between a robotic device and a user. Additionally, characterizing sensory adaptation can promote the development of new stimulation strategies that mitigate the observed adaptation, allowing for a better and more stable human-machine interaction experience.
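As a hedged illustration of how such an adaptation profile could be quantified, the sketch below fits an exponential decay to the matched-force trace of a single 110-s trial and reports the starting intensity, time constant, and plateau. The exponential model, function names, and initial-guess heuristic are assumptions for illustration, not necessarily the study's analysis.

```python
# Sketch: quantify a sensory-adaptation profile by fitting an exponential
# decay to the perceived-intensity (matched force) trace from one 110-s trial.
# The exponential form is an assumed model, not necessarily the study's method.
import numpy as np
from scipy.optimize import curve_fit

def adaptation_model(t, f0, tau, f_inf):
    """Perceived intensity decaying from f0 toward f_inf with time constant tau (s)."""
    return f_inf + (f0 - f_inf) * np.exp(-t / tau)

def fit_adaptation(t: np.ndarray, force: np.ndarray):
    """Return (f0, tau, f_inf) fitted to a matched-force time series."""
    p0 = (force[:5].mean(), 30.0, force[-5:].mean())  # rough initial guess
    params, _ = curve_fit(adaptation_model, t, force, p0=p0, maxfev=10_000)
    return params
```

Subject-specific parameters such as tau could then be used to pre-compensate stimulation amplitude over a session, in the spirit of the adaptation-aware strategies the abstract mentions.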
  3. Abstract

    ROV operations are mainly performed via a traditional control kiosk and limited data feedback methods, such as joysticks and camera-view displays mounted on a surface vessel. This traditional setup requires significant personnel-on-board (POB) time and imposes high demands on personnel training. This paper proposes a virtual reality (VR) based haptic-visual ROV teleoperation system that can substantially simplify ROV teleoperation and enhance the remote operator's situational awareness.

    This study leverages recent developments in Mixed Reality (MR) technologies, sensory augmentation, sensing technologies, and closed-loop control to visualize and render complex underwater environmental data in an intuitive and immersive way. The raw sensor data are processed with physics engine systems and rendered as a high-fidelity digital twin model in game engines. Certain features are visualized and displayed via the VR headset, whereas others are manifested as haptic and tactile cues via our haptic feedback systems. We applied a simulation approach to test the developed system.

    With our developed system, a high-fidelity subsea environment is reconstructed based on the sensor data collected from an ROV, including bathymetric, hydrodynamic, visual, and vehicle navigational measurements. Specifically, the vehicle is equipped with a navigation sensor system for real-time state estimation, an acoustic Doppler current profiler for far-field flow measurement, and a bio-inspired artificial lateral-line hydrodynamic sensor system for near-field small-scale hydrodynamics. Optimized game engine rendering algorithms then visualize key environmental features as augmented user-interface elements in a VR headset, such as color-coded vectors, to indicate the environmental impact on the performance and function of the ROV. In addition, augmented environmental feedback, such as hydrodynamic forces, is translated into patterned haptic stimuli via a haptic suit to indicate drift-inducing flows in the near field. A pilot case study was performed to verify the feasibility and effectiveness of the system design in a series of simulated ROV operation tasks.

    ROVs are widely used in subsea exploration and intervention tasks, playing a critical role in offshore inspection, installation, and maintenance activities. The proposed ROV teleoperation feedback and control system will lower the barrier to entry for ROV pilot jobs.
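As a rough illustration of how near-field flow might be translated into patterned haptic stimuli on a torso-worn suit, the sketch below maps a horizontal flow vector to per-actuator vibration intensities. The eight-actuator ring layout, the cosine falloff, and the scaling constant are illustrative assumptions, not the system's actual mapping.

```python
# Sketch: translate a near-field flow vector (e.g., from a lateral-line sensor
# system) into a patterned haptic cue on a torso-worn haptic suit.
# The 8-actuator ring layout and the intensity scaling are illustrative
# assumptions, not the developed system's actual mapping.
import math

N_ACTUATORS = 8    # actuators assumed to be arranged in a ring around the torso
MAX_FLOW = 1.5     # flow speed (m/s) mapped to full intensity, assumed

def flow_to_haptic_pattern(vx: float, vy: float) -> list[float]:
    """Return per-actuator intensities in [0, 1]; the actuator facing the
    oncoming flow vibrates hardest, indicating a drift-inducing current."""
    speed = math.hypot(vx, vy)
    heading = math.atan2(vy, vx)                      # direction of the flow
    gain = min(speed / MAX_FLOW, 1.0)
    pattern = []
    for i in range(N_ACTUATORS):
        actuator_angle = 2 * math.pi * i / N_ACTUATORS
        # cosine falloff: strongest on the actuator aligned with the flow
        alignment = max(0.0, math.cos(heading - actuator_angle))
        pattern.append(gain * alignment)
    return pattern
```

A directional pattern like this lets the pilot feel where drift is coming from without adding clutter to the VR display.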

     
  4. Underwater robots, including Remotely Operated Vehicles (ROVs) and Autonomous Underwater Vehicles (AUVs), are currently used to support underwater missions that are either impossible or too risky to be performed by manned systems. In recent years, academia and the robotics industry have paved paths toward tackling the technical challenges of ROV/AUV operations. The level of intelligence of ROVs/AUVs has increased dramatically because of recent advances in low-power embedded computing devices and machine intelligence (e.g., AI). Nonetheless, minimizing human intervention in precise underwater operation remains extremely challenging due to the inherent challenges and uncertainties of underwater environments. Proximity operations, especially those requiring precise manipulation, are still carried out by ROV systems that are fully controlled by a human pilot. A workplace-ready and worker-friendly ROV interface that properly simplifies operator control and increases remote-operation confidence is the central challenge for the wide adoption of ROVs.

    This paper examines recent advances in virtual telepresence technologies as a solution for lowering the barriers to human-in-the-loop ROV teleoperation. Virtual telepresence refers to Virtual Reality (VR) related technologies that help users feel present in a hazardous situation without being at the actual location. We present a pilot system that uses a VR-based sensory simulator to convert ROV sensor data into human-perceivable sensations (e.g., haptics). Building on a cloud server for real-time VR rendering, a less-trained operator could operate a remote ROV thousands of miles away without losing minimum situational awareness. The system is expected to enable intensive human engagement in ROV teleoperation, augmenting the operator's ability to maneuver and navigate an ROV in unknown and less-explored subsea regions and works. This paper also discusses the opportunities and challenges of this technology for ad hoc training, workforce preparation, and safety in the future maritime industry. We expect that lessons learned from our work can help democratize human presence in future subsea engineering work by accommodating human needs and limitations to lower the barrier to entry.

     
  5. Tele-nursing robots provide a safe approach to patient care in quarantine areas. For effective nurse-robot collaboration, ergonomic teleoperation and intuitive interfaces with low physical and cognitive workload must be developed. We propose a framework to evaluate control interfaces and thereby iteratively develop an intuitive, efficient, and ergonomic teleoperation interface. The framework is a hierarchical procedure that moves from general to specific assessment and ties each level to its role in design evolution. We first present pre-defined objective and subjective metrics used to evaluate three representative contemporary teleoperation interfaces. The results indicate that teleoperation via human motion mapping outperforms the gamepad and stylus interfaces. The trade-off of using motion mapping as a teleoperation interface is non-trivial physical fatigue. To understand the impact of heavy physical demand during motion-mapping teleoperation, we propose an objective assessment of physical workload in teleoperation using electromyography (EMG). We find that physical fatigue occurs in actions that involve precise manipulation and steady posture maintenance. We further implemented teleoperation assistance in the form of shared autonomy to eliminate the fatigue-causing components of robot teleoperation via motion mapping. The experimental results show that the autonomous features effectively reduce physical effort while improving the efficiency and accuracy of the teleoperation interface.
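As a hedged sketch of how an EMG-based physical-workload assessment might be computed, the code below derives two standard indicators per analysis window: RMS amplitude (effort level) and median frequency of the power spectrum, whose sustained decline is a common muscle-fatigue marker. The sampling rate, window length, and function names are illustrative assumptions, not the authors' pipeline.

```python
# Sketch: objective physical-workload indicators from one raw EMG channel,
# using windowed RMS amplitude and median frequency (a sustained drop in
# median frequency is a common muscle-fatigue marker). Sampling rate and
# window length are illustrative assumptions.
import numpy as np

FS = 1000.0          # EMG sampling rate in Hz, assumed
WIN = int(0.5 * FS)  # 0.5-s non-overlapping analysis windows, assumed

def window_metrics(emg: np.ndarray):
    """Return (rms, median_freq) arrays, one value per window of the signal."""
    rms_vals, mdf_vals = [], []
    for start in range(0, len(emg) - WIN + 1, WIN):
        seg = emg[start:start + WIN]
        seg = seg - np.mean(seg)                       # remove DC offset
        rms_vals.append(np.sqrt(np.mean(seg ** 2)))    # effort level
        spectrum = np.abs(np.fft.rfft(seg)) ** 2       # power spectrum
        freqs = np.fft.rfftfreq(len(seg), d=1.0 / FS)
        cumulative = np.cumsum(spectrum)
        # median frequency: where half of the total spectral power is reached
        mdf_vals.append(freqs[np.searchsorted(cumulative, cumulative[-1] / 2)])
    return np.array(rms_vals), np.array(mdf_vals)
```

Tracking these two series over a teleoperation session gives a simple, objective complement to subjective workload ratings.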