Underwater robots, including Remotely Operated Vehicles (ROVs) and Autonomous Underwater Vehicles (AUVs), are currently used to support underwater missions that are either impossible or too risky for manned systems. In recent years, academia and the robotics industry have paved the way for tackling the technical challenges of ROV/AUV operations. The level of intelligence of ROVs/AUVs has increased dramatically thanks to recent advances in low-power embedded computing devices and machine intelligence (e.g., AI). Nonetheless, minimizing human intervention in precise underwater operations remains extremely challenging because of the inherent uncertainties of underwater environments. Proximity operations, especially those requiring precise manipulation, are still carried out by ROV systems that are fully controlled by a human pilot. A workplace-ready and worker-friendly ROV interface that properly simplifies operator control and increases remote-operation confidence is the central challenge for the wide adoption of ROVs. This paper examines recent advances in virtual telepresence technologies as a solution for lowering the barriers to human-in-the-loop ROV teleoperation. Virtual telepresence refers to Virtual Reality (VR) related technologies that help users feel as if they were in a hazardous situation without being present at the actual location. We present a pilot system that uses a VR-based sensory simulator to convert ROV sensor data into human-perceivable sensations (e.g., haptics). Building on a cloud server for real-time rendering in VR, a less-trained operator could operate a remote ROV thousands of miles away without losing essential situational awareness. The system is expected to enable intensive human engagement in ROV teleoperation, augmenting the operator's ability to maneuver and navigate an ROV in unknown and less-explored subsea regions.
This paper also discusses the opportunities and challenges of this technology for ad hoc training, workforce preparation, and safety in the future maritime industry. We expect that the lessons learned from our work can help democratize human presence in future subsea engineering work by accommodating human needs and limitations to lower the entry barrier.
Siphonophore specimens collected for the SiphWeb project from two ROVs during the R/V Western Flyer MBARI DEEPC cruises in 2019-2022
                        
                    
    
This dataset includes information about siphonophore specimens collected by ROV Doc Ricketts and ROV Ventana during deployments conducted from the MBARI ship R/V Western Flyer in 2019-2022. The data include the species or lowest possible classification, along with the date, time, depth, and temperature at which each organism was observed.
        
    
- Award ID(s): 1829805
- PAR ID: 10366081
- Publisher / Repository: Biological and Chemical Oceanography Data Management Office (BCO-DMO)
- Date Published:
- Edition / Version: 3
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
 - 
ROV operations are mainly performed via a traditional control kiosk and limited data-feedback methods, such as joysticks and camera-view displays on a surface vessel. This traditional setup requires significant personnel-on-board (POB) time and imposes high personnel-training requirements. This paper proposes a virtual reality (VR) based haptic-visual ROV teleoperation system that can substantially simplify ROV teleoperation and enhance the remote operator's situational awareness. The study leverages recent developments in Mixed Reality (MR) technologies, sensory augmentation, sensing technologies, and closed-loop control to visualize and render complex underwater environmental data in an intuitive and immersive way. Raw sensor data are processed with physics-engine systems and rendered as a high-fidelity digital twin model in game engines. Certain features are visualized and displayed via the VR headset, whereas others are manifested as haptic and tactile cues via the haptic feedback system. We applied a simulation approach to test the developed system: a high-fidelity subsea environment is reconstructed from sensor data collected by an ROV, including bathymetric, hydrodynamic, visual, and vehicle navigational measurements. Specifically, the vehicle is equipped with a navigation sensor system for real-time state estimation, an acoustic Doppler current profiler for far-field flow measurement, and a bio-inspired artificial lateral-line hydrodynamic sensor system for near-field, small-scale hydrodynamics. Optimized game-engine rendering algorithms then visualize key environmental features as augmented user-interface elements in a VR headset, such as color-coded vectors, to indicate the environmental impact on the performance and function of the ROV.
In addition, augmented environmental feedback, such as hydrodynamic forces, is translated into patterned haptic stimuli via a haptic suit to indicate drift-inducing flows in the near field. A pilot case study was performed to verify the feasibility and effectiveness of the system design in a series of simulated ROV operation tasks. ROVs are widely used in subsea exploration and intervention tasks, playing a critical role in offshore inspection, installation, and maintenance activities. The innovative ROV teleoperation feedback and control system will lower the barrier to ROV pilot jobs.
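The flow-to-haptics mapping described in this abstract can be illustrated with a minimal sketch. The function name, the actuator-belt layout, and the cosine-falloff mapping below are assumptions for illustration, not the paper's actual implementation:

```python
import numpy as np

def flow_to_haptic_pattern(flow_vectors, actuator_angles, max_speed=1.5):
    """Map near-field flow samples (m/s) to per-actuator vibration
    intensities in [0, 1] for a belt of haptic actuators.

    flow_vectors:    (N, 2) horizontal flow samples (u, v), e.g. from an
                     artificial lateral-line sensor (hypothetical format).
    actuator_angles: (M,) actuator bearings in radians (0 = vehicle bow).
    max_speed:       flow speed (m/s) that saturates an actuator.
    """
    mean_flow = flow_vectors.mean(axis=0)            # average drift-inducing flow
    speed = np.linalg.norm(mean_flow)
    heading = np.arctan2(mean_flow[1], mean_flow[0])
    # Cosine falloff: actuators facing the oncoming flow vibrate hardest,
    # actuators on the lee side stay silent.
    alignment = np.clip(np.cos(actuator_angles - heading), 0.0, None)
    return alignment * min(speed / max_speed, 1.0)
```

For example, a steady 0.75 m/s flow from the bow would drive the forward actuator at half intensity and leave the stern actuator off, giving the pilot a directional drift cue without any visual attention cost.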
 - 
Underwater ROVs (Remotely Operated Vehicles) are unmanned submersibles designed for exploring and operating in the depths of the ocean. Despite using high-end cameras, typical teleoperation engines based on first-person (egocentric) views limit a surface operator's ability to maneuver the ROV in complex deep-water missions. In this paper, we present an interactive teleoperation interface that enhances operational capability via increased situational awareness. This is accomplished by (i) offering on-demand third-person (exocentric) visuals synthesized from past egocentric views, and (ii) facilitating enhanced peripheral information with the ROV pose augmented in real time. We achieve this by integrating a 3D geometry-based Ego-to-Exo view synthesis algorithm into a monocular SLAM system for accurate trajectory estimation. The proposed closed-form solution uses only past egocentric views from the ROV and a SLAM backbone for pose estimation, which makes it portable to existing ROV platforms. Unlike data-driven solutions, it is invariant to applications and waterbody-specific scenes. We validate the geometric accuracy of the proposed framework through extensive experiments on 2-DOF indoor navigation and 6-DOF underwater cave exploration in challenging low-light conditions. A subjective evaluation with 15 human teleoperators further confirms the effectiveness of the integrated features for improved teleoperation. We demonstrate the benefits of dynamic Ego-to-Exo view generation and real-time pose rendering for remote ROV teleoperation by following navigation guides such as cavelines inside underwater caves. This new mode of interactive ROV teleoperation opens up promising opportunities for future research in subsea telerobotics.
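The real-time pose rendering step in this abstract can be sketched as a standard pinhole reprojection: given the SLAM-estimated world pose of a past keyframe, project the ROV's current position into that keyframe's image to place a third-person pose marker. The function name, matrix conventions, and intrinsics below are assumptions for illustration, not the paper's actual algorithm:

```python
import numpy as np

def project_rov_into_keyframe(p_world, T_kw, K):
    """Project the ROV's current world position into a past egocentric
    keyframe, giving the pixel where an exocentric pose marker is drawn.

    p_world: (3,) current ROV position in world coordinates (from SLAM).
    T_kw:    (4, 4) world-to-keyframe rigid transform (from SLAM).
    K:       (3, 3) camera intrinsic matrix.
    Returns (u, v) pixel coordinates, or None if the point lies behind
    the keyframe camera and cannot be rendered.
    """
    p_h = np.append(p_world, 1.0)           # homogeneous world point
    p_cam = T_kw @ p_h                      # point in keyframe camera coords
    if p_cam[2] <= 0:                       # behind the image plane
        return None
    uv = K @ (p_cam[:3] / p_cam[2])         # perspective projection
    return float(uv[0]), float(uv[1])
```

Repeating this projection every frame as the SLAM pose updates yields the "past egocentric view + live pose overlay" effect: the operator watches their own vehicle move through a scene the camera recorded moments earlier.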
 - 
Our perception of deep-sea communities has evolved as various sampling approaches have captured different components of deep-sea habitats. We sampled midwater zooplankton assemblages in Monterey Bay, California, to quantify community composition (abundance and biomass) and biodiversity (at the Order level) across three depth ranges, and the effects of sampling methodology on community parameters. We collected zooplankton using two types of opening-closing trawls [Tucker Trawl and Multiple Opening/Closing Net and Environmental Sensing System (MOCNESS)] and video recordings from a remotely operated vehicle (ROV). We quantified the relative contributions of microbes to community biomass using synoptic water-bottle casts and flow cytometry. Overall, the pelagic community was most similar between the Tucker trawl and ROV (dissimilarity = 52.4%) and least similar between the MOCNESS and ROV (dissimilarity = 65.8%). Dissimilarity between sampling methods was driven by the relative abundances of crustaceans and gelatinous taxa, where gelatinous animals (cnidarians, ctenophores, tunicates) were more abundant in ROV surveys (64.2%) and Tucker trawls (46.8%) compared to MOCNESS samples (14.5%). ROV surveys were the only method that sufficiently documented the most physically delicate taxa (e.g., physonect siphonophores, lobate ctenophores, and larvaceans). Biomass was also one order of magnitude lower in MOCNESS trawls compared to Tucker trawls. Due to these large differences, the relative contributions of microbes to total biomass were substantially lower in Tucker trawl samples (mean = 7.5%) compared to MOCNESS samples (mean = 51%). These results illustrate that our view of planktonic composition and community biomass is strongly dependent on sampling methodology.
 - 
In this report, we present the system design, operational strategy, and results of coordinated multivehicle field demonstrations of autonomous marine robotic technologies in search-for-life missions within the Pacific shelf margin of Costa Rica and the Santorini-Kolumbo caldera complex, which serve as analogs to environments that may exist in oceans beyond Earth. This report focuses on the automation of remotely operated vehicle (ROV) manipulator operations for targeted biological sample collection and return from the seafloor. In the context of future extraterrestrial exploration missions to ocean worlds, an ROV is an analog to a planetary lander, which must be capable of high-level autonomy. Our field trials involve two underwater vehicles, the SuBastian ROV and the Nereid Under Ice (NUI) hybrid ROV for mixed-initiative (i.e., teleoperated or autonomous) missions, both equipped with seven-degree-of-freedom hydraulic manipulators. We describe an adaptable, hardware-independent computer vision architecture that enables high-level automated manipulation. The vision system provides a three-dimensional understanding of the workspace to inform manipulator motion planning in complex unstructured environments. We demonstrate the effectiveness of the vision system and control framework through field trials in increasingly challenging environments, including the automated collection and return of biological samples from within the active undersea volcano Kolumbo. Based on our experiences in the field, we discuss the performance of our system and identify promising directions for future research.
 