

Title: Visual Tracking of Deepwater Animals Using Machine Learning-Controlled Robotic Underwater Vehicles
The ocean is a vast three-dimensional space that is poorly explored and understood, and harbors unobserved life and processes that are vital to ecosystem function. To fully interrogate this space, novel algorithms and robotic platforms are required to scale up observations. Locating animals of interest and sustaining extended visual observations in the water column are particularly challenging objectives. Towards that end, we present a novel Machine Learning-integrated Tracking (or ML-Tracking) algorithm for underwater vehicle control that builds on the class of algorithms known as tracking-by-detection. By coupling a multi-object detector (trained on in situ underwater image data), a 3D stereo tracker, and a supervisor module to oversee the mission, we show how ML-Tracking can create the robust tracks needed for long-duration observations, as well as enable fully automated acquisition of objects for targeted sampling. Using a remotely operated vehicle as a proxy for an autonomous underwater vehicle, we demonstrate continuous input from the ML-Tracking algorithm to the vehicle controller during a record, 5+ hr continuous observation of a midwater gelatinous animal known as a siphonophore. These efforts clearly demonstrate the potential of tracking-by-detection algorithms for exploration of unexplored environments and discovery of undiscovered life in our ocean.
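As a rough illustration of the tracking-by-detection control loop described above, the Python sketch below chains a stubbed detection, stereo triangulation under an assumed pinhole model, a confidence gate standing in for the supervisor module, and a proportional velocity command. All names, camera intrinsics, and gains here are hypothetical placeholders; this is not the authors' ML-Tracking implementation.

```python
# Sketch of a tracking-by-detection control loop in the spirit of ML-Tracking.
# All components (Detection, triangulate, control_step, the gains) are
# illustrative stand-ins, not the authors' code.
from dataclasses import dataclass

import numpy as np


@dataclass
class Detection:
    u: float          # image column (px), left camera
    v: float          # image row (px)
    disparity: float  # horizontal shift between cameras (px)
    score: float      # detector confidence


def triangulate(det: Detection, fx: float = 800.0, baseline_m: float = 0.1) -> np.ndarray:
    """Pinhole stereo: depth = fx * baseline / disparity (assumed intrinsics)."""
    z = fx * baseline_m / max(det.disparity, 1e-3)
    x = (det.u - 640.0) * z / fx   # principal point assumed at image center
    y = (det.v - 360.0) * z / fx
    return np.array([x, y, z])


def control_step(target_xyz: np.ndarray, standoff_m: float = 2.0, kp: float = 0.4) -> np.ndarray:
    """Proportional velocity command that keeps the animal centered at a standoff range."""
    desired = np.array([0.0, 0.0, standoff_m])
    return kp * (target_xyz - desired)


# One illustrative iteration with a synthetic high-confidence detection.
det = Detection(u=700.0, v=400.0, disparity=40.0, score=0.92)
if det.score > 0.5:                      # supervisor-style gate on detector confidence
    xyz = triangulate(det)
    cmd = control_step(xyz)              # forwarded to the vehicle controller
    print("target (m):", xyz, "velocity cmd (m/s):", cmd)
```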
Award ID(s):
1812535
PAR ID:
10217238
Author(s) / Creator(s):
Date Published:
Journal Name:
IEEE Winter Conference on Applications of Computer Vision
ISSN:
2472-6796
Page Range / eLocation ID:
860-869
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract

    In-situ visual observations of marine organisms are crucial to understanding their behaviour and its relation to the surrounding ecosystem. Typically, these observations are collected via divers, tags, and remotely-operated or human-piloted vehicles. Recently, however, autonomous underwater vehicles equipped with cameras and embedded computers with GPU capabilities have been developed for a variety of applications, and in particular can be used to supplement these existing data collection mechanisms where human operation or tagging is more difficult. Existing approaches have focused on fully-supervised tracking methods, but labelled data for many underwater species are severely lacking. Semi-supervised trackers may offer alternative tracking solutions because they require less data than their fully-supervised counterparts. However, because no realistic underwater tracking datasets exist, the performance of semi-supervised tracking algorithms in the marine domain is not well understood. To better evaluate their performance and utility, in this paper we provide (1) a novel dataset specific to marine animals, located at http://warp.whoi.edu/vmat/, (2) an evaluation of state-of-the-art semi-supervised algorithms in the context of underwater animal tracking, and (3) an evaluation of real-world performance through demonstrations using a semi-supervised algorithm on board an autonomous underwater vehicle to track marine animals in the wild.
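Semi-supervised trackers of the kind evaluated above are typically seeded with a single first-frame bounding box and then propagate it through the video without further labels. The sketch below illustrates that initialize-then-track interface with a naive normalized cross-correlation template matcher on synthetic frames; it is a generic illustration, not one of the benchmarked algorithms or the paper's dataset.

```python
# Minimal initialize-then-track sketch (assumed interface, not the paper's trackers).
import numpy as np


def init_template(frame: np.ndarray, box: tuple) -> np.ndarray:
    """Crop the first-frame bounding box (x, y, w, h) as the tracking template."""
    x, y, w, h = box
    return frame[y:y + h, x:x + w].astype(float)


def track(frame: np.ndarray, template: np.ndarray) -> tuple:
    """Slide the template over the frame and return the best-matching (x, y)."""
    h, w = template.shape
    t = (template - template.mean()) / (template.std() + 1e-6)
    best, best_xy = -np.inf, (0, 0)
    for y in range(frame.shape[0] - h + 1):
        for x in range(frame.shape[1] - w + 1):
            patch = frame[y:y + h, x:x + w].astype(float)
            p = (patch - patch.mean()) / (patch.std() + 1e-6)
            score = float((p * t).sum())
            if score > best:
                best, best_xy = score, (x, y)
    return best_xy


# Synthetic example: a bright blob drifts two pixels right between frames.
rng = np.random.default_rng(0)
frame0 = rng.normal(size=(60, 80)); frame0[20:30, 30:40] += 5.0
frame1 = rng.normal(size=(60, 80)); frame1[20:30, 32:42] += 5.0
tmpl = init_template(frame0, (30, 20, 10, 10))   # single first-frame label
print(track(frame1, tmpl))                        # expected near (32, 20)
```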

     
  2. Abstract

    Imaging underwater environments is of great importance to marine sciences, sustainability, climatology, defense, robotics, geology, space exploration, and food security. Despite advances in underwater imaging, most of the ocean and marine organisms remain unobserved and undiscovered. Existing methods for underwater imaging are unsuitable for scalable, long-term, in situ observations because they require tethering for power and communication. Here we describe underwater backscatter imaging, a method for scalable, real-time wireless imaging of underwater environments using fully-submerged battery-free cameras. The cameras power up from harvested acoustic energy, capture color images using ultra-low-power active illumination and a monochrome image sensor, and communicate wirelessly at net-zero-power via acoustic backscatter. We demonstrate wireless battery-free imaging of animals, plants, pollutants, and localization tags in enclosed and open-water environments. The method’s self-sustaining nature makes it desirable for massive, continuous, and long-term ocean deployments with many applications including marine life discovery, submarine surveillance, and underwater climate change monitoring.
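For a sense of why net-zero-power backscatter communication matters to this design, the back-of-envelope budget below compares assumed harvesting, imaging, and backscatter energy costs. Every number is an illustrative assumption, not a measurement from the paper.

```python
# Back-of-envelope energy budget for a battery-free backscatter camera.
# Every number below is an illustrative assumption, not a figure from the paper.
harvested_power_w = 100e-6       # assumed average acoustic power harvested (100 µW)
image_energy_j = 50e-3           # assumed energy per color image (LED flashes + sensor)
bits_per_image = 8 * 100_000     # assumed compressed image size (~100 kB)
backscatter_j_per_bit = 1e-9     # backscatter is near-passive, so ~nJ/bit assumed

energy_per_capture = image_energy_j + bits_per_image * backscatter_j_per_bit
seconds_between_images = energy_per_capture / harvested_power_w
print(f"~{seconds_between_images / 60:.1f} minutes of harvesting per image (assumed numbers)")
```

Under these assumed numbers the backscatter link contributes only a small fraction of the per-image energy, which is the point of replacing an active radio or tether with reflection-based communication.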

     
  3. Model-based approaches to navigation, control, and fault detection that utilize precise nonlinear models of vehicle plant dynamics will enable more accurate control and navigation, assured autonomy, and more complex missions for such vehicles. This paper reports novel theoretical and experimental results addressing the problem of parameter estimation of plant and actuator models for underactuated underwater vehicles operating in 6 degrees of freedom (DOF) whose dynamics are modeled by finite-dimensional Newton-Euler equations. This paper reports the first theoretical approach, and its experimental validation, to simultaneously identify plant-model parameters (such as mass, added mass, hydrodynamic drag, and buoyancy) and control-actuator parameters (control-surface models and thruster models) in 6-DOF. Most previously reported studies on parameter identification assume that the control-actuator parameters are known a priori. Moreover, this paper reports the first proof of convergence of the parameter estimates to the true set of parameters for this class of vehicles under a persistence-of-excitation condition. The reported adaptive identification (AID) algorithm does not require instrumentation of 6-DOF vehicle acceleration, which is required by conventional approaches to parameter estimation such as least squares. Additionally, the reported AID algorithm is applicable under any arbitrary open-loop or closed-loop control law. We report simulation and experimental results for identifying the plant-model and control-actuator parameters for an L3 OceanServer Iver3 autonomous underwater vehicle. We believe this general approach to AID could be extended to other classes of machines and other classes of marine, land, aerial, and space vehicles.
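To convey the flavor of adaptive identification without acceleration measurements, the sketch below applies a textbook series-parallel gradient identifier to a deliberately simplified 1-DOF surge model (mass plus quadratic drag, single thruster). The model, gains, and parameter values are assumptions for illustration; this is not the paper's 6-DOF Newton-Euler algorithm or the Iver3 identification.

```python
# 1-DOF illustration of adaptive parameter identification without acceleration
# measurements: a series-parallel (observer-based) gradient identifier.
# Model, gains, and "true" parameters are assumed for illustration only.
import numpy as np

dt, T = 0.01, 200.0
theta_true = np.array([0.5, -0.2])     # [1/mass, -drag/mass] (assumed values)
theta_hat = np.array([0.1, 0.0])       # initial parameter guesses
v, v_hat = 0.0, 0.0                    # measured and predicted surge velocity
k, gamma = 2.0, 5.0                    # observer gain and adaptation gain (assumed)

for i in range(int(T / dt)):
    t = i * dt
    u = np.sin(0.5 * t) + 0.5 * np.sin(1.7 * t)    # persistently exciting thrust
    phi = np.array([u, v * abs(v)])                # regressor: no acceleration needed
    e = v - v_hat                                  # state prediction error
    theta_hat = theta_hat + dt * gamma * phi * e   # gradient adaptation law
    v_hat += dt * (float(theta_hat @ phi) + k * e) # identifier (observer) dynamics
    v += dt * float(theta_true @ phi)              # "true" plant, forward Euler

# Under the persistently exciting input the estimates drift toward the true values.
print("estimated:", theta_hat, "true:", theta_true)
```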

     
  4. Abstract—Current state-of-the-art object tracking methods have largely benefited from the public availability of numerous benchmark datasets. However, the focus has been on open-air imagery and much less on underwater visual data. Inherent underwater distortions, such as color loss, poor contrast, and underexposure, caused by attenuation of light, refraction, and scattering, greatly affect the visual quality of underwater data, and as such, existing open-air trackers perform less efficiently on such data. To help bridge this gap, this article proposes a first comprehensive underwater object tracking (UOT100) benchmark dataset to facilitate the development of tracking algorithms well-suited for underwater environments. The proposed dataset consists of 104 underwater video sequences and more than 74,000 annotated frames derived from both natural and artificial underwater videos, with a great variety of distortions. We benchmark the performance of 20 state-of-the-art object tracking algorithms and further introduce a cascaded residual network-based underwater image enhancement model to improve the tracking accuracy and success rate of trackers. Our experimental results demonstrate the shortcomings of existing tracking algorithms on underwater data and how our generative adversarial network (GAN)-based enhancement model can be used to improve tracking performance. We also evaluate the visual quality of our model's output against existing GAN-based methods using well-accepted quality metrics and demonstrate that our model yields better visual data.
    Index Terms—Underwater benchmark dataset, underwater generative adversarial network (GAN), underwater image enhancement (UIE), underwater object tracking (UOT).
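Tracker "success rate" on benchmarks of this kind is conventionally computed from per-frame bounding-box overlap. The sketch below shows an OTB-style success metric (area under the IoU-versus-threshold curve) on synthetic boxes; it is a generic illustration, not the UOT100 evaluation code.

```python
# Sketch of an overlap-based success metric for tracker evaluation.
# The example boxes are synthetic, not drawn from UOT100.
import numpy as np


def iou(a, b):
    """Intersection over union of two (x, y, w, h) boxes."""
    ax2, ay2 = a[0] + a[2], a[1] + a[3]
    bx2, by2 = b[0] + b[2], b[1] + b[3]
    iw = max(0.0, min(ax2, bx2) - max(a[0], b[0]))
    ih = max(0.0, min(ay2, by2) - max(a[1], b[1]))
    inter = iw * ih
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union > 0 else 0.0


def success_auc(pred_boxes, gt_boxes, thresholds=np.linspace(0, 1, 21)):
    """Area under the success curve: mean fraction of frames with IoU > t."""
    ious = np.array([iou(p, g) for p, g in zip(pred_boxes, gt_boxes)])
    return float(np.mean([(ious > t).mean() for t in thresholds]))


# Synthetic 3-frame example: the last prediction has drifted off the target.
gt = [(10, 10, 20, 20), (12, 10, 20, 20), (14, 11, 20, 20)]
pred = [(11, 10, 20, 20), (15, 12, 20, 20), (30, 30, 20, 20)]
print("success AUC:", success_auc(pred, gt))
```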
  5. The deep chlorophyll maximum (DCM) layer is an ecologically important feature of the open ocean. The DCM cannot be observed using aerial or satellite remote sensing; thus, in situ observations are essential. Further, understanding the responses of microbes to the environmental processes driving their metabolism and interactions requires observing in a reference frame that moves with a plankton population drifting in ocean currents, i.e., Lagrangian. Here, we report the development and application of a system of coordinated robots for studying planktonic biological communities drifting within the ocean. The presented Lagrangian system uses three coordinated autonomous robotic platforms. The focal platform consists of an autonomous underwater vehicle (AUV) fitted with a robotic water sampler. This platform localizes and drifts within a DCM community, periodically acquiring samples while continuously monitoring the local environment. The second platform is an AUV equipped with environmental sensing and acoustic tracking capabilities. This platform characterizes environmental conditions by tracking the focal platform and vertically profiling in its vicinity. The third platform is an autonomous surface vehicle equipped with satellite communications and subsea acoustic tracking capabilities. While also acoustically tracking the focal platform, this vehicle serves as a communication relay that connects the subsea robot to human operators, thereby providing situational awareness and enabling intervention if needed. Deployed in the North Pacific Ocean within the core of a cyclonic eddy, this coordinated system autonomously captured fundamental characteristics of the in situ DCM microbial community in a manner not possible previously.
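One common way a tracking vehicle can recover the focal platform's position from acoustic ranges alone is to linearize pairs of range equations into a least-squares problem, as sketched below with assumed geometry and noise. This is an illustration of generic range-only localization, not the deployed system's acoustic tracking method.

```python
# Sketch of range-only acoustic localization of a focal platform from a few
# known tracker positions (e.g., successive waypoints of a moving vehicle).
# Geometry and noise values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
focal = np.array([40.0, -25.0])                         # unknown drifting target (m)
trackers = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
ranges = np.linalg.norm(trackers - focal, axis=1) + rng.normal(0, 0.5, 4)  # noisy ranges

# r_i^2 - r_0^2 = |p_i|^2 - |p_0|^2 - 2 (p_i - p_0) . x   =>   A x = b
p0, r0 = trackers[0], ranges[0]
A = 2.0 * (trackers[1:] - p0)
b = (np.sum(trackers[1:] ** 2, axis=1) - np.sum(p0 ** 2)) - (ranges[1:] ** 2 - r0 ** 2)
estimate, *_ = np.linalg.lstsq(A, b, rcond=None)
print("estimated focal position (m):", estimate, "true:", focal)
```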

     