Title: Studying collective animal behaviour with drones and computer vision
Abstract: Drones are increasingly popular for collecting behaviour data of group-living animals, offering inexpensive and minimally disruptive observation methods. Imagery collected by drones can be rapidly analysed using computer vision techniques to extract information, including behaviour classification, habitat analysis and identification of individual animals. While computer vision techniques can rapidly analyse drone-collected data, the success of these analyses often depends on careful mission planning that considers downstream computational requirements, a critical factor frequently overlooked in current studies.

We present a comprehensive summary of research in the growing AI-driven animal ecology (ADAE) field, which integrates data collection with automated computational analysis focused on aerial imagery for collective animal behaviour studies. We systematically analyse current methodologies, technical challenges and emerging solutions in this field, from drone mission planning to behavioural inference. We illustrate computer vision pipelines that infer behaviour from drone imagery and present the computer vision tasks used for each step. We map specific computational tasks to their ecological applications, providing a framework for future research design.

Our analysis reveals that AI-driven animal ecology studies of collective animal behaviour using drone imagery focus on detection and classification computer vision tasks. While convolutional neural networks (CNNs) remain dominant for detection and classification tasks, newer architectures such as transformer-based models and specialized video analysis networks (e.g. X3D, I3D, SlowFast) designed for temporal pattern recognition are gaining traction for pose estimation and behaviour inference. However, reported model accuracy varies widely by computer vision task, species, habitat and evaluation metric, complicating meaningful comparisons between studies.

Based on current trends, we conclude that semi-autonomous drone missions will be increasingly used to study collective animal behaviour. While manual drone operation remains prevalent, autonomous drone manoeuvres, powered by edge AI, can scale and standardise collective animal behaviour studies while reducing the risk of disturbance and improving data quality. We propose guidelines for AI-driven animal ecology drone studies adaptable to various computer vision tasks, species and habitats. This approach aims to collect high-quality behaviour data while minimising disruption to the ecosystem.
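To make the pipeline described above concrete, the sketch below shows only the detection stage that such pipelines typically start from: a pretrained object detector run on a single drone frame, keeping high-confidence boxes for downstream tracking or behaviour classification. It is a minimal illustration, not a method from the studies reviewed; the COCO-pretrained model, confidence threshold and input file name are assumptions.

```python
# Minimal sketch of the detection stage of a drone-imagery pipeline.
# Assumptions: a generic COCO-pretrained detector stands in for a
# wildlife-specific model; the threshold and file path are illustrative.
import torch
from torchvision.io import read_image
from torchvision.models.detection import (
    fasterrcnn_resnet50_fpn,
    FasterRCNN_ResNet50_FPN_Weights,
)

weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn(weights=weights).eval()
preprocess = weights.transforms()

frame = read_image("drone_frame.jpg")            # hypothetical input frame
with torch.no_grad():
    detections = model([preprocess(frame)])[0]

keep = detections["scores"] > 0.7                # confidence threshold (assumed)
boxes = detections["boxes"][keep]                # [x1, y1, x2, y2] per detection
labels = [weights.meta["categories"][int(i)] for i in detections["labels"][keep]]
print(list(zip(labels, boxes.tolist())))
```

The kept boxes would then feed a tracker, pose estimator or behaviour classifier, depending on the downstream task the mission was planned around.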
Award ID(s):
2112606
PAR ID:
10629972
Author(s) / Creator(s):
 ;  ;  ;  ;  ;  ;  ;  ;  ;  
Publisher / Repository:
Wiley-Blackwell
Date Published:
Journal Name:
Methods in Ecology and Evolution
Volume:
16
Issue:
10
ISSN:
2041-210X
Format(s):
Medium: X
Size(s):
p. 2229-2259
Sponsoring Org:
National Science Foundation
More Like this
1. Abstract: Drones have become invaluable tools for studying animal behaviour in the wild, enabling researchers to collect aerial video data of group-living animals. However, manually piloting drones to track animal groups consistently is challenging due to complex factors such as terrain, vegetation, group spread and movement patterns. The variability in manual piloting can result in unusable data for downstream behavioural analysis, making it difficult to collect standardized datasets for studying collective animal behaviour. To address these challenges, we present WildWing, a complete hardware and software open-source unmanned aerial system (UAS) for autonomously collecting behavioural video data of group-living animals. The system's main goal is to automate and standardize the collection of high-quality aerial footage suitable for computer vision-based behaviour analysis. We provide a novel navigation policy to autonomously track animal groups while maintaining optimal camera angles and distances for behavioural analysis, reducing the inconsistencies inherent in manual piloting. The complete WildWing system costs only $650 and incorporates drone hardware with custom software that integrates ecological knowledge into autonomous navigation decisions. The system produces 4K resolution video at 30 fps while automatically maintaining appropriate distances and angles for behaviour analysis. We validate the system through field deployments tracking groups of Grevy's zebras, giraffes and Przewalski's horses at The Wilds conservation centre, demonstrating its ability to collect usable behavioural data consistently. By automating the data collection process, WildWing helps ensure consistent, high-quality video data suitable for computer vision analysis of animal behaviour. This standardization is crucial for developing robust automated behaviour recognition systems to help researchers study and monitor wildlife populations at scale. The open-source nature of WildWing makes autonomous behavioural data collection more accessible to researchers, enabling wider application of drone-based behavioural monitoring in conservation and ecological research.
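As a rough illustration of the kind of rule an autonomous group-following policy encodes (this is not WildWing's actual implementation, and every name, gain and threshold below is hypothetical), a simple proportional controller can keep the detected group's image centroid near the frame centre while holding an assumed standoff distance:

```python
# Illustrative proportional controller for group-following (not the
# actual WildWing navigation policy): centre the group's image centroid
# in the frame and hold an assumed target distance to the group.
from dataclasses import dataclass

@dataclass
class Command:
    yaw_rate: float   # rad/s, positive = turn right
    forward: float    # m/s, positive = move toward the group
    climb: float      # m/s, positive = gain altitude

def follow_group(centroid_px, frame_w, frame_h, distance_m,
                 target_distance_m=25.0,       # assumed standoff distance
                 k_yaw=0.002, k_fwd=0.3, k_alt=0.001):
    """Compute one velocity command from a single frame's observations."""
    err_x = centroid_px[0] - frame_w / 2       # horizontal pixel error
    err_y = centroid_px[1] - frame_h / 2       # vertical pixel error
    err_d = distance_m - target_distance_m     # range error (metres)
    return Command(
        yaw_rate=k_yaw * err_x,                # centre the group laterally
        forward=k_fwd * err_d,                 # close or open the gap
        climb=-k_alt * err_y,                  # keep the group vertically centred
    )

cmd = follow_group(centroid_px=(2100, 1000), frame_w=3840, frame_h=2160,
                   distance_m=32.0)
print(cmd)
```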
2. Abstract: Drones have emerged as a cost-effective solution to detect and map plant invasions, offering researchers and land managers flexibility in flight design, sensors and data collection schedules. A systematic review of trends in drone-based image collection, data processing and analytical approaches is needed to advance the science of invasive species monitoring and management and improve scalability and replicability. We systematically reviewed studies using drones for plant invasion research to identify knowledge gaps, best practices and a path toward advancing the science of invasive plant monitoring and management. We devised a database of 33 standardized reporting parameters, coded each study to those parameters, calculated descriptive statistics and synthesized how these technologies are being implemented and used. Trends show a general increase in studies since 2009 with a bias toward temperate regions in North America and Europe. Most studies have focused on testing the validity of a machine learning or deep learning image classification technique, with fewer studies focused on monitoring or modelling spread. Very few studies used drones for assessing ecosystem dynamics and impacts, such as determining environmental drivers or tracking re-emergence after disturbance. Overall, we noted a lack of standardized reporting on field survey design, flight design, drone systems, image processing and analyses, which hinders replicability and scalability of approaches. Based on these findings, we develop a standard framework for drone applications in invasive species monitoring to foster cross-study comparability and reproducibility. We suggest several areas for advancing the use of drones in invasive plant studies, including (1) utilizing standardized reporting frameworks to facilitate scientific research practices, (2) integrating drone data with satellite imagery to scale up relationships over larger areas, (3) using drones as an alternative to in-person ground surveys and (4) leveraging drones to assess community trait shifts tied to plant fitness and reproduction.
3. Abstract: Understanding animal movement often relies upon telemetry and biologging devices. These data are frequently used to estimate latent behavioural states to help understand why animals move across the landscape. While there are a variety of methods that make behavioural inferences from biotelemetry data, some features of these methods (e.g. analysis of a single data stream, use of parametric distributions) may limit their generality to reliably discriminate among behavioural states. To address some of the limitations of existing behavioural state estimation models, we introduce a nonparametric Bayesian framework called the mixed-membership method for movement (M4), which is available within the open-source bayesmove R package. This framework can analyse multiple data streams (e.g. step length, turning angle, acceleration) without relying on parametric distributions, which may capture complex behaviours more successfully than current methods. We tested our Bayesian framework using simulated trajectories and compared model performance against two segmentation methods (behavioural change point analysis (BCPA) and segclust2d), one machine learning method [expectation-maximization binary clustering (EMbC)] and one type of state-space model [hidden Markov model (HMM)]. We also illustrated this Bayesian framework using movements of juvenile snail kites (Rostrhamus sociabilis) in Florida, USA. The Bayesian framework estimated breakpoints more accurately than the other segmentation methods for tracks of different lengths. Likewise, the Bayesian framework provided more accurate estimates of behaviour than the other state estimation methods when simulations were generated from less frequently considered distributions (e.g. truncated normal, beta, uniform). Three behavioural states were estimated from snail kite movements, which were labelled as ‘encamped’, ‘area-restricted search’ and ‘transit’. Changes in these behaviours over time were associated with known dispersal events from the nest site, as well as movements to and from possible breeding locations. Our nonparametric Bayesian framework estimated behavioural states with comparable or superior accuracy compared to the other methods when step lengths and turning angles of simulations were generated from less frequently considered distributions. Since the most appropriate parametric distributions may not be obvious a priori, methods (such as M4) that are agnostic to the underlying distributions can provide powerful alternatives to address questions in movement ecology.
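For readers unfamiliar with the data streams mentioned above, the short sketch below derives step lengths and turning angles from consecutive relocations, the kind of inputs movement-state models such as M4 or HMMs ingest. The coordinates are hypothetical projected positions in metres; this is a generic worked example, not code from the bayesmove package.

```python
# Sketch: derive step lengths and turning angles from consecutive
# relocations of a tracked animal. Coordinates are hypothetical
# projected positions in metres.
import math

track = [(0.0, 0.0), (12.0, 5.0), (20.0, 18.0), (21.0, 19.0), (35.0, 22.0)]

steps, turns = [], []
for i in range(1, len(track)):
    dx = track[i][0] - track[i - 1][0]
    dy = track[i][1] - track[i - 1][1]
    steps.append(math.hypot(dx, dy))            # step length (m)
    if i >= 2:
        prev = math.atan2(track[i - 1][1] - track[i - 2][1],
                          track[i - 1][0] - track[i - 2][0])
        curr = math.atan2(dy, dx)
        # turning angle wrapped to [-pi, pi)
        turns.append((curr - prev + math.pi) % (2 * math.pi) - math.pi)

print(steps)   # one value per step
print(turns)   # one value per interior relocation
```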
4. Abstract: Understanding why animals (including humans) choose one thing over another is one of the key questions underlying the fields of behavioural ecology, behavioural economics and psychology. Most traditional studies of food choice in animals focus on simple, single-attribute decision tasks. However, animals in the wild are often faced with multi-attribute choice tasks where options in the choice set vary across multiple dimensions. Multi-attribute decision-making is particularly relevant for flower-visiting insects faced with deciding between flowers that may differ in reward attributes such as sugar concentration, nectar volume and pollen composition, as well as non-rewarding attributes such as colour, symmetry and odour. How do flower-visiting insects deal with complex multi-attribute decision tasks? Here we review and synthesise research on the decision strategies used by flower-visiting insects when making multi-attribute decisions. In particular, we review how different types of foraging frameworks (classic optimal foraging theory, nutritional ecology, heuristics) conceptualise multi-attribute choice, and we discuss how phenomena such as innate preferences, flower constancy and context dependence influence our understanding of flower choice. We find that multi-attribute decision-making is a complex process that can be influenced by innate preferences, flower constancy, the composition of the choice set and economic reward value. We argue that to understand and predict flower choice in flower-visiting insects, we need to move beyond simplified choice sets towards a view of multi-attribute choice which integrates the role of non-rewarding attributes and which includes flower constancy, innate preferences and context dependence. We further caution that behavioural experiments need to consider the possibility of context dependence in the design and interpretation of preference experiments. We conclude with a discussion of outstanding questions for future research. We also present a conceptual framework that incorporates the multiple dimensions of choice behaviour.
5. Abstract: Characterising the frequency and timing of biological processes such as locomotion, eclosion or foraging is often needed to get a complete picture of a species' ecology. Automated trackers are an invaluable tool for high-throughput collection of activity data and have become more accurate and efficient with advances in computer vision and deep learning. However, tracking the activity of small and fast-flying animals remains a hurdle, especially in a field setting with variable light conditions. Commercial activity monitors can be expensive, closed source and generally limited to laboratory settings. Here, we present a portable locomotion activity monitor (pLAM), a mobile activity detector to quantify small animal activity. Our setup uses inexpensive components, builds upon open-source motion tracking software, and is easy to assemble and use in the field. It runs off-grid, supports low-light tracking with infrared lights and can implement arbitrary light cycle colours and brightnesses with programmable LEDs. We provide a user-friendly guide to assembling pLAM hardware, accessing its pre-configured software, and guidelines for using it in other systems. We benchmarked pLAM for insects under various laboratory and field conditions, then compared results to a commercial activity detector. The two approaches offer broadly similar activity measures, but our setup captures flight and bouts of motion that are often missed by beam-breaking activity detection. pLAM can automate laboratory and field monitoring of activity and timing in a wide range of biological processes, including circadian rhythm, eclosion and diapause timing, pollination and flower foraging, or pest feeding activity. This low cost and easy setup allows high-throughput animal behaviour studies for basic and applied ecology and evolution research.
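To illustrate the general idea behind camera-based activity detection of this kind (this is not pLAM's own software, and the video path, blur size and thresholds are assumptions), a simple frame-differencing loop can flag frames in which enough pixels change between consecutive images:

```python
# Illustrative frame-differencing activity detector: count frames where
# the pixel change between consecutive frames exceeds a threshold.
# Video path, blur kernel and thresholds are assumptions for the sketch.
import cv2

cap = cv2.VideoCapture("activity_recording.mp4")   # hypothetical input video
ok, prev = cap.read()
prev = cv2.GaussianBlur(cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY), (5, 5), 0)

active_frames, total_frames = 0, 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (5, 5), 0)
    diff = cv2.absdiff(gray, prev)                  # per-pixel change
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    if cv2.countNonZero(mask) > 500:                # motion-pixel count threshold
        active_frames += 1
    total_frames += 1
    prev = gray

cap.release()
print(f"activity detected in {active_frames}/{total_frames} frames")
```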