Abstract Drones are increasingly popular for collecting behaviour data of group-living animals, offering inexpensive and minimally disruptive observation methods. Imagery collected by drones can be rapidly analysed using computer vision techniques to extract information, including behaviour classification, habitat analysis and identification of individual animals. While computer vision techniques can rapidly analyse drone-collected data, the success of these analyses often depends on careful mission planning that considers downstream computational requirements, a critical factor frequently overlooked in current studies. We present a comprehensive summary of research in the growing AI-driven animal ecology (ADAE) field, which integrates data collection with automated computational analysis focused on aerial imagery for collective animal behaviour studies. We systematically analyse current methodologies, technical challenges and emerging solutions in this field, from drone mission planning to behavioural inference. We illustrate computer vision pipelines that infer behaviour from drone imagery and present the computer vision tasks used for each step. We map specific computational tasks to their ecological applications, providing a framework for future research design. Our analysis reveals that AI-driven animal ecology studies of collective animal behaviour using drone imagery focus on detection and classification computer vision tasks. While convolutional neural networks (CNNs) remain dominant for detection and classification tasks, newer architectures such as transformer-based models and specialized video analysis networks (e.g. X3D, I3D, SlowFast) designed for temporal pattern recognition are gaining traction for pose estimation and behaviour inference.
However, reported model accuracy varies widely by computer vision task, species, habitat and evaluation metric, complicating meaningful comparisons between studies. Based on current trends, we conclude that semi-autonomous drone missions will be increasingly used to study collective animal behaviour. While manual drone operation remains prevalent, autonomous drone manoeuvres, powered by edge AI, can scale and standardise collective animal behaviour studies while reducing the risk of disturbance and improving data quality. We propose guidelines for AI-driven animal ecology drone studies adaptable to various computer vision tasks, species and habitats. This approach aims to collect high-quality behaviour data while minimising disruption to the ecosystem.
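The review above describes pipelines that move from raw drone frames to behavioural labels via detection, classification, pose estimation and behaviour inference. As a minimal sketch of how such stages compose, with stand-in functions in place of the actual neural networks (all names and data structures below are illustrative, not taken from the paper):

```python
from dataclasses import dataclass, field

@dataclass
class Detection:
    box: tuple                                   # (x, y, w, h) in pixels
    species: str = ""
    keypoints: list = field(default_factory=list)
    behaviour: str = ""

def detect(frame):
    # Stand-in for a CNN detector (e.g. a YOLO-style model):
    # one bounding box per animal found in the frame.
    return [Detection(box=b) for b in frame["boxes"]]

def classify(dets):
    # Stand-in for a species classifier applied to each crop.
    for d in dets:
        d.species = "zebra"
    return dets

def estimate_pose(dets):
    # Stand-in for a keypoint/pose-estimation network;
    # here just the box centre as a single dummy keypoint.
    for d in dets:
        x, y, w, h = d.box
        d.keypoints = [(x + w / 2, y + h / 2)]
    return dets

def infer_behaviour(dets):
    # Stand-in for a temporal model (e.g. X3D/SlowFast over clips).
    for d in dets:
        d.behaviour = "grazing"
    return dets

def pipeline(frame):
    # Chain the four stages named in the review.
    return infer_behaviour(estimate_pose(classify(detect(frame))))

frame = {"boxes": [(10, 20, 40, 30), (100, 80, 35, 28)]}
out = pipeline(frame)
```

Each stage enriches the same per-animal record, which is why mission planning (frame rate, altitude, resolution) constrains every downstream step at once.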
                    This content will become publicly available on March 10, 2026
                            
                            WildWing: An open‐source, autonomous and affordable UAS for animal behaviour video monitoring
                        
                    
    
Abstract Drones have become invaluable tools for studying animal behaviour in the wild, enabling researchers to collect aerial video data of group-living animals. However, manually piloting drones to track animal groups consistently is challenging due to complex factors such as terrain, vegetation, group spread and movement patterns. The variability in manual piloting can result in unusable data for downstream behavioural analysis, making it difficult to collect standardized datasets for studying collective animal behaviour. To address these challenges, we present WildWing, a complete hardware and software open-source unmanned aerial system (UAS) for autonomously collecting behavioural video data of group-living animals. The system's main goal is to automate and standardize the collection of high-quality aerial footage suitable for computer vision-based behaviour analysis. We provide a novel navigation policy to autonomously track animal groups while maintaining optimal camera angles and distances for behavioural analysis, reducing the inconsistencies inherent in manual piloting. The complete WildWing system costs only $650 and incorporates drone hardware with custom software that integrates ecological knowledge into autonomous navigation decisions. The system produces 4K resolution video at 30 fps while automatically maintaining appropriate distances and angles for behaviour analysis. We validate the system through field deployments tracking groups of Grevy's zebras, giraffes and Przewalski's horses at The Wilds conservation centre, demonstrating its ability to collect usable behavioural data consistently. By automating the data collection process, WildWing helps ensure consistent, high-quality video data suitable for computer vision analysis of animal behaviour. This standardization is crucial for developing robust automated behaviour recognition systems to help researchers study and monitor wildlife populations at scale.
The open‐source nature of WildWing makes autonomous behavioural data collection more accessible to researchers, enabling wider application of drone‐based behavioural monitoring in conservation and ecological research. 
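WildWing's actual navigation policy is defined in the paper and its open-source repository; as a hedged illustration of the general idea (keep the tracked group centred in the camera frame), a simple proportional controller might look like the following. The function name, gain value and velocity units are our own assumptions, not WildWing's implementation:

```python
def track_step(centroid_px, frame_px, gain=0.01):
    """One proportional-control update: command a lateral velocity that
    nudges the drone so the group's image centroid moves toward the
    frame centre. Illustrative only, not WildWing's actual policy."""
    cx, cy = frame_px[0] / 2, frame_px[1] / 2
    err_x = centroid_px[0] - cx  # pixels right of frame centre
    err_y = centroid_px[1] - cy  # pixels below frame centre
    # Velocity command proportional to the pixel error.
    return (gain * err_x, gain * err_y)

# Group centroid detected at (2200, 1300) in a 3840x2160 (4K) frame:
vx, vy = track_step((2200, 1300), (3840, 2160))
```

A real policy would also regulate altitude and standoff distance to hold the camera angles the abstract describes, but the closed loop from detection error to motion command has this shape.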
        
    
    
                            - PAR ID:
- 10588069
- Publisher / Repository:
- Methods in Ecology and Evolution
- Date Published:
- Journal Name:
- Methods in Ecology and Evolution
- ISSN:
- 2041-210X
- Subject(s) / Keyword(s):
- imageomics, UAV, drones
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
- 
Abstract Unoccupied aerial vehicles (UAVs; drones) offer mobile platforms for ecological investigation, but can be impractical in some environments, and the resulting noise can disturb wildlife. We developed a mobile alternative using a bird-borne platform to record the behaviour of other animals in the field. This unit consists of a lightweight audio and video sensor that is carried by a trained Harris's hawk Parabuteo unicinctus. We tested the hypothesis that our bird-borne platform is a viable option for collecting behavioural data from mobile animals. We recorded acoustic and video data as the hawk flew through a dense group of Brazilian free-tailed bats Tadarida brasiliensis emerging from a cave, with a test case of investigating how echolocation calls change depending on spatial position in the bat group. The HawkEar platform is an alternative for collecting behavioural data when a mobile platform that is less noisy and restrictive than traditional UAVs is needed. The design and software are open source and can be modified to accommodate additional sensor needs.
- 
Abstract Drones have emerged as a cost-effective solution to detect and map plant invasions, offering researchers and land managers flexibility in flight design, sensors and data collection schedules. A systematic review of trends in drone-based image collection, data processing and analytical approaches is needed to advance the science of invasive species monitoring and management and improve scalability and replicability. We systematically reviewed studies using drones for plant invasion research to identify knowledge gaps, best practices and a path toward advancing the science of invasive plant monitoring and management. We devised a database of 33 standardized reporting parameters, coded each study to those parameters, calculated descriptive statistics and synthesized how these technologies are being implemented and used. Trends show a general increase in studies since 2009, with a bias toward temperate regions in North America and Europe. Most studies have focused on testing the validity of a machine learning or deep learning image classification technique, with fewer studies focused on monitoring or modelling spread. Very few studies used drones for assessing ecosystem dynamics and impacts, such as determining environmental drivers or tracking re-emergence after disturbance. Overall, we noted a lack of standardized reporting on field survey design, flight design, drone systems, image processing and analyses, which hinders replicability and scalability of approaches. Based on these findings, we develop a standard framework for drone applications in invasive species monitoring to foster cross-study comparability and reproducibility. We suggest several areas for advancing the use of drones in invasive plant studies, including (1) utilizing standardized reporting frameworks to facilitate scientific research practices, (2) integrating drone data with satellite imagery to scale up relationships over larger areas, (3) using drones as an alternative to in-person ground surveys and (4) leveraging drones to assess community trait shifts tied to plant fitness and reproduction.
- 
Abstract Camera traps (CTs) are a valuable tool in ecological research, amassing large quantities of information on the behaviour of diverse wildlife communities. CTs are predominantly used as passive data loggers to gather observational data for correlational analyses. Integrating CTs into experimental studies, however, can enable rigorous testing of key hypotheses in animal behaviour and conservation biology that are otherwise difficult or impossible to evaluate. We developed the 'BoomBox', an open-source Arduino-compatible board that attaches to commercially available CTs to form an Automated Behavioural Response (ABR) system. The modular unit connects directly to the CT's passive infrared (PIR) motion sensor, playing audio files over external speakers when the sensor is triggered. This creates a remote playback system that captures animal responses to specific cues, combining the benefits of camera trapping (e.g. continuous monitoring in remote locations, lack of human observers, large data volume) with the power of experimental manipulations (e.g. controlled perturbations for strong mechanistic inference). Our system builds on previous ABR designs to provide a cheap (~100 USD) and customizable field tool. We provide a practical guide detailing how to build and operate the BoomBox ABR system, with suggestions for potential experimental designs that address a variety of questions in wildlife ecology. As proof-of-concept, we successfully field tested the BoomBox in two distinct field settings to study species interactions (predator–prey and predator–predator) and wildlife responses to conservation interventions. This new tool allows researchers to conduct a unique suite of manipulative experiments on free-living species in complex environments, enhancing the ability to identify mechanistic drivers of species' behaviours and interactions in natural systems.
- 
Abstract Collective motion, that is, the coordinated spatial and temporal organisation of individuals, is a core element in the study of collective animal behaviour. The self-organised properties of how a group moves influence its various behavioural and ecological processes, such as predator–prey dynamics, social foraging and migration. However, little is known about the inter- and intra-specific variation in collective motion. Despite the significant advancement in high-resolution tracking of multiple individuals within groups, providing collective motion data for animals in the laboratory and the field, a framework to perform quantitative comparisons across species and contexts is lacking. Here, we present the swaRmverse package. Building on two existing R packages, trackdf and swaRm, swaRmverse enables the identification and analysis of collective motion 'events', as presented in Papadopoulou et al. (2023), creating a unit of comparison across datasets. We describe the package's structure and showcase its functionality using existing datasets from several species and simulated trajectories from an agent-based model. From positional time-series data for multiple individuals (x-y-t-id), swaRmverse identifies events of collective motion based on the distribution of polarisation and group speed. For each event, a suite of validated, biologically meaningful metrics is calculated, and events are placed into a 'swarm space' through dimensionality reduction techniques. Our package provides the first automated pipeline enabling the analysis of data on collective behaviour. The package allows the calculation and use of complex metrics for users without a strong quantitative background and will promote communication and data-sharing across disciplines, standardising the quantification of collective motion across species and promoting comparative investigations.
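The event criterion described above rests on two per-timestep quantities, polarisation and group speed. A minimal Python sketch of both follows; swaRmverse itself is an R package and derives its event cut-offs from the distributions in the data, so the fixed thresholds below are purely illustrative:

```python
import math

def polarisation(headings):
    """Polarisation in [0, 1]: length of the mean unit heading vector.
    1 means all individuals point the same way; opposed headings
    cancel toward 0. Headings are in radians."""
    n = len(headings)
    mx = sum(math.cos(h) for h in headings) / n
    my = sum(math.sin(h) for h in headings) / n
    return math.hypot(mx, my)

def is_collective_event(headings, speeds, pol_thresh=0.7, speed_thresh=0.5):
    """Flag a timestep as part of a collective-motion 'event' when the
    group is both aligned and moving. Threshold values here are
    illustrative, not the package's data-driven cut-offs."""
    group_speed = sum(speeds) / len(speeds)
    return polarisation(headings) >= pol_thresh and group_speed >= speed_thresh

# Three nearly aligned, moving individuals at one timestep:
event = is_collective_event([0.0, 0.1, -0.1], [1.0, 1.2, 0.9])
```

Consecutive flagged timesteps form an event, over which the biologically meaningful metrics are then computed and embedded into the 'swarm space'.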