Abstract

Site occupancy models (SOMs) are a common tool for studying the spatial ecology of wildlife. When observational data are collected using passive monitoring field methods, including camera traps or autonomous recorders, detections of animals may be temporally autocorrelated, leading to biased estimates and incorrectly quantified uncertainty. We presently lack clear guidance for understanding and mitigating the consequences of temporal autocorrelation when estimating occupancy models with camera trap data.

We use simulations to explore when and how autocorrelation gives rise to biased or overconfident estimates of occupancy. We explore the impact of sampling design and biological conditions on model performance in the presence of autocorrelation, investigate the usefulness of several techniques for identifying and mitigating bias, and compare performance of the SOM to a model that explicitly estimates autocorrelation. We also conduct a case study using detections of 22 North American mammals.

We show that a join count goodness‐of‐fit test previously proposed for identifying clustered detections is effective for detecting autocorrelation across a range of conditions. We find that strong bias occurs in the estimated occupancy intercept when survey durations are short and detection rates are low. We provide a reference table for assessing the degree of bias to be expected under all conditions. We further find that discretizing data with larger windows decreases the magnitude of bias introduced by autocorrelation. In our case study, we find that detections of most species are autocorrelated and demonstrate how larger detection windows might mitigate the resulting bias.

Our findings suggest that autocorrelation is likely widespread in camera trap data and that many previous studies of occupancy based on camera trap data may have systematically underestimated occupancy probabilities.
Moving forward, we recommend that ecologists estimating occupancy from camera trap data use the join count goodness‐of‐fit test to determine whether autocorrelation is present in their data. If it is, SOMs should use large detection windows to mitigate bias and more accurately quantify uncertainty in occupancy model parameters. Ecologists should not insert gaps between detection periods: gaps are ineffective at mitigating temporal structure and discard useful data.
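The two recommendations above can be illustrated numerically. The sketch below is a simplified stand-in, not the authors' implementation: a permutation version of a join count statistic (counting adjacent 1-1 pairs in a binary detection history, compared against random shuffles of the same history) and a helper that collapses daily detections into larger detection windows.

```python
import random

def join_count(history):
    """Number of adjacent 1-1 pairs in a binary detection history."""
    return sum(a * b for a, b in zip(history, history[1:]))

def join_count_pvalue(history, n_perm=10_000, seed=1):
    """Simplified permutation test for clustering: temporally
    autocorrelated detections produce more 1-1 joins than the same
    number of detections scattered at random."""
    rng = random.Random(seed)
    observed = join_count(history)
    h = list(history)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(h)
        hits += join_count(h) >= observed
    return hits / n_perm

def collapse_windows(history, window):
    """Collapse daily detections into larger detection windows:
    a window is 1 if any day inside it recorded a detection
    (a trailing partial window is dropped)."""
    n = len(history) // window * window
    return [max(history[i:i + window]) for i in range(0, n, window)]

# Strongly clustered daily detections at one hypothetical camera site
daily = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 0, 0, 0, 0]
print(join_count(daily))            # 5 adjacent 1-1 pairs
print(join_count_pvalue(daily))     # small: clustering unlikely by chance
print(collapse_windows(daily, 7))   # [1, 0, 1] weekly detection history
```

With 7-day windows, the 21-day history above reduces to three window-level observations, which is the bias-mitigation step the abstract recommends.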
Estimating animal size or distance in camera trap images: Photogrammetry using the pinhole camera model
Abstract

As camera trapping has become a standard practice in wildlife ecology, developing techniques to extract additional information from images will increase the utility of generated data. Despite rapid advancements in camera trapping practices, methods for estimating animal size or distance from the camera using captured images have not been standardized. Deriving animal sizes directly from images creates opportunities to collect wildlife metrics such as growth rates or changes in body condition. Distances to animals may be used to quantify important aspects of sampling design such as the effective area sampled or distribution of animals in the camera's field‐of‐view.

We present a method of using pixel measurements in an image to estimate animal size or distance from the camera using a conceptual model in photogrammetry known as the ‘pinhole camera model’. We evaluated the performance of this approach both using stationary three‐dimensional animal targets and in a field setting using live captive reindeer (Rangifer tarandus) ranging in size and distance from the camera.

We found total mean relative error of estimated animal sizes or distances from the cameras in our simulation was −3.0% and 3.3% and in our field setting was −8.6% and 10.5%, respectively. In our simulation, mean relative error of size or distance estimates was not statistically different between image settings within camera models, between camera models or between the measured dimensions used in calculations.

We provide recommendations for applying the pinhole camera model in a wildlife camera trapping context. Our approach of using the pinhole camera model to estimate animal size or distance from the camera produced robust estimates using a single image while remaining easy to implement and generalizable to different camera trap models and installations, thus enhancing its utility for a variety of camera trap applications and expanding opportunities to use camera trap images in novel ways.
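The pinhole camera model reduces to a similar-triangles relation: real size / distance = pixel extent / focal length (expressed in pixels). A minimal sketch of that relation follows; the camera parameters and measurements are made-up values for illustration, not from the study.

```python
import math

def focal_length_px(image_width_px, horizontal_fov_deg):
    """Focal length expressed in pixels, from image width in pixels
    and the lens's horizontal field of view."""
    return image_width_px / (2 * math.tan(math.radians(horizontal_fov_deg) / 2))

def size_from_distance(distance_m, extent_px, f_px):
    """Real-world size (m) of a feature spanning extent_px pixels
    at a known distance from the camera."""
    return distance_m * extent_px / f_px

def distance_from_size(size_m, extent_px, f_px):
    """Distance (m) to a feature of known real-world size."""
    return size_m * f_px / extent_px

# Hypothetical camera: 1920 px wide image, 42 degree horizontal FOV
f_px = focal_length_px(1920, 42.0)
# An animal whose 0.9 m shoulder height spans 250 px in the image
d = distance_from_size(0.9, 250, f_px)   # roughly 9 m from the camera
s = size_from_distance(d, 250, f_px)     # round-trips to 0.9 m
```

The two estimators are inverses of each other, which is why a single pixel measurement can yield either size (given distance) or distance (given size), as the abstract describes.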
- Award ID(s): 1839192
- PAR ID: 10446023
- Publisher / Repository: Wiley-Blackwell
- Date Published:
- Journal Name: Methods in Ecology and Evolution
- Volume: 13
- Issue: 8
- ISSN: 2041-210X
- Page Range / eLocation ID: p. 1707-1718
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Abstract

Camera traps (CTs) are a valuable tool in ecological research, amassing large quantities of information on the behaviour of diverse wildlife communities. CTs are predominantly used as passive data loggers to gather observational data for correlational analyses. Integrating CTs into experimental studies, however, can enable rigorous testing of key hypotheses in animal behaviour and conservation biology that are otherwise difficult or impossible to evaluate.

We developed the 'BoomBox', an open‐source Arduino‐compatible board that attaches to commercially available CTs to form an Automated Behavioural Response (ABR) system. The modular unit connects directly to the CT’s passive infrared (PIR) motion sensor, playing audio files over external speakers when the sensor is triggered. This creates a remote playback system that captures animal responses to specific cues, combining the benefits of camera trapping (e.g. continuous monitoring in remote locations, lack of human observers, large data volume) with the power of experimental manipulations (e.g. controlled perturbations for strong mechanistic inference).

Our system builds on previous ABR designs to provide a cheap (~100 USD) and customizable field tool. We provide a practical guide detailing how to build and operate the BoomBox ABR system, with suggestions for potential experimental designs that address a variety of questions in wildlife ecology. As proof‐of‐concept, we successfully field tested the BoomBox in two distinct field settings to study species interactions (predator–prey and predator–predator) and wildlife responses to conservation interventions.

This new tool allows researchers to conduct a unique suite of manipulative experiments on free‐living species in complex environments, enhancing the ability to identify mechanistic drivers of species' behaviours and interactions in natural systems.
Abstract

Camera traps deployed in grids or stratified random designs are a well‐established survey tool for wildlife, but there has been little evaluation of study design parameters.

We used an empirical subsampling approach involving 2,225 camera deployments run at 41 study areas around the world to evaluate three aspects of camera trap study design (number of sites, duration and season of sampling) and their influence on the estimation of three ecological metrics (species richness, occupancy and detection rate) for mammals.

We found that 25–35 camera sites were needed for precise estimates of species richness, depending on the scale of the study. The precision of species‐level estimates of occupancy (ψ) was highly sensitive to occupancy level, with <20 camera sites needed for precise estimates of common (ψ > 0.75) species, but more than 150 camera sites likely needed for rare (ψ < 0.25) species. Species detection rates were more difficult to estimate precisely at the grid level due to spatial heterogeneity, presumably driven by unaccounted habitat variability within the study area. Running a camera at a site for 2 weeks was most efficient for detecting new species, but 3–4 weeks were needed for precise estimates of local detection rate, with no gains in precision observed after 1 month. Metrics for all mammal communities were sensitive to seasonality, with 37%–50% of the species at the sites we examined fluctuating significantly in their occupancy or detection rates over the year. This effect was more pronounced in temperate sites, where seasonally sensitive species varied in relative abundance by an average factor of 4–5, and some species were completely absent in one season due to hibernation or migration.

We recommend the following guidelines to efficiently obtain precise estimates of species richness, occupancy and detection rates with camera trap arrays: run each camera for 3–5 weeks across 40–60 sites per array. We recommend that comparisons of detection rates be model based and include local covariates to help account for small‐scale variation. Furthermore, comparisons across study areas or times must account for seasonality, which can have strong impacts on mammal communities in both tropical and temperate sites.
There is an urgent need to develop global observation networks to quantify biodiversity trends and to evaluate achievement of the targets of the Kunming-Montreal Global Biodiversity Framework. Camera traps are a commonly used tool with the potential to enhance global observation networks for monitoring wildlife population trends, and they can form such networks when a unified sampling protocol is applied. The Snapshot protocol is a simple, easily applied camera-trapping protocol used in North America and Europe; however, there is no regional camera-trap network using the Snapshot protocol in Asia. We present the first dataset from a collaborative camera-trap survey using the Snapshot protocol in Japan, conducted in 2023. We collected data at 90 locations across nine arrays for a total of 6162 trap-nights of survey effort. The total number of sequences with mammals and birds was 7967, including 20 mammal species and 23 avian species. Apart from humans, wild boar, sika deer and rodents were the most commonly observed taxa on the camera traps, accounting for 57.9% of all individual animals. We provide the dataset in the standard Wildlife Insights format and also in the Camtrap DP 1.0 format. Our dataset can serve as part of a global dataset for comparing relative abundances of wildlife and as a baseline for wildlife population trends in Japan. It can also be used for training machine-learning models for automatic species identification.
Abstract

Drones have become invaluable tools for studying animal behaviour in the wild, enabling researchers to collect aerial video data of group‐living animals. However, manually piloting drones to track animal groups consistently is challenging due to complex factors such as terrain, vegetation, group spread and movement patterns. The variability in manual piloting can result in unusable data for downstream behavioural analysis, making it difficult to collect standardized datasets for studying collective animal behaviour.

To address these challenges, we present WildWing, a complete hardware and software open‐source unmanned aerial system (UAS) for autonomously collecting behavioural video data of group‐living animals. The system's main goal is to automate and standardize the collection of high‐quality aerial footage suitable for computer vision‐based behaviour analysis. We provide a novel navigation policy to autonomously track animal groups while maintaining optimal camera angles and distances for behavioural analysis, reducing the inconsistencies inherent in manual piloting.

The complete WildWing system costs only $650 and incorporates drone hardware with custom software that integrates ecological knowledge into autonomous navigation decisions. The system produces 4K resolution video at 30 fps while automatically maintaining appropriate distances and angles for behaviour analysis. We validate the system through field deployments tracking groups of Grevy's zebras, giraffes and Przewalski's horses at The Wilds conservation centre, demonstrating its ability to collect usable behavioural data consistently.

By automating the data collection process, WildWing helps ensure consistent, high‐quality video data suitable for computer vision analysis of animal behaviour. This standardization is crucial for developing robust automated behaviour recognition systems to help researchers study and monitor wildlife populations at scale. The open‐source nature of WildWing makes autonomous behavioural data collection more accessible to researchers, enabling wider application of drone‐based behavioural monitoring in conservation and ecological research.
