Abstract
The challenges of monitoring wildlife often limit the scales and intensity of the data that can be collected. New technologies—such as remote sensing using unoccupied aircraft systems (UASs)—can collect information more quickly, over larger areas, and more frequently than is feasible using ground‐based methods. While airborne imaging is increasingly used to produce data on the location and counts of individuals, its ability to produce individual‐based demographic information is less explored. Repeated airborne imaging to generate an imagery time series provides the potential to track individuals over time and collect information beyond one‐off counts, but doing so necessitates automated approaches to handle the resulting high‐frequency, large‐spatial‐scale imagery. We developed an automated time‐series remote sensing approach to identifying wading bird nests in the Everglades ecosystem of Florida, USA, to explore the feasibility and challenges of conducting time‐series‐based remote sensing on mobile animals at large spatial scales. We combine a computer vision model for detecting birds in weekly UAS imagery of colonies with biology‐informed algorithmic rules to generate an automated approach that identifies likely nests. Comparing the performance of these automated approaches to human review of the same imagery shows that our primary approach identifies nests with performance comparable to human review, and that a secondary approach designed to find quick‐fail nests resulted in high false‐positive rates. We also assessed the ability of both human review and our primary algorithm to find ground‐verified nests in UAS imagery and again found comparable performance, with the exception of nests that fail quickly. Our results showed that automating nest detection, a key first step toward estimating nest success, is possible in complex environments like the Everglades, and we discuss a number of challenges and possible uses for these types of approaches.
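As a rough illustration of how biology‐informed rules can turn repeated detections into nest candidates, the sketch below flags locations where a detected bird recurs within a small radius across consecutive weekly surveys. The radius and persistence threshold are illustrative assumptions, not values from the study.

```python
import math

RADIUS_M = 0.5   # assumed spatial tolerance between weekly detections (illustrative)
MIN_WEEKS = 3    # assumed minimum run of consecutive weeks to call a nest (illustrative)

def likely_nests(weekly_detections):
    """weekly_detections: one list of (x, y) bird detections per weekly survey,
    in a shared projected coordinate system. Returns candidate nest locations."""
    candidates = []  # each: {"pt": (x, y), "last_week": int, "run": int}
    for week, detections in enumerate(weekly_detections):
        for pt in detections:
            for c in candidates:
                # extend a candidate only if this detection continues its weekly run
                if c["last_week"] == week - 1 and math.dist(c["pt"], pt) <= RADIUS_M:
                    c["last_week"] = week
                    c["run"] += 1
                    break
            else:
                candidates.append({"pt": pt, "last_week": week, "run": 1})
    return [c["pt"] for c in candidates if c["run"] >= MIN_WEEKS]

# Example: a bird seen near (10, 20) for three straight weeks yields one nest.
weeks = [[(10.0, 20.0)], [(10.2, 20.1)], [(10.1, 19.9)], []]
print(likely_nests(weeks))
```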
Development of a Peripheral-Central Vision System for Small UAS Tracking
With the rapid proliferation of small unmanned aircraft systems (UAS), the risk of mid-air collisions is growing, as is the risk associated with the malicious use of these systems. Airborne Detect-and-Avoid (ABDAA) and counter-UAS technologies have similar sensing requirements to detect and track airborne threats, albeit for different purposes: to avoid a collision or to neutralize a threat, respectively. These systems typically include a variety of sensors, such as electro-optical or infrared (EO/IR) cameras, RADAR, or LiDAR, and they fuse the data from these sensors to detect and track a given threat and to predict its trajectory. Camera imagery can be an effective method for detection as well as for pose estimation and threat classification, though a single camera cannot resolve range to a threat without additional information, such as knowledge of the threat geometry. To support ABDAA and counter-UAS applications, we consider a merger of two image-based sensing methods that mimic human vision: (1) a "peripheral vision" camera (i.e., with a fisheye lens) to provide a large field-of-view and (2) a "central vision" camera (i.e., with a perspective lens) to provide high resolution imagery of a specific target. Beyond the complementary ability of the two cameras to support detection and classification, the pair forms a heterogeneous stereo vision system that can support range resolution. This paper describes the initial development and testing of a peripheral-central vision system to detect, localize, and classify an airborne threat and finally to predict its path using knowledge of the threat class.
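To illustrate how a heterogeneous camera pair can resolve range, the sketch below triangulates a target from two bearing vectors and a known baseline using the standard two-ray least-squares construction. It assumes calibration and undistortion (fisheye and perspective models) are handled upstream; the geometry and values are illustrative, not the paper's implementation.

```python
import numpy as np

def triangulate(p1, d1, p2, d2):
    """p1, p2: camera positions; d1, d2: unit bearing vectors to the target,
    all expressed in one common frame. Returns the midpoint of the shortest
    segment between the two rays (classic two-view triangulation)."""
    A = np.stack([d1, -d2], axis=1)              # 3x2 system: [d1 -d2][t1 t2]^T = p2 - p1
    (t1, t2), *_ = np.linalg.lstsq(A, p2 - p1, rcond=None)
    return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))

# Hypothetical example: cameras 0.5 m apart, target roughly 20 m ahead.
p1, p2 = np.zeros(3), np.array([0.5, 0.0, 0.0])
target = np.array([0.2, 20.0, 1.0])
d1 = (target - p1) / np.linalg.norm(target - p1)
d2 = (target - p2) / np.linalg.norm(target - p2)
print(triangulate(p1, d1, p2, d2))               # recovers ~[0.2, 20.0, 1.0]
```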
- Award ID(s): 1650465
- PAR ID: 10086398
- Date Published:
- Journal Name: AIAA SciTech
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Wireless security cameras are integral components of security systems used by military installations, corporations, and, due to their increased affordability, many private homes. These cameras commonly employ motion sensors to identify that something is occurring in their fields of vision before starting to record and notifying the property owner of the activity. In this paper, we discover that the motion sensing action can disclose the location of the camera through a novel wireless camera localization technique we call MotionCompass. In short, a user who aims to avoid surveillance can find a hidden camera by creating motion stimuli and sniffing wireless traffic for a response to those stimuli. With the motion trajectories within the motion detection zone, the exact location of the camera can then be computed. We develop an Android app to implement MotionCompass. Our extensive experiments using the developed app and 18 popular wireless security cameras demonstrate that for cameras with one motion sensor, MotionCompass can attain a mean localization error of around 5 cm in less than 140 seconds. This localization technique builds upon existing work that detects the existence of hidden cameras, to pinpoint their exact location and area of surveillance.
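A minimal sketch of the core detection step as described: probe with motion at a known time and check the sniffed traffic attributed to a suspected camera for an upload burst shortly afterward. The response window and rate threshold below are assumptions for illustration, not values from the paper.

```python
RESPONSE_WINDOW_S = 2.0   # assumed maximum camera reaction latency (illustrative)
RATE_THRESHOLD = 50.0     # assumed packets/s indicating an active video upload

def camera_saw_motion(stimulus_time, packet_times):
    """packet_times: timestamps of sniffed frames attributed to the camera.
    True if its traffic surges within the response window after the stimulus."""
    burst = [t for t in packet_times
             if stimulus_time < t <= stimulus_time + RESPONSE_WINDOW_S]
    return len(burst) / RESPONSE_WINDOW_S > RATE_THRESHOLD

# Probing along different walking paths and noting where detection starts and
# stops traces out the motion-detection zone, whose geometry constrains the
# camera's position.
```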
This paper proposes a system architecture for tracking multiple ground-based objects using a team of unmanned air systems (UAS). In the architecture pipeline, video data is processed by each UAS to detect motion in the image frame. The ground-based location of the detected motion is estimated using a geolocation algorithm. The subsequent data points are then processed by the recently introduced Recursive RANSAC (R-RANSAC) algorithm to produce a set of tracks. These tracks are then communicated over the network, and the error between the vehicles' coordinate frames must be estimated. After the tracks have been placed in the same coordinate frame, a track-to-track association algorithm is used to determine which tracks in each camera correspond to tracks in other cameras. Associated tracks are then fused using a distributed information filter. The proposed method is demonstrated on data collected from two multi-rotors tracking a person walking on the ground.
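For a flavor of the fusion step, the sketch below combines two associated track estimates in information form, where independent information simply adds. It is a simplified stand-in for the paper's distributed information filter, which must additionally account for common information shared between the tracks.

```python
import numpy as np

def fuse_tracks(x1, P1, x2, P2):
    """Fuse two track estimates (mean x, covariance P) of the same object.
    Naive information-form fusion: assumes the estimates are independent,
    which a full distributed information filter does not."""
    Y1, Y2 = np.linalg.inv(P1), np.linalg.inv(P2)   # information matrices
    Y = Y1 + Y2                                     # information adds
    x = np.linalg.solve(Y, Y1 @ x1 + Y2 @ x2)       # fused mean
    return x, np.linalg.inv(Y)

# Association beforehand can gate on, e.g., Mahalanobis distance between track
# states once the inter-vehicle coordinate-frame error has been estimated.
x, P = fuse_tracks(np.array([1.0, 0.0]), np.eye(2),
                   np.array([1.2, 0.1]), 2 * np.eye(2))
print(x, P)
```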
Intelligent systems commonly employ vision sensors like cameras to analyze a scene. Recent work has proposed a wireless sensing technique, wireless vibrometry, to enrich the scene analysis generated by vision sensors. Wireless vibrometry employs wireless signals to sense subtle vibrations from the objects and infer their internal states. However, it is difficult for pure Radio-Frequency (RF) sensing systems to obtain objects' visual appearances (e.g., object types and locations), especially when an object is inactive. Thus, most existing wireless vibrometry systems assume that the number and the types of objects in the scene are known. The key to removing these assumptions is to build a connection between wireless sensor time series and vision sensor images. We present Capricorn, a vision-guided wireless vibrometry system. In Capricorn, the object type information from vision sensors guides the wireless vibrometry system to select the most appropriate signal processing pipeline. The object tracking capability in computer vision also helps wireless systems efficiently detect and separate vibrations from multiple objects in real time.
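The dispatch idea can be sketched as a simple mapping from the vision model's object class to a vibration processing pipeline. The class names and pipelines below are hypothetical placeholders, not Capricorn's actual components.

```python
def process_engine(vibration):
    """Hypothetical pipeline: e.g., estimate rotation rate from harmonics."""
    return {"class": "engine", "samples": len(vibration)}

def process_speaker(vibration):
    """Hypothetical pipeline: e.g., recover audio-band content."""
    return {"class": "speaker", "samples": len(vibration)}

PIPELINES = {"engine": process_engine, "speaker": process_speaker}

def handle(detection, vibration_signal):
    """detection: (object_class, bounding_box) from the vision model."""
    obj_class, _bbox = detection
    pipeline = PIPELINES.get(obj_class)
    if pipeline is None:
        return None  # unrecognized object: skip or fall back to a generic path
    return pipeline(vibration_signal)

print(handle(("engine", (10, 20, 64, 64)), [0.0] * 1024))
```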
Unoccupied aerial systems (UAS) are an established technique for collecting data on cold region phenomena at high spatial and temporal resolutions. While many studies have focused on remote sensing applications for monitoring long-term changes in cold regions, the role of UAS for detection, monitoring, and response to rapid changes and direct exposures resulting from abrupt hazards in cold regions is in its early days. This review discusses recent applications of UAS remote sensing platforms and sensors, with a focus on observation techniques rather than post-processing approaches, for abrupt cold region hazards including permafrost collapse and event-based thaw, flooding, snow avalanches, winter storms, erosion, and ice jams. The pilot efforts highlighted in this review demonstrate the potential capacity for UAS remote sensing to complement existing data acquisition techniques for cold region hazards. In many cases, UASs were used alongside other remote sensing techniques (e.g., satellite, airborne, terrestrial) and in situ sampling to supplement existing data or to collect additional types of data not included in existing datasets (e.g., thermal, meteorological). While the majority of UAS applications involved creation of digital elevation models or digital surface models using Structure-from-Motion (SfM) photogrammetry, this review describes other applications of UAS observations that help to assess risks, identify impacts, and enhance decision making. As the frequency and intensity of abrupt cold region hazards change, it will become increasingly important to document and understand these changes to support scientific advances and hazard management. The decreasing cost and increasing accessibility of UAS technologies will create more opportunities to leverage these techniques to address current research gaps. Overcoming challenges related to implementation of new technologies, modifying operational restrictions, bridging gaps between data types and resolutions, and creating data tailored to risk communication and damage assessments will increase the potential for UAS applications to improve the understanding of risks and to reduce those risks associated with abrupt cold region hazards. In the future, cold region applications can benefit from the advances made by these early adopters, who have identified exciting new avenues for advancing hazard research via innovative use of both emerging and existing sensors.