Search for: All records

Creators/Authors contains: "Kochersberger, Kevin"


  1. With the rapid proliferation of small unmanned aircraft systems (UAS), the risk of mid-air collisions is growing, as is the risk associated with the malicious use of these systems. Airborne Detect-and-Avoid (ABDAA) and counter-UAS technologies have similar sensing requirements to detect and track airborne threats, albeit for different purposes: to avoid a collision or to neutralize a threat, respectively. These systems typically include a variety of sensors, such as electro-optical or infrared (EO/IR) cameras, RADAR, or LiDAR, and they fuse the data from these sensors to detect and track a given threat and to predict its trajectory. Camera imagery can be an effective method for detection as well as for pose estimation and threat classification, though a single camera cannot resolve range to a threat without additional information, such as knowledge of the threat geometry. To support ABDAA and counter-UAS applications, we consider a merger of two image-based sensing methods that mimic human vision: (1) a "peripheral vision" camera (i.e., with a fisheye lens) to provide a large field-of-view and (2) a "central vision" camera (i.e., with a perspective lens) to provide high resolution imagery of a specific target. Beyond the complementary ability of the two cameras to support detection and classification, the two cameras form a heterogeneous stereo vision system that can support range resolution. This paper describes the initial development and testing of a peripheral-central vision system to detect, localize, and classify an airborne threat and, finally, to predict its path using knowledge of the threat class.
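The range-resolution idea in the abstract above can be illustrated with a small sketch: back-project a pixel from each camera to a viewing ray (a pinhole model for the "central vision" camera, an equidistant model for the "peripheral vision" fisheye), then triangulate the two rays with the midpoint method. The function names, the equidistant fisheye model, and the calibration parameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def pixel_to_ray_pinhole(u, v, fx, fy, cx, cy):
    """Back-project a perspective ("central vision") pixel to a unit ray
    in the camera frame (pinhole model, no distortion)."""
    r = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    return r / np.linalg.norm(r)

def pixel_to_ray_fisheye(u, v, f, cx, cy):
    """Back-project a fisheye ("peripheral vision") pixel assuming the
    equidistant projection model: radial pixel distance = f * theta."""
    du, dv = u - cx, v - cy
    theta = np.hypot(du, dv) / f        # angle from the optical axis
    phi = np.arctan2(dv, du)            # azimuth in the image plane
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

def triangulate_midpoint(o1, d1, o2, d2):
    """Midpoint triangulation of two rays (origin o, unit direction d),
    assuming the rays are not parallel and both cameras' poses are known
    in a common frame."""
    # Solve the normal equations for the scalars t1, t2 that minimize
    # ||(o1 + t1*d1) - (o2 + t2*d2)||.
    b = o2 - o1
    c = d1 @ d2
    t1 = (b @ d1 - (b @ d2) * c) / (1.0 - c * c)
    t2 = t1 * c - b @ d2
    return 0.5 * ((o1 + t1 * d1) + (o2 + t2 * d2))
```

With perfect rays the two closest points coincide at the target; with noisy detections the midpoint is a reasonable range estimate, and the baseline between the two cameras sets the achievable range resolution.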
  2. Unmanned vehicles equipped with radiation detection sensors can serve as a valuable aid to personnel responding to radiological incidents. The use of tele-operated ground vehicles avoids human exposure to hazardous environments, which, in addition to radioactive contamination, might present other risks to personnel. Algorithms for radioisotope classification, source localization, and efficient exploration allow autonomous unmanned vehicles to conduct surveys with reduced human supervision, letting teams address larger areas in less time. This work presents systems for autonomous radiation search, with results presented in several proof-of-concept demonstrations.
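As a toy illustration of the source-localization task mentioned above (not the algorithms evaluated in this work), a single point source can be located by grid search under an idealized inverse-square count model, fitting the source strength in closed form at each candidate cell:

```python
import numpy as np

def localize_source(positions, counts, grid):
    """Grid search for a single point source under the idealized model
    counts_i ~ S / ||x_i - s||^2 (background and Poisson noise ignored).
    Illustrative sketch; the model and names are assumptions, not the
    paper's algorithms. Returns the best cell and fitted strength."""
    best = None
    for s in grid:
        g = 1.0 / np.maximum(np.sum((positions - s) ** 2, axis=1), 1e-9)
        S = (counts @ g) / (g @ g)          # least-squares strength here
        resid = np.sum((counts - S * g) ** 2)
        if best is None or resid < best[0]:
            best = (resid, s, S)
    return best[1], best[2]
```

A real system would replace the squared-error fit with a Poisson likelihood, model background, and refine the grid, but the sketch shows why a handful of spatially separated measurements suffices to pin down one source.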
  3. Abstract

    This paper discusses the results of a field experiment conducted at Savannah River National Laboratory to test the performance of several algorithms for the localization of radioactive materials. In this multirobot system, both an unmanned aerial vehicle, a custom hexacopter, and an unmanned ground vehicle (UGV), the ClearPath Jackal, equipped with γ-ray spectrometers, were used to collect data from two radioactive source configurations. Both the Fourier scattering transform and the Laplacian eigenmap algorithms for source detection were tested on the collected data sets. These algorithms transform raw spectral measurements into alternate spaces to allow clustering to detect trends within the data which indicate the presence of radioactive sources. This study also presents a point source model and accompanying information-theoretic active exploration algorithm. Field testing validated the ability of this model to fuse aerial and ground collected radiation measurements, and the exploration algorithm's ability to select informative actions to reduce model uncertainty, allowing the UGV to locate radioactive material online.

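A minimal sketch of the Laplacian eigenmap step described above, assuming a Gaussian affinity on raw spectra; this illustrates the general technique (embed spectra in a low-dimensional space where clustering separates background from source-influenced measurements), not the authors' implementation:

```python
import numpy as np

def laplacian_eigenmap(spectra, sigma=1.0, dim=2):
    """Embed spectra (one row per measurement) via a Laplacian eigenmap.
    Illustrative sketch: Gaussian kernel affinity, unnormalized graph
    Laplacian, and the eigenvectors of its smallest nonzero eigenvalues."""
    # Pairwise squared distances between spectra, then Gaussian affinities.
    d2 = np.sum((spectra[:, None, :] - spectra[None, :, :]) ** 2, axis=-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    D = np.diag(W.sum(axis=1))
    L = D - W                              # unnormalized graph Laplacian
    vals, vecs = np.linalg.eigh(L)         # eigenvalues in ascending order
    # Skip the constant eigenvector (eigenvalue ~0); keep the next `dim`.
    return vecs[:, 1:1 + dim]
```

In the embedded coordinates, measurements taken near a source (whose spectra carry extra photopeaks) separate from background measurements, so a simple clustering step can flag the presence of radioactive material.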