

Title: Wireless steerable vision for live insects and insect-scale robots

Vision serves as an essential sensory input for insects but consumes substantial energy resources. The cost of supporting sensitive photoreceptors has led many insects to develop high visual acuity in only small retinal regions and to evolve the ability to move their visual systems independently of their bodies through head motion. By understanding the trade-offs made by insect vision systems in nature, we can design better vision systems for insect-scale robotics in a way that balances energy, computation, and mass. Here, we report a fully wireless, power-autonomous, mechanically steerable vision system that imitates head motion in a form factor small enough to mount on the back of a live beetle or a similarly sized terrestrial robot. Our electronics and actuator weigh 248 milligrams and can steer the camera over 60° based on commands from a smartphone. The camera streams "first person" 160-by-120 pixel monochrome video at 1 to 5 frames per second (fps) to a Bluetooth radio from up to 120 meters away. We mounted this vision system on two species of freely walking live beetles, demonstrating that triggering image capture using an onboard accelerometer achieves operational times of up to 6 hours with a 10-milliamp-hour battery. We also built a small, terrestrial robot (1.6 centimeters by 2 centimeters) that can move at up to 3.5 centimeters per second, support vision, and operate for 63 to 260 minutes. Our results demonstrate that steerable vision can enable object tracking and wide-angle views at 26 to 84 times lower energy than moving the whole robot.
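
Triggering capture from the onboard accelerometer, as described above, amounts to a simple duty-cycling loop: keep the camera and radio quiet until the accelerometer reports motion, then stream frames only while the insect walks. Below is a minimal Python sketch of that control flow; the driver stubs (`read_accel_delta_g`, `capture_and_stream_frame`) and the threshold are hypothetical placeholders, not the authors' firmware API.

```python
import time

ACCEL_THRESHOLD_G = 0.05  # assumed motion threshold; not a reported value
FRAME_PERIOD_S = 0.2      # 5 fps, the upper end of the reported frame rate
IDLE_POLL_S = 1.0         # assumed low-power polling interval while still

def read_accel_delta_g() -> float:
    """Return deviation of acceleration magnitude from 1 g (hardware stub)."""
    raise NotImplementedError

def capture_and_stream_frame() -> None:
    """Capture one 160x120 monochrome frame and send it over Bluetooth (stub)."""
    raise NotImplementedError

def motion_triggered_capture() -> None:
    # Stream only while the beetle moves; idling the camera and radio the
    # rest of the time is what stretches a 10 mAh battery toward the
    # reported 6-hour operating time.
    while True:
        if read_accel_delta_g() > ACCEL_THRESHOLD_G:
            capture_and_stream_frame()
            time.sleep(FRAME_PERIOD_S)
        else:
            time.sleep(IDLE_POLL_S)
```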

NSF-PAR ID:
10172074
Author(s) / Creator(s):
Vikram Iyer; Ali Najafi; Johannes James; Sawyer Fuller; Shyamnath Gollakota
Publisher / Repository:
American Association for the Advancement of Science (AAAS)
Date Published:
July 2020
Journal Name:
Science Robotics
Volume:
5
Issue:
44
ISSN:
2470-9476
Page Range / eLocation ID:
Article No. eabb0839
Sponsoring Org:
National Science Foundation
More Like this
  1. Evolution has honed predatory skills in the natural world, where localizing and intercepting fast-moving prey is essential. The current generation of robotic systems mimics these biological systems using deep learning, but high-speed processing of camera frames with convolutional neural networks (CNNs) (the frame pipeline) is resource-limited on constrained aerial edge robots. Adding compute resources eventually caps throughput at the camera's frame rate, and frame-only systems fail to capture the detailed temporal dynamics of the environment. Bio-inspired event cameras paired with spiking neural networks (SNNs) provide an asynchronous sensor-processor pair (the event pipeline) that captures the continuous temporal detail of the scene at high speed but lags in accuracy. In this work, we propose a target localization system that combines event-camera/SNN-based high-speed target estimation with frame-camera/CNN-driven reliable object detection, fusing the complementary spatio-temporal strengths of the event and frame pipelines. One of our main contributions is an SNN filter that borrows from the neural mechanism for ego-motion cancellation in houseflies: it fuses vestibular sensing with vision to cancel the activity corresponding to the predator's self-motion. We also integrate this neuro-inspired multi-pipeline processing with the task-optimized multi-neuronal pathway structure found in primates and insects. The system is validated to outperform CNN-only processing in prey-predator drone simulations in realistic 3D virtual environments, and is then demonstrated in a real-world multi-drone setup with emulated event data. Subsequently, we use recorded sensory data from a multi-camera and inertial measurement unit (IMU) assembly to show correct operation while tolerating realistic noise in the vision and IMU sensors. We analyze the design space to identify optimal parameters for the spiking neurons and CNN models and to check their effect on the performance metrics of the fused system. Finally, we map the throughput-controlling SNN and fusion network onto an edge-compatible Zynq-7000 FPGA, showing a potential 264 outputs per second even under constrained resource availability. This work may open new research directions that couple multiple sensing and processing modalities inspired by discoveries in neuroscience to break fundamental trade-offs in frame-based computer vision. A toy sketch of the ego-motion cancellation step follows.
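
The housefly-inspired cancellation described above boils down to predicting, from the inertial sensors, which event activity the predator's own motion should produce and removing it before target estimation. Here is a toy Python sketch of that one step, under strong simplifying assumptions (events pre-binned into 2D count frames, pure yaw rotation, pinhole camera, edge wraparound ignored); it illustrates the idea only and is not the paper's SNN filter.

```python
import numpy as np

def ego_motion_residual(evt_prev: np.ndarray, evt_curr: np.ndarray,
                        yaw_rate: float, dt: float, focal_px: float) -> np.ndarray:
    """Vestibular-visual cancellation cartoon.

    evt_prev, evt_curr: 2D event-count frames from consecutive time windows
    yaw_rate:           IMU yaw rate (rad/s); assumes pure yaw self-motion
    dt:                 window spacing in seconds
    focal_px:           pinhole focal length in pixels (assumed model)
    """
    # Pixels the static background should have shifted due to self-rotation.
    shift_px = int(round(focal_px * yaw_rate * dt))
    # Undo the predicted shift so background events line up across windows.
    aligned_prev = np.roll(evt_prev, shift_px, axis=1)
    # Activity not explained by self-motion (e.g., an independently moving
    # target) survives the subtraction.
    return np.clip(evt_curr - aligned_prev, 0, None)
```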
  2. Aquatic habitats are closely linked to surrounding terrestrial systems via reciprocal subsidies. Much of the research on aquatic–terrestrial subsidies has focused on streams and lakes, while subsidies across the aquatic–terrestrial boundaries of other systems, like temporary ponds, have received less attention. To address the lack of information regarding cross-habitat subsidies of temporary ponds, we quantified leaf litter inputs, amphibian egg inputs, terrestrial insect inputs, and amphibian metamorph and aquatic insect emergence for eight temporary ponds. We compared the relative magnitude of cross-habitat biotic subsidies of temporary ponds to identify potentially important yet overlooked subsidies. Terrestrial insect inputs to ponds were the second-largest subsidy (mean 15.3 g m⁻² yr⁻¹), exceeding the combined emergence of amphibians and aquatic insects (mean 4.0 g m⁻² yr⁻¹), yet these high-quality subsidies are generally unaccounted for in similar studies. Across the wetland complex, total amphibian emergence (8929.3 g yr⁻¹) was nearly four times higher than total aquatic insect emergence (2491.9 g yr⁻¹). Aquatic insect emergence was similar to that of lakes and streams, while amphibian emergence was generally higher. Although larger ponds produced greater total fluxes to terrestrial habitats, smaller ponds were often more productive per unit area. Therefore, a mosaic of small ponds may provide subsidies to terrestrial food webs greater than or equivalent to those of a single large pond. Given continued threats to temporary ponds and their connections to surrounding forests, management and restoration of these systems, as well as future studies, should take holistic approaches that account for the many aquatic–terrestrial linkages and the factors that influence them.

  3. Behavioral measurements of fragile aquatic organisms require specialized in situ techniques. We developed an in situ brightfield camera setup for use during SCUBA diving in aquatic ecosystems. The system uses brightfield illumination with collimated light and an underwater camera to highlight morphological details, body motion, and interactions between organisms with high spatial (4K: 3840x2160 pixels) and temporal (up to 120 fps) resolution. This technique is particularly useful for gelatinous organisms because of their large (centimeters in length), transparent bodies. Further, the measurements are not subject to experimental artifacts produced in laboratory studies. This method is useful for anyone seeking detailed brightfield images of organisms or nonliving material (e.g., marine snow) in the natural environment.
  4. We present the first system that can airdrop wireless sensors from small drones and live insects. In addition to the challenges of achieving low power consumption and long-range communication, airdropping wireless sensors is difficult because the sensor must survive the impact when dropped in mid-air. Our design takes inspiration from nature: small insects like ants can fall from tall buildings and survive because of their tiny mass and size. Inspired by this, we design insect-scale wireless sensors that come fully integrated with an onboard power supply and a lightweight mechanical actuator to detach from the aerial platform. Our system introduces a first-of-its-kind 37 mg mechanical release mechanism that drops the sensor during flight using only 450 μJ of energy, as well as a wireless communication link that can transmit sensor data at 33 kbps up to 1 km. Once deployed, our 98 mg wireless sensor can run for 1.3-2.5 years on a 68 mg battery when transmitting 10-50 packets per hour; a rough energy-budget sketch follows below. We demonstrate attachment to a small 28 mm wide drone and a moth (Manduca sexta) and show that our insect-scale sensors flutter as they fall, suffering no damage on impact onto a tile floor from heights of 22 m.
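
Lifetimes measured in years at these packet rates imply an energy budget set by the trade-off between the deep-sleep current floor and per-packet transmission cost. A back-of-the-envelope Python check is below; every constant is an assumption chosen for illustration, since the item reports only the battery mass, packet rates, and resulting lifetimes.

```python
# Back-of-the-envelope lifetime model for a duty-cycled sensor node.
# All constants are assumptions for illustration; the abstract above reports
# only the outcomes (68 mg battery, 10-50 packets/hour, 1.3-2.5 years).
BATTERY_MAH = 1.0          # assumed capacity of a tiny 68 mg cell
SLEEP_UA = 0.035           # assumed deep-sleep current draw, microamps
PACKET_CHARGE_UAH = 0.001  # assumed charge per transmitted packet, uAh

def lifetime_years(packets_per_hour: float) -> float:
    # Average current = sleep floor + per-packet charge spread over an hour.
    avg_current_ua = SLEEP_UA + packets_per_hour * PACKET_CHARGE_UAH
    hours = (BATTERY_MAH * 1000.0) / avg_current_ua  # mAh -> uAh
    return hours / (24 * 365)

for rate in (10, 50):
    print(f"{rate:>2} packets/hour -> {lifetime_years(rate):.1f} years")
```

With these assumed values the model lands near the reported 1.3-2.5 year range; the takeaway is that at tens of packets per hour, the sleep floor and the per-packet charge are comparable, so both must be tiny for multi-year operation.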
  5. Insects are highly capable walkers, but many questions remain regarding how the insect nervous system controls locomotion. One particular question is how information is communicated between the 'lower level' ventral nerve cord (VNC) and the 'higher level' head ganglia to facilitate control. In this work, we seek to explore this question by investigating how systems traditionally described as 'positive feedback' may initiate and maintain stepping in the VNC with limited information exchanged between lower and higher level centers. We focus on the 'reflex reversal' of the stick insect femur-tibia joint between a resistance reflex (RR) and an active reaction in response to joint flexion, as well as the activation of populations of descending dorsal median unpaired (desDUM) neurons from limb strain, as our primary reflex loops. We present the development of a neuromechanical model of the stick insect (Carausius morosus) femur-tibia (FTi) and coxa-trochanter joint control networks 'in-the-loop' with a physical robotic limb. The control network generates motor commands for the robotic limb, whose motion and forces generate sensory feedback for the network. We based our network architecture on the anatomy of the non-spiking interneuron joint control network that controls the FTi joint, extrapolated network connectivity based on known muscle responses, and used previously developed mechanisms to produce 'sideways stepping'. Previous studies hypothesized that RR is enacted by selective inhibition of sensory afferents from the femoral chordotonal organ, but no study has tested this hypothesis with a model of an intact limb. We found that inhibiting the network's flexion position and velocity afferents generated a reflex reversal in the robot limb's FTi joint; a minimal cartoon of this sign change appears below. We also explored the intact network's ability to sustain steady locomotion on our test limb. Our results suggested that the reflex reversal and limb strain reinforcement mechanisms are both necessary but individually insufficient to produce and maintain rhythmic stepping in the limb, which can be initiated or halted by brief, transient descending signals. Removing portions of this feedback loop or creating a large enough disruption can halt stepping independent of the higher-level centers. We conclude by discussing why the nervous system might control motor output in this manner, as well as how to apply these findings to a generalized understanding of nervous systems and to improved robotic control.
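
As a caricature, the reflex reversal reported above can be read as a sign change in the joint feedback once the flexion-signaling afferents are suppressed. The Python toy below shows only that single idea; the paper's actual controller is a non-spiking interneuron network modeled in far more detail, and the gains here are arbitrary.

```python
def fti_feedback_torque(angle: float, velocity: float,
                        inhibit_flexion_afferents: bool,
                        k_p: float = 1.0, k_v: float = 0.5) -> float:
    """Cartoon of reflex reversal at the femur-tibia (FTi) joint.

    With afferents intact, position and velocity feedback oppose an
    imposed flexion (resistance reflex, RR). Suppressing the flexion
    position/velocity afferents removes the opposing drive, and the
    remaining excitation reinforces the motion (active reaction).
    """
    resisting = -(k_p * angle + k_v * velocity)  # resistance reflex
    if inhibit_flexion_afferents:
        return -resisting  # feedback now assists rather than resists
    return resisting
```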