
Title: Wireless steerable vision for live insects and insect-scale robots

Vision serves as an essential sensory input for insects but consumes substantial energy resources. The cost of supporting sensitive photoreceptors has led many insects to develop high visual acuity in only small retinal regions and to evolve the ability to move their visual systems independently of their bodies through head motion. By understanding the trade-offs made by insect vision systems in nature, we can design better vision systems for insect-scale robotics that balance energy, computation, and mass. Here, we report a fully wireless, power-autonomous, mechanically steerable vision system that imitates head motion in a form factor small enough to mount on the back of a live beetle or a similarly sized terrestrial robot. Our electronics and actuator weigh 248 milligrams and can steer the camera over 60° based on commands from a smartphone. The camera streams "first person" 160-by-120-pixel monochrome video at 1 to 5 frames per second (fps) to a Bluetooth radio from up to 120 meters away. We mounted this vision system on two species of freely walking live beetles, demonstrating that triggering image capture with an onboard accelerometer achieves operational times of up to 6 hours with a 10-milliamp-hour battery. We also built a small terrestrial robot (1.6 centimeters by 2 centimeters) that can move at up to 3.5 centimeters per second, support vision, and operate for 63 to 260 minutes. Our results demonstrate that steerable vision can enable object tracking and wide-angle views for 26 to 84 times lower energy than moving the whole robot.
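As a back-of-envelope sanity check (not from the paper, just arithmetic on the figures the abstract reports), the claimed 6-hour runtime on a 10-milliamp-hour battery implies an average current draw below 2 milliamps, which illustrates the scale of the power budget accelerometer-triggered capture must meet:

```python
# Rough average-current estimate implied by the reported battery
# capacity and runtime; these two inputs come from the abstract,
# everything else is simple arithmetic.
battery_capacity_mah = 10.0  # onboard battery capacity (mAh)
runtime_h = 6.0              # reported operational time (hours)

avg_current_ma = battery_capacity_mah / runtime_h
print(f"Implied average draw: {avg_current_ma:.2f} mA")  # ≈ 1.67 mA
```

This is only an average over the whole session; the actual instantaneous draw would be far higher while the radio streams frames and far lower while the accelerometer gates capture off.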

Publication Date:
Journal Name: Science Robotics
Page Range or eLocation-ID: Article No. eabb0839
Publisher: American Association for the Advancement of Science (AAAS)
Sponsoring Org: National Science Foundation
More Like this
  1. We present the first system that can airdrop wireless sensors from small drones and live insects. In addition to the challenges of achieving low-power consumption and long-range communication, airdropping wireless sensors is difficult because it requires the sensor to survive the impact when dropped in mid-air. Our design takes inspiration from nature: small insects like ants can fall from tall buildings and survive because of their tiny mass and size. Inspired by this, we design insect-scale wireless sensors that come fully integrated with an onboard power supply and a lightweight mechanical actuator to detach from the aerial platform. Our system introduces a first-of-its-kind 37 mg mechanical release mechanism to drop the sensor during flight, using only 450 μJ of energy, as well as a wireless communication link that can transmit sensor data at 33 kbps up to 1 km. Once deployed, our 98 mg wireless sensor can run for 1.3-2.5 years when transmitting 10-50 packets per hour on a 68 mg battery. We demonstrate attachment to a small 28 mm wide drone and a moth (Manduca sexta) and show that our insect-scale sensors flutter as they fall, suffering no damage on impact onto a tile floor from heights of 22 m.
  2. Behavioral measurements of fragile aquatic organisms require specialized in situ techniques. We developed an in situ brightfield camera set-up for use during SCUBA diving in aquatic ecosystems. The system uses brightfield illumination with collimated light and an underwater camera to highlight morphological details, body motion, and interactions between organisms with high spatial (4K: 3840x2160 pixels) and temporal resolution (up to 120 fps). This technique is particularly useful for gelatinous organisms because of their large (centimeters in length), transparent bodies. Further, the measurements are not subject to experimental artifacts produced in laboratory studies. This method is useful for anyone seeking detailed brightfield images of organisms or nonliving material (e.g. marine snow) in the natural environment.
  3. Abstract Insects are highly capable walkers, but many questions remain regarding how the insect nervous system controls locomotion. One particular question is how information is communicated between the ‘lower level’ ventral nerve cord (VNC) and the ‘higher level’ head ganglia to facilitate control. In this work, we seek to explore this question by investigating how systems traditionally described as ‘positive feedback’ may initiate and maintain stepping in the VNC with limited information exchanged between lower and higher level centers. We focus on the ‘reflex reversal’ of the stick insect femur-tibia joint between a resistance reflex (RR) and an active reaction in response to joint flexion, as well as the activation of populations of descending dorsal median unpaired (desDUM) neurons from limb strain, as our primary reflex loops. We present the development of a neuromechanical model of the stick insect (Carausius morosus) femur-tibia (FTi) and coxa-trochanter joint control networks ‘in-the-loop’ with a physical robotic limb. The control network generates motor commands for the robotic limb, whose motion and forces generate sensory feedback for the network. We based our network architecture on the anatomy of the non-spiking interneuron joint control network that controls the FTi joint, extrapolated network connectivity based on known muscle responses, and previously developed mechanisms to produce ‘sideways stepping’. Previous studies hypothesized that RR is enacted by selective inhibition of sensory afferents from the femoral chordotonal organ, but no study has tested this hypothesis with a model of an intact limb. We found that inhibiting the network’s flexion position and velocity afferents generated a reflex reversal in the robot limb’s FTi joint. We also explored the intact network’s ability to sustain steady locomotion on our test limb.
Our results suggested that the reflex reversal and limb strain reinforcement mechanisms are both necessary but individually insufficient to produce and maintain rhythmic stepping in the limb, which can be initiated or halted by brief, transient descending signals. Removing portions of this feedback loop or creating a large enough disruption can halt stepping independent of the higher-level centers. We conclude by discussing why the nervous system might control motor output in this manner, as well as how to apply these findings to generalized nervous system understanding and improved robotic control.
  4. Abstract Vision is underpinned by phototransduction, a signaling cascade that converts light energy into an electrical signal. Among insects, phototransduction is best understood in Drosophila melanogaster. Comparison of D. melanogaster against three insect species found several phototransduction gene gains and losses; however, lepidopterans were not examined. Diurnal butterflies and nocturnal moths occupy different light environments and have distinct eye morphologies, which might impact the expression of their phototransduction genes. Here we investigated: 1) how phototransduction genes vary in gene gain or loss between D. melanogaster and Lepidoptera, and 2) variations in phototransduction genes between moths and butterflies. To test our prediction of phototransduction differences due to distinct visual ecologies, we used insect reference genomes, phylogenetics, and moth and butterfly head RNA-Seq and transcriptome data. As expected, most phototransduction genes were conserved between D. melanogaster and Lepidoptera, with some exceptions. Notably, we found two lepidopteran opsins lacking a D. melanogaster ortholog. Using antibodies, we found that one of these opsins, a candidate retinochrome, which we refer to as unclassified opsin (UnRh), is expressed in the crystalline cone cells and the pigment cells of the butterfly, Heliconius melpomene. Our results also show that butterflies express similar amounts of trp and trpl channel mRNAs, whereas moths express ∼50× less trp, a potential adaptation to darkness. Our findings suggest that while many single-copy D. melanogaster phototransduction genes are conserved in lepidopterans, phototransduction gene expression differences exist between moths and butterflies that may be linked to their visual light environment.
  5. Compound eyes found in insects provide intriguing sources of biological inspiration for miniaturized imaging systems. Inspired by such insect eye structures, we demonstrate an ultrathin arrayed camera enabled by a flat multi-level diffractive microlens array for super-resolution visible imaging. We experimentally demonstrate that the microlens array can achieve a large fill factor (hexagonal close packing with pitch = 120 µm), thickness of 2.6 µm, and diffraction-limited (Strehl ratio = 0.88) achromatic performance in the visible band (450 to 650 nm). We also demonstrate super-resolution imaging with resolution improvement of ∼1.4× by computationally merging 1600 images in the array.