

Search for: All records

Award ID contains: 1844280


  1. Abstract

    Imaging underwater environments is of great importance to marine sciences, sustainability, climatology, defense, robotics, geology, space exploration, and food security. Despite advances in underwater imaging, most of the ocean and marine organisms remain unobserved and undiscovered. Existing methods for underwater imaging are unsuitable for scalable, long-term, in situ observations because they require tethering for power and communication. Here we describe underwater backscatter imaging, a method for scalable, real-time wireless imaging of underwater environments using fully-submerged battery-free cameras. The cameras power up from harvested acoustic energy, capture color images using ultra-low-power active illumination and a monochrome image sensor, and communicate wirelessly at net-zero-power via acoustic backscatter. We demonstrate wireless battery-free imaging of animals, plants, pollutants, and localization tags in enclosed and open-water environments. The method’s self-sustaining nature makes it desirable for massive, continuous, and long-term ocean deployments with many applications including marine life discovery, submarine surveillance, and underwater climate change monitoring.

     
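
The color-from-monochrome step described in the abstract above can be illustrated with a short sketch. Assuming the camera captures one monochrome frame per red, green, and blue illumination pulse (an assumption based on the abstract's mention of ultra-low-power active illumination, not the authors' exact pipeline), the three exposures can be stacked into an RGB image:

```python
# Minimal sketch: reconstructing a color image from three monochrome frames,
# each captured under a different LED color (red, green, blue illumination).
# The normalization and frame sizes are illustrative assumptions, not the paper's pipeline.
import numpy as np

def reconstruct_color(frame_r: np.ndarray, frame_g: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Stack three single-channel exposures into one RGB image scaled to [0, 1]."""
    rgb = np.stack([frame_r, frame_g, frame_b], axis=-1).astype(np.float32)
    rgb -= rgb.min()
    peak = rgb.max()
    return rgb / peak if peak > 0 else rgb

# Example with synthetic 8-bit frames standing in for the monochrome sensor output.
h, w = 120, 160
rng = np.random.default_rng(0)
frames = [rng.integers(0, 256, size=(h, w), dtype=np.uint8) for _ in range(3)]
image = reconstruct_color(*frames)
print(image.shape, image.dtype)  # (120, 160, 3) float32
```
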
  2. The past few years have witnessed a growing interest in wireless and batteryless implants, due to their potential in long-term biomedical monitoring of in-body conditions such as internal organ movements, bladder pressure, and gastrointestinal health. Early proposals for batteryless implants relied on inductive near-field coupling and ultrasound harvesting, which require direct contact between the external power source and the human body. To overcome this near-field challenge, recent research has investigated the use of RF backscatter in wireless micro-implants because of its ability to communicate with wireless receivers placed at a distance outside the body (∼0.5 m), allowing a more seamless user experience. Unfortunately, existing far-field backscatter designs remain limited in their functionality: they cannot perform biometric sensing or secure data transmission, and they suffer from degraded harvesting efficiency and backscatter range due to variations in the surrounding tissues. In this paper, we present the design of a batteryless, wireless, and secure system-on-chip (SoC) implant for in-body strain sensing. The SoC relies on four features: 1) employing a reconfigurable in-body rectenna which can operate across tissues by adapting its backscatter bandwidth and center frequency; 2) designing an energy-efficient 1.37 mmHg strain-sensing front-end with an efficiency of 5.9 mmHg·nJ/conversion; 3) incorporating an AES-GCM security engine to ensure the authenticity and confidentiality of sensed data while sharing the ADC with the sensor interface for area-efficient random number generation; 4) implementing an over-the-air closed-loop wireless programming scheme to reprogram the RF front-end to adapt to surrounding tissues and the sensor front-end to achieve faster settling times (below 2 s).
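
As a software illustration of the guarantee the AES-GCM engine provides (the SoC implements this in dedicated hardware; the packet layout, device-ID associated data, and 12-byte nonce below are hypothetical choices, not the chip's wire format), authenticated encryption of a single strain reading might look like this:

```python
# Illustrative sketch (software stand-in for the on-chip AES-GCM engine): encrypting a
# strain reading so an external reader can verify both confidentiality and authenticity.
# Requires the `cryptography` package.
import os
import struct
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)   # shared between implant and reader
aesgcm = AESGCM(key)

device_id = b"implant-01"                   # authenticated but not encrypted (associated data)
strain_mmhg = 42.5
nonce = os.urandom(12)                      # must never repeat for the same key
ciphertext = aesgcm.encrypt(nonce, struct.pack("<f", strain_mmhg), device_id)

# Reader side: decryption raises InvalidTag if either the ciphertext or the
# associated device ID was tampered with in transit.
recovered = struct.unpack("<f", aesgcm.decrypt(nonce, ciphertext, device_id))[0]
print(recovered)  # 42.5
```
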
  3. Mechanical search is a robotic problem where a robot needs to retrieve a target item that is partially or fully occluded from its camera. State-of-the-art approaches for mechanical search either require an expensive search process to find the target item, or they require the item to be tagged with a radio frequency identification (RFID) tag, making their approach beneficial only to tagged items in the environment. We present FuseBot, the first robotic system for RF-Visual mechanical search that enables efficient retrieval of both RF-tagged and untagged items in a pile. Rather than requiring all target items in a pile to be RF-tagged, FuseBot leverages the mere existence of an RF-tagged item in the pile to benefit both tagged and untagged items. Our design introduces two key innovations. The first is RF-Visual Mapping, a technique that identifies and locates RF-tagged items in a pile and uses this information to construct an RF-Visual occupancy distribution map. The second is RF-Visual Extraction, a policy formulated as an optimization problem that minimizes the number of actions required to extract the target object by accounting for the probabilistic occupancy distribution, the expected grasp quality, and the expected information gain from future actions. We built a real-time end-to-end prototype of our system on a UR5e robotic arm with in-hand vision and RF perception modules. We conducted over 180 real-world experimental trials to evaluate FuseBot and compare its performance to a state-of-the-art vision-based system named X-Ray. Our experimental results demonstrate that FuseBot outperforms X-Ray in efficiency by more than 40% in terms of the number of actions required for successful mechanical search. Furthermore, in comparison to X-Ray's success rate of 84%, FuseBot achieves a success rate of 95% in retrieving untagged items, demonstrating for the first time that the benefits of RF perception extend beyond tagged objects in the mechanical search problem.
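
The RF-Visual Extraction policy above is described as an optimization over probabilistic occupancy, expected grasp quality, and expected information gain. A minimal sketch of that kind of utility-based action selection, with an assumed linear weighting and grid representation rather than FuseBot's actual objective, is:

```python
# Minimal sketch of utility-based extraction: each candidate grasp cell is scored by the
# probability that the target is there, the expected grasp quality, and the expected
# information gain of clearing that cell. Weights and grid are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
H, W = 8, 8

p_target = rng.random((H, W)); p_target /= p_target.sum()  # RF-visual occupancy distribution
grasp_quality = rng.random((H, W))                          # expected grasp success per cell
# Information gain proxy: entropy contribution of each cell of the occupancy distribution.
info_gain = -p_target * np.log(p_target + 1e-12)

lam = 0.5                                                   # assumed trade-off weight
utility = p_target * grasp_quality + lam * info_gain
best = np.unravel_index(np.argmax(utility), utility.shape)
print("next extraction cell:", best)
```
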
  4. There is a growing interest in wireless and batteryless implants for long-term sensing of organ movements, core pressure, glucose levels, or other biometrics [1]. Most research on such implants has focused on ultrasonic [2] and near-field inductive [3-4] methods for power and communication, which require direct contact or close proximity (<1-5 cm) to the human body. Recently, RF backscatter has emerged as a promising alternative due to its ability to communicate with far-field (>10 cm) wireless devices at ultra-low power [5]. While multiple proposals have demonstrated far-field RF backscatter in deep tissues, these proposals have been limited to tag identification and could neither perform biometric sensing nor secure the wireless communication links, which is critical for ensuring the confidentiality of the sensed biometrics and for responding to commands only from authorized users [6]. Moreover, such far-field RF implants are susceptible to tissue variations, which impact their resonance and hence their efficiency in RF backscatter and energy harvesting.
  5. We present the design, implementation, and evaluation of RFusion, a robotic system that can search for and retrieve RFID-tagged items in line-of-sight, non-line-of-sight, and fully-occluded settings. RFusion consists of a robotic arm that has a camera and antenna strapped around its gripper. Our design introduces two key innovations: the first is a method that geometrically fuses RF and visual information to reduce uncertainty about the target object's location, even when the item is fully occluded. The second is a novel reinforcement-learning network that uses the fused RF-visual information to efficiently localize, maneuver toward, and grasp target items. We built an end-to-end prototype of RFusion and tested it in challenging real-world environments. Our evaluation demonstrates that RFusion localizes target items with centimeter-scale accuracy and achieves a 96% success rate in retrieving fully occluded objects, even if they are under a pile. The system paves the way for novel robotic retrieval tasks in complex environments such as warehouses, manufacturing plants, and smart homes.
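
A minimal sketch of the geometric RF-visual fusion idea described above, assuming a Gaussian ring likelihood for the RF range measurement and a binary visual occupancy mask (both simplifications, not RFusion's actual models), is:

```python
# Minimal sketch of RF-visual fusion over a 2-D grid: an RF range measurement constrains
# the target to lie near a ring around the antenna, while vision rules out free space;
# multiplying the two and normalizing yields a tighter posterior over target location.
import numpy as np

H, W = 100, 100
ys, xs = np.mgrid[0:H, 0:W]

antenna = np.array([10.0, 10.0])
measured_range, sigma = 60.0, 3.0                    # RF range estimate (grid units)
dist = np.hypot(ys - antenna[0], xs - antenna[1])
rf_likelihood = np.exp(-0.5 * ((dist - measured_range) / sigma) ** 2)

visual_prior = np.zeros((H, W))
visual_prior[40:80, 30:90] = 1.0                      # region the camera sees as occupied (pile)

posterior = rf_likelihood * visual_prior
posterior /= posterior.sum()
print("most likely cell:", np.unravel_index(np.argmax(posterior), posterior.shape))
```
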
  6. Stress plays a critical role in our lives, impacting our productivity and our long-term physiological and psychological well-being. This has motivated the development of stress monitoring solutions to better understand stress, its impact on productivity and teamwork, and help users adapt their habits toward more sustainable stress levels. However, today's stress monitoring solutions remain obtrusive, requiring active user participation (e.g., self-reporting), interfering with people's daily activities, and often adding more burden to users looking to reduce their stress. In this paper, we introduce WiStress, the first system that can passively monitor a user's stress levels by relying on wireless signals. WiStress does not require users to actively provide input or to wear any devices on their bodies. It operates by transmitting ultra-low-power wireless signals and measuring their reflections off the user's body. WiStress introduces two key innovations. First, it presents the first machine learning network that can accurately and robustly extract heartbeat intervals (IBIs) from wireless reflections without constraints on a user's daily activities. Second, it introduces a stress classification framework that combines the extracted heartbeats with other wirelessly captured stress-related features in order to infer a subject's stress level. We built a prototype of WiStress and tested it on 22 different subjects across different environments in both stress-induced and free-living conditions. Our results demonstrate that WiStress has high accuracy (84%-95%) in inferring a person's stress level in a fully-automated way, paving the way for ubiquitous sensing systems that can monitor stress and provide feedback to improve productivity, health, and well-being.
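
To make the heartbeat-based half of the pipeline concrete, the sketch below turns a sequence of inter-beat intervals into standard heart-rate-variability features that a downstream classifier could use; the feature set and the toy decision rule are illustrative assumptions, not WiStress's actual model:

```python
# Minimal sketch: converting inter-beat intervals (IBIs, in seconds) into common
# heart-rate-variability features for a downstream stress classifier.
import numpy as np

def hrv_features(ibis_s: np.ndarray) -> dict:
    diffs = np.diff(ibis_s)
    return {
        "mean_hr_bpm": 60.0 / ibis_s.mean(),
        "sdnn_ms": 1000.0 * ibis_s.std(ddof=1),            # overall variability
        "rmssd_ms": 1000.0 * np.sqrt(np.mean(diffs ** 2)),  # short-term variability
    }

ibis = np.array([0.82, 0.85, 0.80, 0.78, 0.83, 0.81, 0.79, 0.84])  # synthetic IBIs
feats = hrv_features(ibis)
# Stand-in decision rule: low short-term variability is often associated with higher stress.
print(feats, "stressed" if feats["rmssd_ms"] < 30.0 else "relaxed")
```
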
  7. We present the design, implementation, and evaluation of RF-Grasp, a robotic system that can grasp fully-occluded objects in unknown and unstructured environments. Unlike prior systems that are constrained by the line-of-sight perception of vision and infrared sensors, RF-Grasp employs RF (Radio Frequency) perception to identify and locate target objects through occlusions, and perform efficient exploration and complex manipulation tasks in non-line-of-sight settings. RF-Grasp relies on an eye-in-hand camera and batteryless RFID tags attached to objects of interest. It introduces two main innovations: (1) an RF-visual servoing controller that uses the RFID's location to selectively explore the environment and plan an efficient trajectory toward an occluded target, and (2) an RF-visual deep reinforcement learning network that can learn and execute efficient, complex policies for decluttering and grasping. We implemented and evaluated an end-to-end physical prototype of RF-Grasp. We demonstrate that it improves success rate and efficiency by up to 40-50% over a state-of-the-art baseline. We also demonstrate RF-Grasp in novel tasks such as mechanical search of fully-occluded objects behind obstacles, opening up new possibilities for robotic manipulation. Qualitative results (videos) are available at rfgrasp.media.mit.edu.
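
A minimal sketch of a servoing loop driven by an RF location estimate, with an assumed proportional gain and step cap, and without the occlusion reasoning or visual feedback of the real RF-visual controller:

```python
# Minimal sketch of a proportional servoing loop: the end-effector is nudged toward the
# RFID's reported 3-D position until it is within a tolerance. Gains, step cap, and the
# noise-free tag estimate are simplifying assumptions.
import numpy as np

def servo_to_tag(ee_pos, tag_pos, gain=0.5, max_step=0.05, tol=0.01, max_iters=200):
    ee_pos = np.asarray(ee_pos, dtype=float).copy()
    tag_pos = np.asarray(tag_pos, dtype=float)
    for _ in range(max_iters):
        error = tag_pos - ee_pos
        if np.linalg.norm(error) < tol:      # close enough to attempt a grasp
            break
        step = gain * error
        norm = np.linalg.norm(step)
        if norm > max_step:                  # cap per-iteration motion (meters)
            step *= max_step / norm
        ee_pos += step
    return ee_pos

print(servo_to_tag(ee_pos=[0.0, 0.0, 0.3], tag_pos=[0.4, -0.2, 0.1]))
```
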