

Title: Soil Moisture Sensing with UAV-Mounted IR-UWB Radar and Deep Learning

Wide-area soil moisture sensing is a key element of smart irrigation systems. However, existing soil moisture sensing methods usually fail to achieve both satisfactory mobility and high moisture estimation accuracy. In this paper, we present the design and implementation of a novel soil moisture sensing system, named SoilId, that combines a UAV and a COTS IR-UWB radar for wide-area soil moisture sensing without the need to bury any battery-powered in-ground devices. Specifically, we design a series of novel methods to help SoilId extract soil-moisture-related features from the received radar signals and automatically detect and discard data contaminated by the UAV's uncontrollable motion and multipath interference. Furthermore, we leverage the powerful representation ability of deep neural networks and carefully design a neural network model to accurately map the extracted radar signal features to soil moisture estimates. We have extensively evaluated SoilId against a variety of real-world factors, including the UAV's uncontrollable motion, multipath interference, soil surface coverage, and many others. The experimental results from our UAV-based system validate that SoilId can push the accuracy limits of RF-based soil moisture sensing techniques to a 50% quantile MAE of 0.23%.
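The abstract does not detail the network architecture, so the following sketch only illustrates the final regression step: mapping a vector of radar-derived features to a single soil moisture estimate. The layer sizes, feature dimension, and training setup are assumptions for illustration, not the SoilId design.

```python
# Minimal sketch (PyTorch): a small regressor that maps radar-derived features
# to a volumetric soil moisture estimate. All dimensions are illustrative
# assumptions, not the architecture described in the paper.
import torch
import torch.nn as nn

class MoistureRegressor(nn.Module):
    def __init__(self, n_features: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, 1),              # volumetric soil moisture (%)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = MoistureRegressor()
features = torch.randn(8, 64)              # 8 radar echoes, 64 features each
moisture = model(features)                 # shape: (8, 1)
```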

Award ID(s): 2154059
NSF-PAR ID: 10483867
Author(s) / Creator(s):
Publisher / Repository: ACM
Date Published:
Journal Name: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
Volume: 7
Issue: 1
ISSN: 2474-9567
Page Range / eLocation ID: 1 to 25
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Passive remote sensing services are indispensable in modern society because of their applications in climate studies and Earth science. Among those, NASA's Soil Moisture Active Passive (SMAP) mission provides an essential climate variable, the moisture content of the soil, by measuring microwave radiation within the protected band of 1400-1427 MHz. However, because of increasing active wireless technologies such as the Internet of Things (IoT), unmanned aerial vehicles (UAV), and 5G wireless communication, SMAP's passive observations are expected to experience an increasing amount of Radio Frequency Interference (RFI). RFI is a well-documented issue, and SMAP has a ground processing unit dedicated to tackling it. However, advanced techniques are needed to address the growing RFI problem for passive sensing systems and to allow communication and sensing systems to coexist. In this paper, we apply a deep learning approach that employs a novel Convolutional Neural Network (CNN) architecture for both RFI detection and mitigation. SMAP Level 1A spectrograms of antenna counts and various moments data are used as inputs to the deep learning architecture. We simulate different types of RFI sources such as pulsed, CW, or wideband anthropogenic signals. We then use artificially corrupted SMAP Level 1B antenna measurements in conjunction with RFI labels to train the learning architecture. While the learned detection network classifies input spectrograms as RFI or no-RFI cases, the mitigation network reconstructs the RFI-mitigated antenna temperature images. The proposed learning framework takes advantage of both the existing SMAP data and the simulated RFI scenarios. Future remote sensing systems such as radiometers will suffer an increasing RFI problem, and spectrum sharing techniques that allow the coexistence of sensing and communication systems will be of utmost importance for both parties. RFI detection and mitigation will remain a prerequisite for these radiometers, and the proposed deep learning approach has the potential to provide an additional perspective to existing solutions. We present a detailed analysis of the selected deep learning architecture, the RFI detection accuracy levels obtained, and the RFI mitigation performance.
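As a rough illustration of the detection stage, the sketch below shows a small CNN that classifies a spectrogram patch as RFI or no-RFI. The patch size, channel counts, and layers are assumptions for illustration, not the architecture trained on SMAP Level 1A data.

```python
# Minimal sketch (PyTorch): a small CNN for binary RFI / no-RFI classification
# of spectrogram patches. Dimensions are illustrative assumptions only.
import torch
import torch.nn as nn

class RFIDetector(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 2)
        )

    def forward(self, x):                  # x: (batch, 1, freq_bins, time_bins)
        return self.classifier(self.features(x))

logits = RFIDetector()(torch.randn(4, 1, 128, 128))   # 4 spectrogram patches
```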
  2.
    Human action recognition is an important topic in artificial intelligence with a wide range of applications, including surveillance systems, search-and-rescue operations, and human-computer interaction. However, most current action recognition systems utilize videos captured by stationary cameras. Another emerging technology is the use of unmanned ground and aerial vehicles (UAV/UGV) for tasks such as transportation, traffic control, border patrolling, and wildlife monitoring. This technology has become more popular in recent years due to its affordability, high maneuverability, and limited need for human intervention. However, there is no efficient action recognition algorithm for UAV-based monitoring platforms. This paper considers UAV-based video action recognition by addressing the key issues of aerial imaging systems, such as camera motion and vibration, low resolution, and tiny human size. In particular, we propose an automated deep learning-based action recognition system comprising three stages: video stabilization using SURF feature selection and the Lucas-Kanade method, human action area detection using faster region-based convolutional neural networks (R-CNN), and action recognition. We propose a novel structure that extends and modifies the InceptionResNet-v2 architecture by combining a 3D CNN architecture and a residual network for action recognition. We achieve an average accuracy of 85.83% for entire-video-level recognition when applying our algorithm to the popular UCF-ARG aerial imaging dataset. This accuracy significantly improves upon the state-of-the-art accuracy by a margin of 17%.
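As an illustration of the stabilization stage, the sketch below aligns a pair of consecutive grayscale frames with Lucas-Kanade optical flow. To stay self-contained it substitutes Shi-Tomasi corners for the SURF features used in the paper, and the warp model and parameters are assumptions.

```python
# Minimal sketch (OpenCV): frame-to-frame stabilization via sparse optical flow.
# Shi-Tomasi corners replace SURF here; parameters are illustrative only.
import cv2
import numpy as np

def stabilize_pair(prev_gray: np.ndarray, curr_gray: np.ndarray) -> np.ndarray:
    """Warp curr_gray so that it aligns with prev_gray."""
    pts_prev = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                       qualityLevel=0.01, minDistance=10)
    pts_curr, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                   pts_prev, None)
    good_prev = pts_prev[status.flatten() == 1]
    good_curr = pts_curr[status.flatten() == 1]
    # Estimate a rigid (rotation + translation + scale) transform and apply its inverse.
    m, _ = cv2.estimateAffinePartial2D(good_curr, good_prev)
    h, w = curr_gray.shape
    return cv2.warpAffine(curr_gray, m, (w, h))
```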
  3.

    Optical frequency combs, featuring evenly spaced spectral lines, have been extensively studied and applied to metrology, signal processing, and sensing. Recently, frequency comb generation has also been extended to MHz frequencies by harnessing nonlinearities in microelectromechanical membranes. However, the generation of frequency combs at radio frequencies (RF) has been less explored, together with their potential application in wireless technologies. In this work, we demonstrate an RF system able to wirelessly and passively generate frequency combs. This circuit, which we name the quasi-harmonic tag (qHT), offers a battery-free solution for far-field ranging of unmanned vehicles (UVs) in GPS-denied settings, and it provides strong immunity to multipath interference, yielding better accuracy than other RF approaches to far-field ranging. Here, we discuss the principle of operation, design, implementation, and performance of qHTs used to remotely measure the azimuthal distance of a UV flying in an uncontrolled electromagnetic environment. We show that qHTs can wirelessly generate frequency combs with microwatt levels of incident power by leveraging the nonlinear interaction between an RF parametric oscillator and a high-quality-factor piezoelectric microacoustic resonator. Our technique for frequency comb generation opens new avenues for a wide range of RF applications beyond ranging, including timing, computing, and sensing.
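As an illustration of the comb structure itself (not of the qHT circuit), the following sketch synthesizes an idealized comb of equally spaced tones and recovers the line positions from its spectrum; all frequencies and line counts are arbitrary illustrative values.

```python
# Minimal sketch (NumPy): an idealized frequency comb as a sum of equally spaced
# tones. Center frequency, spacing, and line count are arbitrary; this does not
# model the qHT hardware.
import numpy as np

fs = 1e6                                  # sample rate (Hz)
t = np.arange(0, 1e-2, 1 / fs)            # 10 ms of signal
f0, df, n_lines = 100e3, 5e3, 7           # center, spacing, number of comb lines
comb = sum(np.cos(2 * np.pi * (f0 + k * df) * t)
           for k in range(-(n_lines // 2), n_lines // 2 + 1))

spectrum = np.abs(np.fft.rfft(comb))
freqs = np.fft.rfftfreq(len(comb), 1 / fs)
peaks = np.sort(freqs[np.argsort(spectrum)[-n_lines:]])
print(peaks)                              # ~85, 90, ..., 115 kHz: evenly spaced lines
```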

  4.
    For the controller of wearable lower-limb assistive devices, a quantitative understanding of human locomotion serves as the basis for human motion intent recognition and joint-level motion control. Traditionally, the required gait data are obtained in gait research laboratories using marker-based optical motion capture systems. Despite their high measurement accuracy, marker-based systems are largely limited to laboratory environments, making it nearly impossible to collect the desired gait data in real-world daily-living scenarios. To address this problem, the authors propose a novel exoskeleton-based gait data collection system, which provides the capability of independently measuring lower-limb movement without the need for stationary instrumentation. The basis of the system is a lightweight exoskeleton with articulated knee and ankle joints. To minimize interference with a wearer's natural lower-limb movement, a unique two-degrees-of-freedom joint design is incorporated, integrating a primary degree of freedom for joint motion measurement with a passive degree of freedom to allow natural joint movement and improve comfort of use. In addition to the joint-embedded goniometers, the exoskeleton also features multiple positions for mounting inertial measurement units (IMUs) as well as foot-plate-embedded force sensing resistors to measure the foot plantar pressure. All sensor signals are routed to a microcontroller for data logging and storage. To validate the exoskeleton-provided joint angle measurement, a comparison study with three healthy participants was conducted, involving locomotion experiments in various modes, including overground walking, treadmill walking, and sit-to-stand and stand-to-sit transitions. Joint angle trajectories measured with an eight-camera motion capture system served as the benchmark for comparison. Experimental results indicate that the exoskeleton-measured joint angle trajectories closely match those obtained through the optical motion capture system in all modes of locomotion (correlation coefficients of 0.97 and 0.96 for knee and ankle measurements, respectively), clearly demonstrating the accuracy and reliability of the proposed gait measurement system.
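As an illustration of the validation metric, the sketch below computes the Pearson correlation coefficient between a synthetic motion-capture knee trajectory and a noisy "exoskeleton" measurement; the trajectories are placeholders and only the metric mirrors the one reported in the study.

```python
# Minimal sketch (NumPy): Pearson correlation between a benchmark joint angle
# trajectory and a sensor measurement. The trajectories below are synthetic.
import numpy as np

t = np.linspace(0, 2, 200)                                  # 2 s of gait at 100 Hz
mocap_knee = 30 + 25 * np.sin(2 * np.pi * 1.0 * t)          # benchmark angle (deg)
exo_knee = mocap_knee + np.random.normal(0, 1.5, t.shape)   # goniometer + noise

r = np.corrcoef(mocap_knee, exo_knee)[0, 1]
print(f"knee correlation: {r:.2f}")        # values near 1 indicate close agreement
```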
    The food production system is more vulnerable to diseases than ever, and the threat is increasing in an era of climate change that creates more favorable conditions for emerging diseases. Fortunately, scientists and engineers are making great strides in introducing farming innovations to tackle the challenge. Unmanned aerial vehicle (UAV) remote sensing is among these innovations and is widely applied for crop health monitoring and phenotyping. This study demonstrated the versatility of aerial remote sensing in diagnosing yellow rust infection in spring wheat in a timely manner and in determining an intervenable period to prevent yield loss. A small UAV equipped with an aerial multispectral sensor periodically flew over, and collected remotely sensed images of, an experimental field in Chacabuco (−34.64; −60.46), Argentina during the 2021 growing season. Post-collection, plot-level images went through a thorough feature-engineering process: handcrafting disease-centric vegetation indices (VIs) from the spectral dimension and grey-level co-occurrence matrix (GLCM) texture features from the spatial dimension. A machine learning pipeline comprising a support vector machine (SVM), random forest (RF), and multilayer perceptron (MLP) was constructed to identify healthy, mildly infected, and severely infected plots in the field. A custom 3-dimensional convolutional neural network (3D-CNN) relying on the feature learning mechanism was an alternative prediction method. The study found the red-edge (690–740 nm) and near-infrared (NIR) (740–1000 nm) bands to be vital for distinguishing healthy and severely infected wheat. The carotenoid reflectance index 2 (CRI2), soil-adjusted vegetation index 2 (SAVI2), and GLCM contrast texture at an optimal distance d = 5 and angular direction θ = 135° were the most correlated features. The 3D-CNN-based wheat disease monitoring performed at 60% detection accuracy as early as 40 days after sowing (DAS), when crops were tillering, increasing to 71% and 77% at the later booting and flowering stages (100–120 DAS), and reaching a peak accuracy of 79% for the spectral-spatio-temporal fused data model. The success of early disease diagnosis from low-cost multispectral UAVs not only sheds new light on crop breeding and pathology but also aids crop growers by informing them of a prevention period that could potentially preserve 3–7% of the yield at the 95% confidence level.
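As an illustration of one of the handcrafted texture features, the sketch below computes GLCM contrast at the distance d = 5 and direction θ = 135° reported as most correlated in the study. The input band is a random placeholder standing in for a plot-level red-edge or NIR image, and the function names assume scikit-image 0.19 or newer.

```python
# Minimal sketch (scikit-image): GLCM contrast at d = 5, angle = 135 degrees.
# The band image is a random placeholder, not real field data.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

band = np.random.randint(0, 256, (64, 64), dtype=np.uint8)   # placeholder plot image
glcm = graycomatrix(band, distances=[5], angles=[3 * np.pi / 4],
                    levels=256, symmetric=True, normed=True)
contrast = graycoprops(glcm, "contrast")[0, 0]
print(f"GLCM contrast (d=5, 135 deg): {contrast:.1f}")
```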