

Title: REAM: Resource Efficient Adaptive Monitoring of Community Spaces at the Edge Using Reinforcement Learning
An increasing number of community spaces are being instrumented with heterogeneous IoT sensors and actuators that enable continuous monitoring of the surrounding environments. Data streams generated from the devices are analyzed using a range of analytics operators and transformed into meaningful information for community monitoring applications. To ensure high-quality results, timely monitoring, and application reliability, we argue that these operators must be hosted at edge servers located in close proximity to the community space. In this paper, we present a Resource Efficient Adaptive Monitoring (REAM) framework at the edge that adaptively selects workflows of devices and operators to maintain adequate quality of information for the application at hand while judiciously consuming the limited resources available on edge servers. IoT deployments in community spaces are in a state of continuous flux dictated by the nature of activities and events within the space. Since these spaces are complex and change dynamically, and events can take place under different environmental contexts, developing a one-size-fits-all model that works for all types of spaces is infeasible. The REAM framework utilizes deep reinforcement learning agents that learn by interacting with each individual community space and make decisions based on the state of the environment in each space and other contextual information. We evaluate our framework on two real-world testbeds in Orange County, USA and NTHU, Taiwan. The evaluation results show that community spaces using REAM can achieve > 90% monitoring accuracy while incurring ~50% lower resource consumption costs compared to existing static monitoring and machine-learning-driven approaches.
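The abstract does not include code, but the core loop of a deep reinforcement learning agent that picks a monitoring workflow per control interval can be sketched as follows. This is a minimal illustration assuming a simple tabular Q-learning agent, a discretized state (e.g., activity level and edge-server load), and a reward that trades monitoring accuracy against resource cost; the workflow set, state encoding, and parameters are hypothetical and are not taken from the REAM implementation.

    # Minimal sketch (not the REAM implementation): a tabular Q-learning agent that,
    # at each control interval, selects a device/operator workflow for a community space.
    # The workflow set, state encoding, and reward weights are illustrative assumptions.
    import random
    from collections import defaultdict

    WORKFLOWS = ["low_power", "balanced", "high_fidelity"]   # hypothetical workflow choices

    class AdaptiveMonitor:
        def __init__(self, alpha=0.1, gamma=0.9, epsilon=0.1, cost_weight=0.5):
            self.q = defaultdict(float)            # Q[(state, workflow)] action values
            self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon
            self.cost_weight = cost_weight         # accuracy vs. resource-cost trade-off

        def select(self, state):
            if random.random() < self.epsilon:     # occasionally explore
                return random.choice(WORKFLOWS)
            return max(WORKFLOWS, key=lambda w: self.q[(state, w)])

        def reward(self, accuracy, resource_cost):
            # Reward high monitoring accuracy, penalize edge-server resource consumption.
            return accuracy - self.cost_weight * resource_cost

        def update(self, state, workflow, reward, next_state):
            best_next = max(self.q[(next_state, w)] for w in WORKFLOWS)
            key = (state, workflow)
            self.q[key] += self.alpha * (reward + self.gamma * best_next - self.q[key])

In an actual deployment the state would be built from the space's current context (occupancy, ongoing events, server load) and the reward from the measured quality of information and resource consumption over the last interval.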
Award ID(s):
1952247
NSF-PAR ID:
10311281
Journal Name:
2020 IEEE International Conference on Smart Computing (SMARTCOMP)
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Mobile devices such as drones and autonomous vehicles increasingly rely on object detection (OD) through deep neural networks (DNNs) to perform critical tasks such as navigation, target tracking, and surveillance, to name a few. Due to their high complexity, the execution of these DNNs requires excessive time and energy. Low-complexity object tracking (OT) is thus used along with OD, where the latter is periodically applied to generate "fresh" references for tracking. However, the frames processed with OD incur large delays, which does not comply with real-time application requirements. Offloading OD to edge servers can mitigate this issue, but existing work focuses on the optimization of the offloading process in systems where the wireless channel has a very large capacity. Herein, we consider systems with constrained and erratic channel capacity, and establish parallel OT (at the mobile device) and OD (at the edge server) processes that are resilient to large OD latency. We propose Katch-Up, a novel tracking mechanism that improves the system's resilience to excessive OD delay. We show that this technique greatly improves the quality of the reference available to tracking, and boosts performance by up to 33%. However, while Katch-Up significantly improves performance, it also increases the computing load of the mobile device. Hence, we design SmartDet, a low-complexity controller based on deep reinforcement learning (DRL) that learns to achieve the right trade-off between resource utilization and OD performance. SmartDet takes as input highly heterogeneous context information related to the current video content and the current network conditions to optimize the frequency and type of OD offloading, as well as Katch-Up utilization. We extensively evaluate SmartDet on a real-world testbed composed of a Jetson Nano as the mobile device and a GTX 980 Ti as the edge server, connected through a Wi-Fi link, to collect several network-related traces as well as energy measurements. We consider a state-of-the-art video dataset (ILSVRC 2015 - VID) and state-of-the-art OD models (EfficientDet 0, 2, and 4). Experimental results show that SmartDet achieves an optimal balance between tracking performance, measured as mean Average Recall (mAR), and resource usage. With respect to a baseline with full Katch-Up usage and maximum channel usage, we still increase mAR by 4% while using 50% less of the channel and 30% less of the power resources associated with Katch-Up. With respect to a fixed strategy using minimal resources, we increase mAR by 20% while using Katch-Up on 1/3 of the frames.
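As a rough illustration of the detection-plus-tracking pattern described above (not the SmartDet or Katch-Up code), the sketch below runs a lightweight tracker on every frame and refreshes it with an expensive detector only every few frames; in the paper's setting the detection step is what would be offloaded to the edge server. The detect() function and Tracker class are hypothetical placeholders.

    # Illustrative sketch of periodic object detection (OD) refreshing a lightweight
    # object tracker (OT). detect() and Tracker are hypothetical placeholders; in a
    # SmartDet-like system the detection call would be offloaded to an edge server.
    def monitor(frames, detect, Tracker, refresh_every=10):
        tracker, results = None, []
        for i, frame in enumerate(frames):
            if i % refresh_every == 0 or tracker is None:
                boxes = detect(frame)             # expensive DNN detection ("fresh" reference)
                tracker = Tracker(frame, boxes)   # re-initialize tracking from new detections
            else:
                boxes = tracker.update(frame)     # cheap tracking between detections
            results.append(boxes)
        return results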
  2. Deep neural networks (DNNs) are being applied to various areas such as computer vision, autonomous vehicles, and healthcare. However, DNNs are notorious for their high computational complexity and cannot be executed efficiently on resource-constrained Internet of Things (IoT) devices. Various solutions have been proposed to handle the high computational complexity of DNNs. Offloading computing tasks of DNNs from IoT devices to cloud/edge servers is one of the most popular and promising solutions. While such remote DNN services provided by servers largely reduce computing tasks on IoT devices, it is challenging for IoT devices to inspect whether the quality of the service meets their service-level objectives (SLOs). In this paper, we address this problem and propose a novel approach named QIS (quality inspection sampling) that can efficiently inspect the quality of remote DNN services for IoT devices. To realize QIS, we design a new ID-generation method to generate data (IDs) that can identify the serving DNN models on edge servers. QIS inserts the IDs into the input data stream and implements sampling inspection of SLO violations. The experimental results show that the QIS approach can reliably inspect, with a nearly 100% success rate, the service quality of remote DNN services when the SLA level is 99.9% or lower, at the cost of only up to 0.5% overhead.
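A minimal sketch of the sampling-inspection idea (not the actual QIS implementation): probe inputs whose expected outputs are known for the claimed model are mixed into the request stream at a small rate, and the error rate observed on those probes estimates how often the service violates its objective. The make_probe() and remote_infer() functions below are assumed placeholders.

    # Illustrative sketch of sampling-based quality inspection of a remote DNN service.
    # make_probe() stands in for the ID-generation step (it returns an input with a
    # known expected result for the claimed model); remote_infer() calls the service.
    import random

    def inspect_stream(inputs, remote_infer, make_probe, sample_rate=0.01):
        outputs, probes_sent, probes_failed = [], 0, 0
        for x in inputs:
            if random.random() < sample_rate:          # occasionally interleave a probe
                probe, expected = make_probe()
                probes_sent += 1
                probes_failed += remote_infer(probe) != expected
            outputs.append(remote_infer(x))            # normal request traffic
        violation_rate = probes_failed / probes_sent if probes_sent else 0.0
        return outputs, violation_rate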
  3. The Internet of Things (IoT) is a connected network of devices that exchange data using different protocols. Applications of IoT range from smart TVs and refrigerators to smart transportation. This research aims to provide students with hands-on training on how to develop an IoT platform that supports device management, connectivity, and data management. People tend to build interconnected devices without a basic understanding of how an IoT platform backend functions. Studying Arm Pelion helps students understand how IoT devices operate under the hood. This past summer, Morgan State University hosted undergraduate engineering students and high school STEM teachers to conduct IoT security research in the Cybersecurity Assurance & Policy (CAP) Center. The research project involved integrating various hardware sensor devices and real-time data monitoring using the Arm Pelion IoT development platform. Some of the student/teacher outcomes from the project include: 1) learning about IoT technology and security; 2) programming an embedded system using the Arm Mbed development board and IDE; 3) developing a network of connected IoT devices using different protocols such as LWM2M, MQTT, and CoAP; 4) investigating the cybersecurity risks associated with the platform; and 5) using data analysis and visualization to understand the network data and packet flow. First, students and teachers must consider the IoT framework to understand how to address security. The IoT framework describes the essential functions of an IoT network, breaking it down into separate layers. These layers include an application layer, a middleware layer, and a connectivity layer. The application layer allows users to access the platform via a smartphone or any other dashboard. The middleware layer represents the backend system that provides edge devices with data management, messaging, application services, and authentication. Finally, the connectivity layer includes devices that connect the user to the network, for example over Bluetooth or Wi-Fi. The platform consists of several commercial IoT devices such as a smart camera, a baby monitor, a smart light, and other devices. We then create algorithms to classify the network data flow and to visualize the packet flow in the network and the structure of the packet data frames over time.
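The outcomes above mention connecting devices over protocols such as MQTT. As a small, generic illustration (not part of the described Pelion setup), the snippet below publishes one sensor reading to an MQTT broker using the paho-mqtt helper API; the broker hostname and topic are placeholders.

    # Generic MQTT example (not tied to the Arm Pelion platform described above):
    # publish one sensor reading to a broker using the paho-mqtt helper API.
    # The broker hostname and topic are placeholders.
    import json, time
    import paho.mqtt.publish as publish

    reading = {"sensor": "temp-01", "value": 22.4, "ts": time.time()}
    publish.single(
        topic="lab/sensors/temperature",
        payload=json.dumps(reading),
        hostname="broker.example.org",    # hypothetical MQTT broker
        port=1883,
    )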
  4. The Internet of Things (IoT) and Storage-as-a-Service (STaaS) continuum permits cost-effective maintenance of security-sensitive information collected by IoT devices over cloud systems. It is necessary to guarantee the security of sensitive data in IoT-STaaS applications. In particular, log entries trace critical events in computer systems and play a vital role in the trustworthiness of IoT-STaaS. An ideal log protection tool must be scalable and lightweight for vast quantities of resource-limited IoT devices while permitting efficient and public verification at the STaaS side. However, existing cryptographic logging schemes either impose significant computation/signature overhead on the logger or extreme storage and verification costs on the cloud. There is a critical need for a cryptographic forensic log tool that respects the efficiency requirements of the IoT-STaaS continuum. In this paper, we created novel digital signatures for logs called Optimal Signatures for secure Logging (OSLO), which are the first (to the best of our knowledge) to offer both small-constant signature and public key sizes with near-optimal signing and batch verification at various granularities. We introduce new design features such as one-time randomness management and flexible aggregation, along with various optimizations, to attain these seemingly conflicting properties simultaneously. Our experiments show that OSLO offers 50× faster verification (for 235 entries) than the most compact alternative with equal signature sizes, while also being several orders of magnitude more compact than its most logger-efficient counterparts. These properties make OSLO an ideal choice for IoT-STaaS, wherein lightweight logging and efficient batch verification of massive-size logs are vital for the IoT edge and cold storage servers, respectively.
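OSLO's aggregate constructions are not reproduced here, but the per-entry signing baseline that such schemes improve upon can be sketched with a standard signature library. The example below signs each log entry individually with Ed25519 via the pyca/cryptography package; the entry contents are placeholders, and the snippet is only a point of comparison, not OSLO itself.

    # Baseline illustration (not OSLO): sign each log entry individually with Ed25519
    # and verify the entries one by one. Aggregate schemes such as OSLO aim to shrink
    # the per-entry signature/storage cost and speed up verification of large batches.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    sk = Ed25519PrivateKey.generate()
    pk = sk.public_key()

    entries = [f"event {i}: sensor reading ok".encode() for i in range(1000)]  # placeholder log
    signatures = [sk.sign(entry) for entry in entries]   # one signature per entry (baseline cost)

    def verify_all(public_key, entries, signatures):
        try:
            for entry, sig in zip(entries, signatures):
                public_key.verify(sig, entry)            # raises InvalidSignature on mismatch
            return True
        except InvalidSignature:
            return False

    print(verify_all(pk, entries, signatures))           # True if the log is untampered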
  5.
    Edge intelligence (EI) has received a lot of interest because it can reduce latency, increase efficiency, and preserve privacy. More significantly, as the Internet of Things (IoT) has proliferated, billions of portable and embedded devices have been interconnected, producing enormous volumes of data on edge networks. Thus, there is an immediate need to push AI (artificial intelligence) breakthroughs within edge networks to achieve the full promise of edge data analytics. EI solutions have supported digital technology workloads and applications from the infrastructure level to edge networks; however, there are still many challenges with the heterogeneity of computational capabilities and the spread of information sources. We propose a novel event-driven deep-learning framework, called EDL-EI (event-driven deep learning for edge intelligence), via the design of a novel event model that defines events using correlation analysis with multiple sensors in real-world settings and incorporates multi-sensor fusion techniques, a method for transforming sensor streams into images, and lightweight 2-dimensional convolutional neural network (CNN) models. To demonstrate the feasibility of the EDL-EI framework, we present an IoT-based prototype system that we developed with multiple sensors and edge devices. To verify the proposed framework, we present a case study of air-quality scenarios based on benchmark data provided by the U.S. Environmental Protection Agency for the most polluted cities in South Korea and China. We obtained outstanding predictive accuracy (97.65% and 97.19%) from two deep-learning models on the cities' air-quality patterns. Furthermore, the air-quality changes from 2019 to 2020 were analyzed to examine the effects of the COVID-19 pandemic lockdown.
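The framework's key preprocessing step, turning windows of multi-sensor readings into image-like arrays for a lightweight 2-D CNN, can be illustrated as follows. This is a generic sketch in PyTorch under assumed dimensions (8 sensors, 64 time steps, 4 event classes); it is not the EDL-EI model itself.

    # Generic sketch (not the EDL-EI implementation): stack a window of multi-sensor
    # readings into a 2-D "image" and classify it with a small 2-D CNN.
    # The dimensions (8 sensors, 64 time steps, 4 event classes) are assumptions.
    import torch
    import torch.nn as nn

    def window_to_image(readings):
        # readings: (num_sensors, time_steps) -> (1, num_sensors, time_steps) single-channel image
        return readings.unsqueeze(0)

    class LightweightCNN(nn.Module):
        def __init__(self, num_classes=4):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Sequential(nn.Flatten(), nn.LazyLinear(num_classes))

        def forward(self, x):
            return self.classifier(self.features(x))

    model = LightweightCNN()
    batch = torch.stack([window_to_image(torch.randn(8, 64)) for _ in range(2)])  # fake sensor windows
    print(model(batch).shape)   # torch.Size([2, 4]) -> per-window event-class scores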