

Title: Reservoir: Named Data for Pervasive Computation Reuse at the Network Edge
In edge computing use cases (e.g., smart cities), where several users and devices may be in close proximity to each other, computational tasks with similar input data for the same services (e.g., image or video annotation) may be offloaded to the edge. The execution of such tasks often yields the same results (output) and thus duplicate (redundant) computation. Based on this observation, prior work has advocated for "computation reuse", a paradigm where the results of previously executed tasks are stored at the edge and are reused to satisfy incoming tasks with similar input data, instead of executing these incoming tasks from scratch. However, realizing computation reuse in practical edge computing deployments, where services may be offered by multiple (distributed) edge nodes (servers) for scalability and fault tolerance, is still largely unexplored. To tackle this challenge, in this paper, we present Reservoir, a framework to enable pervasive computation reuse at the edge while imposing marginal overheads on user devices and the operation of the edge network infrastructure. Reservoir takes advantage of Locality Sensitive Hashing (LSH) and runs on top of Named-Data Networking (NDN), extending the NDN architecture to realize the computation reuse semantics in the network. Our evaluation demonstrates that Reservoir can reuse computation with near-perfect accuracy, achieving 4.25-21.34x lower task completion times compared to cases without computation reuse.
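The LSH-based matching at the heart of computation reuse can be sketched as follows. This is a minimal illustration using random-hyperplane hashing, not Reservoir's actual implementation; the feature vectors, hash length, and reuse-table layout are all assumptions made for the example:

```python
import random

random.seed(7)

DIM = 8        # length of each task's input feature vector (illustrative)
NUM_BITS = 4   # hash length; fewer bits -> coarser buckets, more reuse

# Random hyperplanes: the sign of the dot product gives one hash bit each.
hyperplanes = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(NUM_BITS)]

def lsh_bucket(features):
    """Map a feature vector to a bucket key; similar vectors tend to collide."""
    bits = ""
    for plane in hyperplanes:
        dot = sum(f * p for f, p in zip(features, plane))
        bits += "1" if dot >= 0 else "0"
    return bits

reuse_table = {}  # bucket key -> stored result of a previously executed task

def offload(features, execute):
    """Return a cached result when a similar task was already executed."""
    key = lsh_bucket(features)
    if key in reuse_table:
        return reuse_table[key], True      # reused
    result = execute(features)
    reuse_table[key] = result
    return result, False                   # executed from scratch

task_a = [0.9, 0.1, 0.8, 0.2, 0.7, 0.3, 0.6, 0.4]
task_b = [0.88, 0.12, 0.79, 0.21, 0.71, 0.29, 0.61, 0.39]  # near-duplicate input

_, reused_a = offload(task_a, lambda f: "annotation result")
_, reused_b = offload(task_b, lambda f: "annotation result")
```

Because LSH is approximate, a near-duplicate input usually, but not always, lands in the same bucket; an identical input always does, which is what makes the reuse table a superset of an exact-match cache.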
Award ID(s):
2104700 2306685
PAR ID:
10322104
Date Published:
Journal Name:
Proceedings of the IEEE International Conference on Pervasive Computing and Communications
ISSN:
2474-249X
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. In edge computing deployments, where devices may be in close proximity to each other, these devices may offload similar computational tasks (i.e., tasks with similar input data for the same edge computing service or for services of the same nature). This results in the execution of duplicate (redundant) computation, which may become a pressing issue for future edge computing environments, since such deployments are envisioned to consist of small-scale data centers at the edge. To tackle this issue, in this paper, we highlight the importance of paradigms for the deduplication and reuse of computation at the network edge. Such paradigms have the potential to significantly reduce the completion times for offloaded tasks, accommodating more users, devices, and tasks with the same volume of deployed edge computing resources; however, they come with their own technical challenges. Finally, we present a multi-layer architecture to enable computation deduplication and reuse at the network edge and discuss open challenges and future research directions.
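The simplest form of the computation deduplication described above is an exact-match result store keyed by a fingerprint of the task. The fingerprinting scheme, service name, and store layout below are illustrative assumptions, not the multi-layer architecture from the paper:

```python
import hashlib
import json

result_store = {}  # fingerprint of (service, input) -> stored result

def fingerprint(service, input_data):
    """Exact fingerprint of a task; identical offloaded tasks collide."""
    blob = json.dumps({"service": service, "input": input_data}, sort_keys=True)
    return hashlib.sha256(blob.encode()).hexdigest()

def run_deduplicated(service, input_data, execute):
    """Execute a task only if an identical one has not been seen before."""
    key = fingerprint(service, input_data)
    if key in result_store:
        return result_store[key], True    # duplicate: reuse stored result
    result = execute(input_data)
    result_store[key] = result
    return result, False

calls = {"count": 0}
def annotate(img):
    calls["count"] += 1                   # count actual executions
    return f"labels for {img}"

r1, reused1 = run_deduplicated("annotate", "frame-001", annotate)
r2, reused2 = run_deduplicated("annotate", "frame-001", annotate)
```

Two offloads of the identical task trigger only one execution; approximate (similarity-based) reuse generalizes this by relaxing the exact-match key.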
  2. Due to the proliferation of the Internet of Things (IoT) and application/user demands that challenge communication and computation, edge computing has emerged as the paradigm that brings computing resources closer to users. In this paper, we present Whispering, an analytical model for the migration of services (service offloading) from the cloud to the edge, in order to minimize the completion time of computational tasks offloaded by user devices and improve the utilization of resources. We also empirically investigate the impact of reusing the results of previously executed tasks for the execution of newly received tasks (computation reuse) and propose an adaptive task offloading scheme between edge and cloud. Our evaluation results show that Whispering achieves up to 35% (and up to 97% when coupled with computation reuse) lower task completion times than cases where tasks are executed exclusively at the edge or the cloud.
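An adaptive edge/cloud offloading decision of this kind can be illustrated with a back-of-the-envelope completion-time comparison. The cost model below (upload time + queuing delay + execution time) and all parameter values are assumptions for illustration, not Whispering's analytical model:

```python
def completion_time(data_mb, bandwidth_mbps, cpu_cycles, cpu_rate_hz, queue_delay_s):
    """Transmission + queuing + execution time for one offloaded task."""
    transmit = (data_mb * 8) / bandwidth_mbps   # seconds to upload input data
    execute = cpu_cycles / cpu_rate_hz          # seconds of processing
    return transmit + queue_delay_s + execute

def choose_target(task, edge, cloud):
    """Pick the location with the lower estimated completion time."""
    t_edge = completion_time(task["data_mb"], edge["bw"],
                             task["cycles"], edge["cpu"], edge["queue"])
    t_cloud = completion_time(task["data_mb"], cloud["bw"],
                              task["cycles"], cloud["cpu"], cloud["queue"])
    return ("edge", t_edge) if t_edge <= t_cloud else ("cloud", t_cloud)

task = {"data_mb": 5.0, "cycles": 2e9}
edge = {"bw": 100.0, "cpu": 2e9, "queue": 0.5}   # nearby link, modest CPU, some load
cloud = {"bw": 20.0, "cpu": 8e9, "queue": 0.05}  # distant link, faster CPU

target, t = choose_target(task, edge, cloud)
```

With these numbers the edge wins (0.4 s upload + 0.5 s queue + 1.0 s execution = 1.9 s versus 2.3 s at the cloud) despite its slower CPU, because the upload dominates; computation reuse tilts the decision further by shrinking the execution term.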
  3. In an IoP environment, edge computing has been proposed to address the resource limitations of edge devices such as smartphones, as well as the high latency, user-privacy exposure, and network bottlenecks that cloud computing platform solutions incur. This paper presents a context management framework comprising sensors, mobile devices such as smartphones, and an edge server to enable high-performance, context-aware computing at the edge. Key features of this architecture include energy-efficient discovery of available sensors and edge services for the client, an automated mechanism for task planning and execution on the edge server, and a dynamic environment where new sensors and services may be added to the framework. A prototype of this architecture has been implemented, and an experimental evaluation using two computer vision tasks as example services is presented. Performance measurements show that the example tasks execute efficiently and that the proposed framework is well suited for an edge computing environment.
  4. Edge computing is an emerging computing paradigm representing decentralized and distributed information technology architecture [1]. The demand for edge computing is primarily driven by the increased number of smart devices and the Internet of Things (IoT), which generate and transmit substantial amounts of data that would otherwise be stored on cloud computing services. The edge architecture enables data storage and computation to be performed in close proximity to users and data sources and acts as the pathway toward upstream data centers [2]. Rather than sending data to the cloud for processing, the analysis and work are done closer to where the source data is generated (Figure 1). Edge services leverage local infrastructure resources, allowing for reduced network latency, improved bandwidth utilization, and better energy efficiency compared to cloud computing.
  5. Edge computing allows end-user devices to offload heavy computation to nearby edge servers for reduced latency, maximized profit, and/or minimized energy consumption. Data-dependent tasks that analyze locally acquired sensing data are among the most common candidates for task offloading in edge computing. As a result, the total latency and network load are affected by the total amount of data transferred from end-user devices to the selected edge servers. Most existing solutions for task allocation in edge computing do not take into consideration that some user tasks may actually operate on the same data items. Making the task allocation algorithm aware of the data-sharing characteristics of tasks can help reduce network load at a negligible profit loss by allocating tasks that share data to the same server. In this paper, we formulate the data sharing-aware task allocation problem, which makes decisions on task allocation for maximized profit and minimized network load by taking into account the data-sharing characteristics of tasks. In addition, because the problem is NP-hard, we design the DSTA algorithm, which finds a solution to the problem in polynomial time. We analyze the performance of the proposed algorithm against a state-of-the-art baseline that only maximizes profit. Our extensive analysis shows that DSTA leads to about 8 times lower data load on the network while achieving, on average, within 1.03 times of the total profit of the state-of-the-art baseline.
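A data sharing-aware placement of this flavor can be sketched with a simple greedy heuristic that charges each candidate server only for the data items it does not already hold. This illustrates the general idea, not the DSTA algorithm itself; the capacity model and cost function are assumptions:

```python
def greedy_allocate(tasks, servers):
    """Greedily place each task where it adds the least new data transfer.

    tasks: dict task_id -> set of data item ids the task reads
    servers: dict server_id -> capacity (max number of tasks)
    Returns (allocation, total_load), where load counts distinct data items
    transferred to each server (shared items are transferred only once).
    """
    placed = {s: [] for s in servers}
    cached = {s: set() for s in servers}   # data already present on each server
    allocation = {}
    # Place data-heavy tasks first so their items seed the server caches.
    for tid, items in sorted(tasks.items(), key=lambda kv: -len(kv[1])):
        best, best_cost = None, None
        for sid, cap in servers.items():
            if len(placed[sid]) >= cap:
                continue                   # server full, skip
            cost = len(items - cached[sid])  # new items to transfer
            if best_cost is None or cost < best_cost:
                best, best_cost = sid, cost
        placed[best].append(tid)
        cached[best] |= items
        allocation[tid] = best
    total_load = sum(len(c) for c in cached.values())
    return allocation, total_load

tasks = {"t1": {"a", "b"}, "t2": {"a", "b"}, "t3": {"c"}}
servers = {"s1": 2, "s2": 2}
alloc, load = greedy_allocate(tasks, servers)
```

Here t1 and t2 share items {a, b}, so the greedy step co-locates them and the network carries 3 distinct items instead of 5; a profit-oblivious sketch like this captures the load side of the trade-off the paper optimizes.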