

Title: Divided at the Edge - Measuring Performance and the Digital Divide of Cloud Edge Data Centers
Cloud providers are highly incentivized to reduce latency. One way they do this is by locating data centers as close to users as possible. These “cloud edge” data centers are placed in metropolitan areas and enable edge computing for residents of those cities. Which cities are selected to host edge data centers therefore determines who has the fastest access to applications requiring edge compute, creating a digital divide between those closest to and furthest from the edge. In this study we measure latency to the current and predicted cloud edge of three major cloud providers around the world. Our measurements use the RIPE Atlas platform, targeting cloud regions, AWS Local Zones, and network optimization services that minimize the path to the cloud edge. An analysis of the digital divide shows rising inequality as the relative difference between users closest to and farthest from cloud compute increases. We also find this inequality unfairly affects lower-income census tracts in the US. This result is extended globally using remotely sensed nighttime lights as a proxy for wealth. Finally, we demonstrate that low Earth orbit satellite internet can help close this digital divide and provide fairer access to the cloud edge.
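The "relative difference" divide metric described in the abstract can be sketched as a simple computation over measured round-trip times. This is a minimal illustration under the assumption that the metric is a (worst − best)/best ratio; the location names and RTT values below are hypothetical, not measurements from the paper:

```python
# Sketch of a relative-difference divide metric over measured RTTs.
# Assumption: divide = (worst - best) / best. Values are illustrative only.

def relative_divide(latencies_ms):
    """Relative gap between the best- and worst-served locations."""
    best, worst = min(latencies_ms), max(latencies_ms)
    return (worst - best) / best

rtts_ms = {
    "metro_with_local_zone": 4.0,  # probe near a hypothetical AWS Local Zone
    "nearby_metro": 18.0,          # served by a regional data center
    "distant_probe": 55.0,         # far from any cloud edge site
}
print(f"relative divide: {relative_divide(rtts_ms.values()):.2f}")
```

As edge sites are added only to well-served metros, the worst-case RTT stays flat while the best case shrinks, so this ratio grows — the "rising inequality" the abstract measures.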
Award ID(s):
2106797
PAR ID:
10548867
Author(s) / Creator(s):
;
Publisher / Repository:
ACM CoNext
Date Published:
Journal Name:
Proceedings of the ACM on Networking
Volume:
1
Issue:
CoNEXT3
ISSN:
2834-5509
Page Range / eLocation ID:
1 to 23
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Edge computing has emerged as a popular paradigm for running latency-sensitive applications due to its ability to offer lower network latencies to end-users. In this paper, we argue that despite its lower network latency, the resource-constrained nature of the edge can result in higher end-to-end latency, especially at higher utilizations, when compared to cloud data centers. We study this edge performance inversion problem through an analytic comparison of edge and cloud latencies and analyze conditions under which the edge can yield worse performance than the cloud. To verify our analytic results, we conduct a detailed experimental comparison of the edge and the cloud latencies using a realistic application and real cloud workloads. Both our analytical and experimental results show that even at moderate utilizations, the edge queuing delays can offset the benefits of lower network latencies, and even result in performance inversion where running in the cloud would provide superior latencies. We finally discuss practical implications of our results and provide insights into how application designers and service providers should design edge applications and systems to avoid these pitfalls. 
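The inversion argument above can be illustrated with a toy queueing sketch. The M/M/1 model and all numbers below are assumptions for illustration, not the paper's actual analysis: end-to-end latency is network latency plus the mean sojourn time 1/(μ − λ), and a resource-constrained edge (small μ) loses its network advantage once utilization rises.

```python
# Toy M/M/1 sketch of edge performance inversion. The model and the numbers
# (network RTTs, service rates) are illustrative assumptions, not the paper's.

def end_to_end_ms(net_ms, service_rate_rps, arrival_rps):
    """Network latency plus M/M/1 mean sojourn time 1/(mu - lambda), in ms."""
    assert arrival_rps < service_rate_rps, "queue must be stable"
    return net_ms + 1000.0 / (service_rate_rps - arrival_rps)

EDGE_NET_MS, EDGE_MU = 5.0, 100.0      # close to users, resource-constrained
CLOUD_NET_MS, CLOUD_MU = 40.0, 1000.0  # farther away, well-provisioned

for arrivals in (10.0, 90.0):  # edge utilization 10% vs. 90%
    edge = end_to_end_ms(EDGE_NET_MS, EDGE_MU, arrivals)
    cloud = end_to_end_ms(CLOUD_NET_MS, CLOUD_MU, arrivals)
    print(f"lambda={arrivals:>4} rps: edge={edge:6.1f} ms, cloud={cloud:6.1f} ms")
```

At 10 rps the edge wins comfortably; at 90 rps (90% edge utilization) the edge's queueing delay dwarfs its network-latency advantage and the cloud provides lower end-to-end latency — the performance inversion the abstract describes.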
  2. Next-generation distributed computing networks (e.g., edge and fog computing) enable the efficient delivery of delay-sensitive, compute-intensive applications by facilitating access to computation resources in close proximity to end users. Many of these applications (e.g., augmented/virtual reality) are also data-intensive: in addition to user-specific (live) data streams, they require access to shared (static) digital objects (e.g., an image database) to complete the required processing tasks. When required objects are not available at the servers hosting the associated service functions, they must be fetched from other edge locations, incurring additional communication cost and latency. In such settings, overall service delivery performance benefits from jointly optimized decisions around (i) routing paths and processing locations for live data streams, together with (ii) cache selection and distribution paths for associated digital objects. In this paper, we address the problem of dynamic control of data-intensive services over edge cloud networks. We characterize the network stability region and design the first throughput-optimal control policy that coordinates processing and routing decisions for both live and static data streams. Numerical results demonstrate the superior performance (e.g., throughput, delay, and resource consumption) obtained via the novel multi-pipeline flow control mechanism of the proposed policy, compared with state-of-the-art algorithms that lack integrated stream processing and data distribution control.
  3. In a world where the number of smart cities is growing exponentially, a myriad of IoT devices generate immense amounts of data, 24×7. Centralized cloud data centers responsible for handling this huge volume of data are being rapidly replaced with distributed edge nodes, which move computation closer to users to provide low latencies for real-time applications. The proposed enhancements capitalize on this design and offer an effective way to achieve fault tolerance in the system. The concept of docker container migration is used to provide a near-zero-downtime system on a distributed edge cloud architecture. An intuitively simple and visually attractive dashboard design is also presented in this paper for remotely accessing the edge cloud management services.
  4. Edge computing is an emerging computing paradigm representing a decentralized and distributed information technology architecture [1]. The demand for edge computing is primarily driven by the increasing number of smart devices and the Internet of Things (IoT), which generate and transmit substantial amounts of data that would otherwise be stored on cloud computing services. The edge architecture enables data storage and computation to be performed in close proximity to users and data sources, and acts as the pathway toward upstream data centers [2]. Rather than sending data to the cloud for processing, the analysis and work are done closer to where the data is generated (Figure 1). Edge services leverage local infrastructure resources, allowing for reduced network latency, improved bandwidth utilization, and better energy efficiency compared to cloud computing.
  5. Edge data centers are an appealing place for telecommunication providers to offer in-network processing such as VPN services, security monitoring, and 5G. Placing these network services closer to users can reduce latency and core network bandwidth, but the deployment of network functions at the edge poses several important challenges. Edge data centers have limited resource capacity, yet network functions are resource-intensive with strict performance requirements. Replicating services at the edge is needed to meet demand, but balancing the load across multiple servers can be challenging due to diverse service costs, server and flow heterogeneity, and dynamic workload conditions. In this paper, we design and implement EdgeBalance, a model-based load balancer for edge network data planes. EdgeBalance predicts the CPU demand of incoming traffic and adaptively distributes flows to servers to keep them evenly balanced. We overcome several challenges specific to network processing at the edge to improve throughput and latency over static load balancing and monitoring-based approaches.
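The prediction-driven balancing idea described above can be sketched roughly as follows. This is not EdgeBalance's actual model or policy: the per-flow CPU cost is taken as a given prediction, and the assignment rule is plain least-predicted-load, both stand-in assumptions for illustration:

```python
# Hedged sketch of predictive flow assignment: each new flow carries a
# predicted CPU cost and is placed on the server with the lowest predicted
# load. A stand-in for EdgeBalance's approach, not its actual policy.

class PredictiveBalancer:
    def __init__(self, num_servers):
        self.loads = [0.0] * num_servers  # predicted CPU load per server

    def assign(self, predicted_cost):
        """Place a flow on the least-loaded server; return the server index."""
        idx = min(range(len(self.loads)), key=self.loads.__getitem__)
        self.loads[idx] += predicted_cost
        return idx

lb = PredictiveBalancer(num_servers=3)
placements = [lb.assign(cost) for cost in (5.0, 2.0, 4.0, 1.0)]
print(placements, lb.loads)
```

Assigning by predicted rather than observed load avoids the lag of monitoring-based approaches: a heavy flow counts against a server the moment it is placed, not after its CPU usage is measured.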