-
We investigate a novel approach to using jitter to infer network congestion from data collected by probes in access networks. We discovered a set of features in the jitter and jitter dispersion time series (jitter dispersion is a jitter-derived time series we define in this paper) that are characteristic of periods of congestion. We leverage these concepts to create a jitter-based congestion inference framework that we call Jitterbug. We apply Jitterbug's capabilities to a wide range of traffic scenarios and discover that Jitterbug can correctly identify both recurrent and one-off congestion events. We validate Jitterbug inferences against state-of-the-art autocorrelation-based inferences of recurrent congestion. We find that the two approaches have strong congruity in their inferences, but Jitterbug also holds promise for detecting one-off as well as recurrent congestion. We identify several future directions for this research, including leveraging ML/AI techniques to optimize the performance and accuracy of this approach in operational settings.
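To make the idea concrete, here is a minimal sketch in Python of a jitter-based congestion signal. It illustrates the general concept only: the rolling-IQR definition of "jitter dispersion", the helper names, and the threshold rule are illustrative assumptions of ours, not the definitions or detection features that Jitterbug actually uses.

```python
import numpy as np

def jitter_series(rtts_ms):
    """Jitter as the absolute difference between consecutive RTT samples (ms)."""
    rtts = np.asarray(rtts_ms, dtype=float)
    return np.abs(np.diff(rtts))

def jitter_dispersion(jitter, window=30):
    """Rolling spread (inter-quartile range) of the jitter signal.

    NOTE: illustrative stand-in only; the paper defines its own
    jitter-dispersion time series, which may differ from this.
    """
    out = np.full(jitter.shape, np.nan)
    for i in range(window, len(jitter) + 1):
        w = jitter[i - window:i]
        out[i - 1] = np.percentile(w, 75) - np.percentile(w, 25)
    return out

def flag_congestion(jitter_disp, threshold_factor=3.0):
    """Flag samples whose dispersion exceeds a multiple of the series median."""
    baseline = np.nanmedian(jitter_disp)
    return jitter_disp > threshold_factor * baseline

# Example: synthetic RTTs with an elevated-variability episode in the middle.
rng = np.random.default_rng(0)
rtts = np.concatenate([
    20 + rng.normal(0, 0.5, 500),   # quiet period
    25 + rng.normal(0, 5.0, 200),   # congested period: higher, noisier RTTs
    20 + rng.normal(0, 0.5, 500),   # quiet again
])
j = jitter_series(rtts)
d = jitter_dispersion(j)
print("flagged samples:", int(np.nansum(flag_congestion(d))))
```

The intuition the sketch captures is that queue build-up during congestion makes latency samples both higher and more variable, so a sustained rise in the spread of the jitter signal marks a candidate congestion period.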
-
The goal of this paper is to offer framing for conversations about the role of measurement in informing public policy about the Internet, the barriers to gathering measurements, public policy challenges that are creating pressure for reform in this space, and recommended actions that could facilitate the gathering of measurements to support policy-making.
-
This paper is concerned with the global routing system of the Internet, its serious and persistent vulnerabilities, the multi-disciplinary considerations shaping this space, and recent trends that are finally improving its security. We report real-world evidence of the state of routing security with the goal of discussing possible options for better security. We offer two high-level conclusions. First, in designing an approach, it is all-important to understand the complexities of, and incentives for, deployment. Second, any scheme chosen for deployment will take time to have an effect, so a realistic view of how quickly change can be accomplished is essential.
-
Interconnection links, which connect broadband access providers with their peers, transit providers, and major content providers, are a potential point of discriminatory treatment and impairment of user experience. However, adequate data to shed light on this situation is lacking, and different actors can put forward opportunistic interpretations of data to support their points of view. In this article, we introduce a topology-aware model of interconnection to elucidate our own beliefs about how to measure the interconnection links of access providers and how policy-makers should interpret the results. We present six case studies that show how our conceptual model can guide a critical analysis of what is or should be measured and reported, and how to soundly interpret these measurements.
-
One foundational justification for regulatory intervention is that harms are occurring of a character that creates a public interest in mitigating them. This paper is concerned with such harms as they arise in the Internet ecosystem. Looking at news headlines over the last few years, it may seem that the range of such harms is unbounded. Hoping to add some order to the chaos, we undertake an effort to classify harms in the Internet ecosystem, in pursuit of a more or less complete taxonomy of harms. Our goal in structuring this taxonomy is to help mitigate harms in a more systematic way, as opposed to fighting an endless defensive battle against whatever happens next. The background we bring to this paper is on the one hand architectural—how the Internet ecosystem is actually structured—and on the other hand empirical—how we should measure the Internet to best understand what is happening. If everything were wonderful about the Internet today, the need to measure and understand would not be so compelling. A justification for measurement follows from its ability to shed light on problems and challenges. Sustained measurement or compelled reporting of data, together with the analysis of the collected data, generally comes at considerable effort and cost, so it must be justified by an argument that it will shed light on something important. This reasoning naturally motivates our taxonomy of things that are wrong—what we call harms. That is where we, the research community generally, and governments should focus attention. We do not intend this paper as a catalog of pessimism, but as a step toward defining an action agenda for the research community and for governments. The structure of the paper proceeds "up the layers", from technology to society. For harms that are closer to the technology, we can be more specific about the harms, about possible measurements and remedies, and about the actors that could undertake them. One motivation for this paper is that we believe the Internet ecosystem is at an inflection point. The Internet has revolutionized our ability to store, move, and process information, including information about people, and we are only at the beginning of understanding its impact on society and how to manage and mitigate the harms resulting from unregulated commercial use of these capabilities. Current events suggest that now is a point of transition from laissez-faire to regulation. However, the path to good regulation is not obvious, and now is the time for the research community to think hard about what advice to give the governments of the world, and what sort of data can back up that advice. Our highest-level goal for this paper is to contribute to a conversation along those lines.
-
There is significant interest in the technical and policy communities regarding the extent, scope, and consumer harm of persistent interdomain congestion. We provide empirical grounding for discussions of interdomain congestion by developing a system and method to measure congestion on thousands of interdomain links without direct access to them. We implement a system based on the Time Series Latency Probes (TSLP) technique that identifies links with evidence of recurring congestion suggestive of an under-provisioned link. We deploy our system at 86 vantage points worldwide and show that congestion inferred using our lightweight TSLP method correlates with other metrics of interconnection performance impairment. We use our method to study interdomain links of eight large U.S. broadband access providers from March 2016 to December 2017, and validate our inferences against ground-truth traffic statistics from two of the providers. For the period of time over which we gathered measurements, we did not find evidence of widespread endemic congestion on interdomain links between access ISPs and directly connected transit and content providers, although some such links exhibited recurring congestion patterns. We describe limitations, open challenges, and a path toward the use of this method for large-scale third-party monitoring of the Internet interconnection ecosystem.
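The TSLP technique compares latency time series measured to the near and far sides of an interdomain link: recurring (often diurnal) elevation of far-side latency that the near side does not share is the signal suggestive of an under-provisioned link. The sketch below in Python is a simplified illustration of that comparison under assumptions of ours; the helper names, the hour-of-day bucketing, and the fixed 5 ms threshold are not part of the paper's system.

```python
import numpy as np

def hourly_min(rtts_ms, timestamps_hr):
    """Minimum RTT (ms) observed in each hour-of-day bucket (0..23)."""
    mins = np.full(24, np.nan)
    for h in range(24):
        samples = rtts_ms[timestamps_hr == h]
        if samples.size:
            mins[h] = samples.min()
    return mins

def recurring_elevation(near_rtts, far_rtts, timestamps_hr, rise_ms=5.0):
    """Return hours where the far side of a link shows elevated minimum RTT
    while the near side stays near its baseline, a pattern suggestive of
    queuing at the interdomain link (illustrative heuristic only)."""
    near = hourly_min(near_rtts, timestamps_hr)
    far = hourly_min(far_rtts, timestamps_hr)
    near_base = np.nanmin(near)
    far_base = np.nanmin(far)
    elevated = (far - far_base > rise_ms) & (near - near_base < rise_ms / 2)
    return [h for h in range(24) if elevated[h]]

# Example with synthetic data: far-side RTT rises ~10 ms during evening hours.
rng = np.random.default_rng(1)
hours = rng.integers(0, 24, 5000)
near = 10 + rng.exponential(1.0, 5000)
far = 11 + rng.exponential(1.0, 5000) + np.where((hours >= 19) & (hours <= 22), 10.0, 0.0)
print("hours with far-side elevation:", recurring_elevation(near, far, hours))
```

Using minimum RTT per time bin filters out transient queuing elsewhere on the path, so a sustained rise in the far-side minimum during busy hours points at the interdomain hop itself rather than at noise closer to the vantage point.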