

Title: Inferring regional access network topologies: methods and applications
Using a toolbox of Internet cartography methods, and new ways of applying them, we have undertaken a comprehensive active measurement-driven study of the topology of U.S. regional access ISPs. We used state-of-the-art approaches in various combinations to accommodate the geographic scope, scale, and architectural richness of U.S. regional access ISPs. In addition to vantage points from research platforms, we used public WiFi hotspots and public transit of mobile devices to acquire the visibility needed to thoroughly map access networks across regions. We observed many different approaches to aggregation and redundancy, across links, nodes, buildings, and at different levels of the hierarchy. One result is substantial disparity in latency from some Edge COs to their backbone COs, with implications for end users of cloud services. Our methods and results can inform future analysis of critical infrastructure, including resilience to disasters, persistence of the digital divide, and challenges for the future of 5G and edge computing.
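The latency disparity noted above can be illustrated with a minimal sketch. All RTT values, CO names, and the disparity metric here are hypothetical assumptions for illustration, not data or code from the study:

```python
from statistics import median

# A minimal sketch of comparing edge-CO-to-backbone-CO latency within a
# region: take each edge CO's median RTT to the backbone CO and report the
# spread. RTT samples below are made up, not measurements from the paper.
def latency_disparity(edge_rtts_ms):
    """Max minus min of per-edge-CO median RTTs (ms) to the backbone CO."""
    medians = [median(samples) for samples in edge_rtts_ms.values()]
    return max(medians) - min(medians)

# Hypothetical RTT samples (ms) for two edge COs in one region
region_a = {"edge-1": [4.1, 4.3, 4.0], "edge-2": [21.7, 22.9, 20.8]}
print(f"disparity: {latency_disparity(region_a):.1f} ms")
```

A large spread means users homed to some edge COs sit much farther (in latency terms) from cloud on-ramps than others in the same region.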
Award ID(s):
1724853 1901517
NSF-PAR ID:
10351118
Author(s) / Creator(s):
Date Published:
Journal Name:
Proceedings of the 21st ACM Internet Measurement Conference (IMC '21)
Page Range / eLocation ID:
720 to 738
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Microorganisms are ubiquitous in the biosphere, playing a crucial role in both the biogeochemistry of the planet and human health. However, identifying these microorganisms and defining their functions are challenging. Widely used approaches in comparative metagenomics, 16S amplicon sequencing and whole genome shotgun sequencing (WGS), have made DNA sequencing analysis accessible for identifying microorganisms and evaluating diversity and abundance in various environments. However, advances in parallel high-throughput DNA sequencing in the past decade have introduced major hurdles, namely standardization of methods, data storage, reproducible interoperability of results, and data sharing. The National Ecological Observatory Network (NEON), established by the National Science Foundation, enables all researchers to address queries on a regional to continental scale around a variety of environmental challenges and provides high-quality, integrated, and standardized data from field sites across the U.S. As the amount of metagenomic data continues to grow, standardized procedures that allow results across projects to be assessed and compared are becoming increasingly important in the field of metagenomics. We demonstrate the feasibility of using publicly available NEON soil metagenomic sequencing datasets in combination with the open-access Metagenomics Rapid Annotation using Subsystems Technology (MG-RAST) server to illustrate the advantages of WGS compared to 16S amplicon sequencing. Four WGS and four 16S amplicon sequence datasets, from surface soil samples prepared by NEON investigators using standardized protocols and collected at the same locations in Colorado between April and July 2014, were selected for comparison. The dominant bacterial phyla detected across samples agreed between sequencing methodologies.
However, WGS yielded greater microbial resolution, increased accuracy, and allowed identification of more genera of bacteria, archaea, viruses, and eukaryota, and putative functional genes that would have gone undetected using 16S amplicon sequencing. NEON open data will be useful for future studies characterizing and quantifying complex ecological processes associated with changing aquatic and terrestrial ecosystems. 
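The comparison described above reduces, at the taxon level, to set operations over detected genera. The genus names below are hypothetical placeholders, not results from the NEON datasets:

```python
# A minimal sketch of the kind of taxon-level WGS-vs-16S comparison
# described above, using made-up genus sets (not the study's results).
wgs_genera = {"Bradyrhizobium", "Streptomyces", "Nitrososphaera", "Mycobacterium"}
amplicon_16s_genera = {"Bradyrhizobium", "Streptomyces"}

shared = wgs_genera & amplicon_16s_genera
wgs_only = wgs_genera - amplicon_16s_genera  # taxa 16S would have missed

print(f"shared genera:   {sorted(shared)}")
print(f"WGS-only genera: {sorted(wgs_only)}")
```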
  2. The Border Gateway Protocol (BGP) is the protocol that networks use to exchange (announce) routing information across the Internet. Unfortunately, BGP has no mechanism to prevent unauthorized announcement of network addresses, also known as prefix hijacks. Since the 1990s, the primary means of protecting against unauthorized origin announcements has been the use of routing information databases, so that networks can verify prefix origin information they receive from their neighbors in BGP messages. In the 1990s, operators deployed databases now collectively known as the Internet Routing Registry (IRR), which depend on voluntary (although sometimes contractually required) contribution of routing information without strict (or sometimes any) validation. Coverage, accuracy, and use of these databases remain inconsistent across ISPs and over time. In 2012, after years of debate over approaches to improving routing security, the operator community deployed an alternative known as the Resource Public Key Infrastructure (RPKI). The RPKI includes cryptographic attestation of records, including expiration dates, with each Regional Internet Registry (RIR) operating as a "root" of trust. Similar to the IRR, operators can use the RPKI to discard routing messages that do not pass origin validation checks. But the additional integrity comes with complexity and cost. Furthermore, operational and legal implications of potential malfunctions have limited registration in and use of the RPKI. In response, some networks have redoubled their efforts to improve the accuracy of IRR registration data. These two technologies are now operating in parallel, along with the option of doing nothing at all to validate routes. Although RPKI use is growing, its limited coverage means that security-conscious operators may query both IRR and RPKI databases to maximize routing security.
However, IRR information may be inaccurate due to improper hygiene, such as not updating the origin information after changes in routing policy or prefix ownership. Since RPKI uses a stricter registration and validation process, we use it as a baseline against which to compare the trends in accuracy and coverage of IRR data. 
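The origin validation check described above can be sketched compactly. This follows the standard valid/invalid/not-found semantics of RPKI route origin validation; the ROA data and ASNs below are illustrative examples, not records from any registry:

```python
import ipaddress

# A minimal sketch of RPKI route origin validation: a ROA authorizes an
# origin ASN to announce a prefix up to a maximum length. Example data only.
ROAS = [
    # (authorized prefix, max length, authorized origin ASN)
    (ipaddress.ip_network("192.0.2.0/24"), 24, 64500),
]

def validate_origin(prefix: str, origin_asn: int) -> str:
    """Classify a BGP announcement as 'valid', 'invalid', or 'not-found'."""
    net = ipaddress.ip_network(prefix)
    covered = False
    for roa_net, max_len, roa_asn in ROAS:
        # The ROA covers the announcement if the prefix falls inside it.
        if net.subnet_of(roa_net):
            covered = True
            if origin_asn == roa_asn and net.prefixlen <= max_len:
                return "valid"
    # Covered by a ROA but no matching (ASN, length): a potential hijack.
    return "invalid" if covered else "not-found"

print(validate_origin("192.0.2.0/24", 64500))     # valid
print(validate_origin("192.0.2.0/25", 64500))     # invalid (too specific)
print(validate_origin("192.0.2.0/24", 64501))     # invalid (wrong origin)
print(validate_origin("198.51.100.0/24", 64500))  # not-found
```

The "not-found" state is why limited RPKI coverage matters: an uncovered prefix cannot be protected, which pushes operators back to the IRR as a second source.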
  3. Abstract
    Aim

    Populations of cold‐adapted species at the trailing edges of geographic ranges are particularly vulnerable to the negative effects of climate change from the combination of exposure to warm temperatures and high sensitivity to heat. Many of these species are predicted to decline under future climate scenarios, but they could persist if they can adapt to warming climates either physiologically or behaviourally. We aim to understand local variation in contemporary habitat use and use this information to identify signs of adaptive capacity. We focus on moose (Alces alces), a charismatic species of conservation and public interest.

    Location

    The northeastern United States, along the trailing edge of the moose geographic range in North America.

    Methods

    We compiled data on occurrences and habitat use of moose from remote cameras and GPS collars across the northeastern United States. We use these data to build habitat suitability models at local and regional spatial scales and then to predict future habitat suitability under climate change. We also use fine‐scale GPS data to model relationships between habitat use and temperature on a daily temporal scale and to predict future habitat use.
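The temperature–habitat-use relationship described above is commonly modeled with a logistic link. The functional form and coefficients below are my illustrative assumptions, not the paper's fitted model:

```python
import math

# A minimal sketch of a temperature-dependent habitat-use model: the
# probability of using a habitat declines logistically with temperature.
# Coefficients are made up for illustration (not fitted to moose data).
def use_probability(temp_c, b0=2.0, b_temp=-0.15):
    """P(use) under a hypothetical logistic model: warmer -> lower use."""
    return 1.0 / (1.0 + math.exp(-(b0 + b_temp * temp_c)))

for t in (5, 15, 25):
    print(f"{t:>2} C: P(use) = {use_probability(t):.2f}")
```

A behavioural response like the wetland shift reported below would appear as an interaction term (habitat type x temperature) in such a model.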

    Results

    We find that habitat suitability for moose will decline under a range of climate change scenarios. However, moose across the region differ in their use of climatic and habitat space, indicating that they could exhibit adaptive capacity. We also find evidence for behavioural responses to weather, where moose increase their use of forested wetland habitats in warmer places and/or times.

    Main conclusions

    Our results suggest that there will be significant shifts in moose distribution due to climate change. However, if there is spatial variation in thermal tolerance, trailing‐edge populations could adapt to climate change. We highlight that prioritizing certain habitats for conservation (i.e., thermal refuges) could be crucial for this adaptation.

     
  4. Abstract. Land surface modellers need measurable proxies to constrain the quantity of carbon dioxide (CO2) assimilated by continental plants through photosynthesis, known as gross primary production (GPP). Carbonyl sulfide (COS), which is taken up by leaves through their stomates and then hydrolysed by photosynthetic enzymes, is a candidate GPP proxy. A former study with the ORCHIDEE land surface model used a fixed ratio of COS uptake to CO2 uptake normalised to respective ambient concentrations for each vegetation type (leaf relative uptake, LRU) to compute vegetation COS fluxes from GPP. The LRU approach is known to have limited accuracy since the LRU ratio changes with variables such as photosynthetically active radiation (PAR): while CO2 uptake slows under low light, COS uptake is not light limited. However, the LRU approach has been popular for COS–GPP proxy studies because of its ease of application and apparent low contribution to uncertainty for regional-scale applications. In this study we refined the COS–GPP relationship and implemented in ORCHIDEE a mechanistic model that describes COS uptake by continental vegetation. We compared the simulated COS fluxes against measured hourly COS fluxes at two sites and studied the model behaviour and links with environmental drivers. We performed simulations at a global scale, and we estimated the global COS uptake by vegetation to be −756 Gg S yr−1, in the middle range of former studies (−490 to −1335 Gg S yr−1). Based on monthly mean fluxes simulated by the mechanistic approach in ORCHIDEE, we derived new LRU values for the different vegetation types, ranging between 0.92 and 1.72, close to recently published averages for observed values of 1.21 for C4 and 1.68 for C3 plants.
We transported the COS using the monthly vegetation COS fluxes derived from both the mechanistic and the LRU approaches, and we evaluated the simulated COS concentrations at NOAA sites. Although the mechanistic approach was more appropriate when comparing to high-temporal-resolution COS flux measurements, both approaches gave similar results when transporting with monthly COS fluxes and evaluating COS concentrations at stations. In our study, uncertainties between these two approaches are of secondary importance compared to the uncertainties in the COS global budget, which are currently a limiting factor in the potential of COS concentrations to constrain GPP simulated by land surface models on the global scale.
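The LRU proxy described above, F_COS = LRU x GPP x ([COS]/[CO2]), can be sketched numerically. The input values and the unit conventions below are my assumptions for illustration, not the paper's code:

```python
# A minimal sketch of the leaf-relative-uptake (LRU) proxy: vegetation COS
# uptake is estimated from GPP via a fixed LRU ratio and the ratio of
# ambient mole fractions. All numbers below are illustrative assumptions.
def cos_uptake_from_gpp(gpp_umol_co2, lru, cos_ppt, co2_ppm):
    """COS uptake (pmol m-2 s-1) from GPP (umol CO2 m-2 s-1).

    F_COS = LRU * GPP * ([COS]/[CO2]), with both mole fractions
    converted to the same (dimensionless) units before taking the ratio.
    """
    ratio = (cos_ppt * 1e-12) / (co2_ppm * 1e-6)  # dimensionless ratio
    return lru * gpp_umol_co2 * 1e6 * ratio       # umol -> pmol: * 1e6

# Example: GPP = 20 umol m-2 s-1, LRU = 1.68 (the C3 average quoted
# above), 500 ppt ambient COS, 410 ppm ambient CO2 (made-up conditions).
print(f"{cos_uptake_from_gpp(20.0, 1.68, 500.0, 410.0):.1f} pmol COS m-2 s-1")
```

Because the ratio is fixed, this proxy cannot capture the light dependence that the mechanistic model addresses: under low PAR, GPP falls while COS uptake does not, so the true instantaneous LRU rises.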
  5. Public cloud platforms are vital in supporting online applications for remote learning and telecommuting during the COVID-19 pandemic. The network performance between cloud regions and access networks directly impacts application performance and users' quality of experience (QoE). However, the location and network connectivity of vantage points often limit the visibility of edge-based measurement platforms (e.g., RIPE Atlas). We designed and implemented the CLoud-based Applications Speed Platform (CLASP) to measure performance to various networks from virtual machines in cloud regions with speed test servers that have been widely deployed on the Internet. In our five-month longitudinal measurements in Google Cloud Platform (GCP), we found that 30-70% of ISPs we measured showed severe throughput degradation from the peak throughput of the day.
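The "degradation from daily peak throughput" metric mentioned above can be sketched as follows; the sample values are made up, and CLASP's exact definition of "severe" degradation may differ:

```python
# A minimal sketch of a degradation-from-peak metric: the fraction of the
# day's peak throughput lost at the worst sample. Hypothetical data only.
def degradation_from_peak(samples_mbps):
    """Fraction of the daily peak throughput lost at the worst sample."""
    peak = max(samples_mbps)
    return (peak - min(samples_mbps)) / peak

hourly = [940, 920, 880, 610, 450, 700, 900]  # Mbps over one day (made up)
print(f"degradation: {degradation_from_peak(hourly):.0%}")
```

A value in the 0.3-0.7 range for an ISP's daily series would correspond to the 30-70% degradation band reported in the abstract.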