Title: Internet measurements on EdgeNet
We describe the deployment of an Internet measurement experiment to three testbeds that offer Linux containers hosted at widely distributed vantage points: the well-established PlanetLab Central and PlanetLab Europe platforms, and the new EdgeNet platform. The experiment results were published in the proceedings of ACM IMC 2018. We compare the capabilities of each testbed and their effect on the ease of deployment of the experiment. Because the software for this experiment has several library dependencies and requires a recent compiler, it was easiest to deploy on EdgeNet, which is based on Docker and Kubernetes. This extended abstract is accompanied by a demonstration of the reproducible deployment of a measurement tool on EdgeNet.
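The paper's own measurement software is not reproduced here. As a hedged illustration of the kind of probe each containerized vantage point might run once deployed on a testbed like EdgeNet, here is a minimal stdlib-only sketch that measures TCP connect latency to a target; the function name `tcp_connect_rtt` and its parameters are illustrative assumptions, not the paper's tool:

```python
import socket
import time

def tcp_connect_rtt(host, port, timeout=3.0):
    """Measure the TCP connect round-trip time to (host, port) in
    milliseconds from this vantage point. Returns None if the
    connection fails or times out."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return (time.monotonic() - start) * 1000.0
    except OSError:
        return None

if __name__ == "__main__":
    # Each container would iterate over a target list and report results
    # to a central collector; here we just print one sample measurement.
    rtt = tcp_connect_rtt("example.com", 80)
    print("RTT (ms):", rtt)
```

A script like this has no compiler or library dependencies, which sidesteps exactly the deployment friction the abstract describes; the real experiment, with its native-code dependencies, is where a Docker image pins the toolchain and libraries for reproducible deployment.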
Award ID(s):
1820901
PAR ID:
10097310
Author(s) / Creator(s):
Date Published:
Journal Name:
CNERT: COMPUTER AND NETWORKING EXPERIMENTAL RESEARCH USING TESTBEDS
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. The DAMIC experiment employs large-area, thick charge-coupled devices (CCDs) to search for the interactions of low-mass dark matter particles in the galactic halo with silicon atoms in the CCD target. From 2017 to 2019, DAMIC collected data with a seven-CCD array (40-gram target) installed in the SNOLAB underground laboratory. We report dark-matter search results, including a conspicuous excess of events above the background model below 200 eV_ee, whose origin remains unknown. We present details of the published spectral analysis and provide an update on the deployment of skipper CCDs to perform a more precise measurement by early 2023.
  2. Abstract The Accelerator Neutrino Neutron Interaction Experiment (ANNIE) is a 26-ton water Cherenkov neutrino detector installed on the Booster Neutrino Beam (BNB) at Fermilab. Its main physics goals are to perform a measurement of the neutron yield from neutrino-nucleus interactions, as well as a measurement of the charged-current cross section of muon neutrinos. An equally important focus is the research and development of new detector technologies and target media. Specifically, water-based liquid scintillator (WbLS) is of interest as a novel detector medium, as it allows for the simultaneous detection of Cherenkov light and scintillation. This paper presents the deployment of a 366 L WbLS vessel in ANNIE in March 2023 and the subsequent detection of both Cherenkov light and scintillation from the WbLS. This proof-of-concept allows for the future development of reconstruction and particle identification algorithms in ANNIE, as well as dedicated analyses within the WbLS volume, such as the search for neutral-current events and the hadronic scintillation component. 
  3. We present an experimental and simulation-based investigation of the temporal evolution of light emission from a thin, laser-ionized helium plasma source. We demonstrate an analytic model, supported by experiment, that calculates the approximate scaling of the time-integrated, on-axis light emission with the initial plasma density and temperature, improving the understanding of plasma light measurement for plasma wakefield accelerator (PWFA) plasma sources. Our model simulates the plasma density and temperature using a split-step Fourier code and a particle-in-cell code. A fluid simulation then models the plasma and neutral density, and the electron temperature, as functions of time and position. We then present numerical results for the space- and time-resolved light emission and show that collisional excitation is the dominant source of light emission. We validate our model by measuring the light emitted by a laser-ionized plasma using a novel statistical method capable of resolving the nanosecond-scale temporal dynamics of the plasma light with a cost-effective camera that has microsecond-scale timing jitter. This method is well suited to deployment in the high-radiation environment of a particle accelerator, which precludes the use of expensive nanosecond-gated cameras. Our results show that our models can effectively simulate the dynamics of a thin, laser-ionized plasma source. In addition, this work provides a detailed understanding of the plasma light measurement, which is one of the few diagnostic signals available for the direct measurement of PWFA plasma sources.
  4. With the development and wide deployment of measurement equipment, data can be automatically measured and visualized for situation awareness in power systems. However, the cyber security of power systems is also threatened by data-spoofing attacks. This letter proposes a measurement data source authentication (MDSA) algorithm based on feature-extraction techniques, including ensemble empirical mode decomposition (EEMD) and the fast Fourier transform (FFT), together with machine learning for real-time measurement data classification. Compared with previous work, the proposed algorithm achieves higher MDSA accuracy using a shorter window of data from closely located synchrophasor measurement sensors.
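The EEMD stage and the letter's actual classifier are beyond a short sketch, but the FFT-style feature-extraction step over a window of sensor samples can be illustrated with a stdlib-only discrete Fourier transform; `spectral_features` and its two toy features are illustrative assumptions, not the algorithm from the letter:

```python
import cmath
import math

def dft_magnitudes(window):
    """Naive DFT magnitude spectrum (first half only) of a sample
    window, using only the standard library."""
    n = len(window)
    return [abs(sum(window[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(n // 2)]

def spectral_features(window):
    """Toy feature vector for a classifier: the dominant-frequency
    bin and the total spectral energy of the window."""
    mags = dft_magnitudes(window)
    peak_bin = max(range(len(mags)), key=mags.__getitem__)
    energy = sum(m * m for m in mags)
    return peak_bin, energy
```

In a scheme like the one described, features of this kind computed over short synchrophasor windows would be fed to a trained classifier that decides whether the data stream matches its claimed source; in practice one would use an FFT library rather than this O(n²) DFT.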
  5. Inspired by prior work suggesting undetected errors were becoming a problem on the Internet, we set out to create a measurement system to detect errors that the TCP checksum missed. We designed a client-server framework in which the servers sent known files to clients. We then compared the received data with the original file to identify undetected errors introduced by the network. We deployed this measurement framework on various public testbeds. Over the course of 9 months, we transferred a total of 26 petabytes of data. Scaling the measurement framework to capture a large number of errors proved to be a challenge. This paper focuses on the challenges encountered during the deployment of the measurement system. We also present the interim results, which suggest that the error problems seen in prior works may be caused by two distinct processes: (1) errors that slip past TCP and (2) file system failures. The interim results also suggest that the measurement system needs to be adjusted to collect exabytes of measurement data, rather than the petabytes that prior studies predicted. 
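The core comparison in a framework like this is straightforward: the client checks the bytes it received against the known original to find corruption that slipped past the TCP checksum. A minimal sketch, assuming chunked comparison (the chunk size, function name, and return convention are illustrative, not the paper's implementation):

```python
CHUNK = 1 << 20  # 1 MiB comparison granularity (an assumed choice)

def find_corrupt_chunks(original, received, chunk=CHUNK):
    """Compare a received byte string against the known original and
    return the indices of chunks whose contents differ. Any mismatch
    found here was not caught by the TCP checksum (or was introduced
    later, e.g. by a file system failure)."""
    bad = []
    for i in range(0, max(len(original), len(received)), chunk):
        if original[i:i + chunk] != received[i:i + chunk]:
            bad.append(i // chunk)
    return bad
```

Comparing chunk-by-chunk rather than whole files lets the system localize an error within a multi-gigabyte transfer and log just the damaged region, which matters when the framework must scale to petabytes (or, per the interim results, exabytes) of transferred data.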