Title: Estimating the storage term in eddy covariance measurements: the ICOS methodology
Abstract: In eddy covariance measurements, the storage flux represents the change over time of the dry molar fraction of a given gas within the control volume representative of the turbulent flux. Depending on the time scale considered and on the measurement height above ground, it can be either a major component of the overall net ecosystem exchange or nearly negligible. Instrumental configuration and computational procedures must be optimized to measure this change at the time step used for the turbulent flux measurement. Three configurations are suitable for storage flux determination within the Integrated Carbon Observation System (ICOS) infrastructure: separate sampling, subsequent sampling and mixed sampling. Each has its own advantages and disadvantages, and the choice must be based on the specific features of the station considered. In this paper, guidelines on the number and distribution of vertical and horizontal sampling points are given. Details about suitable instruments, sampling devices and computational procedures for quantifying the storage flux of different greenhouse gases are also provided.
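The storage term described above can be sketched numerically: the time change of the dry molar fraction is integrated over the vertical layers of the control volume below the turbulent-flux measurement height. The following is a minimal illustration only, not the ICOS reference procedure; the layer profile, the ideal-gas conversion to molar density, and all numbers in the example are assumptions for demonstration.

```python
# Hedged sketch of a discretized storage-flux estimate (illustrative only,
# not the ICOS implementation). Each layer contributes the time change of
# its dry molar fraction times the air molar density times its thickness.

R = 8.314  # universal gas constant, J mol-1 K-1


def storage_flux(chi_start, chi_end, dt, layer_thickness, pressure, temperature):
    """Approximate storage flux in umol m-2 s-1.

    chi_start, chi_end : per-layer dry molar fractions (umol mol-1) at the
                         start and end of the averaging interval
    dt                 : averaging interval (s)
    layer_thickness    : thickness of each layer (m)
    pressure           : ambient pressure (Pa)
    temperature        : ambient temperature (K)
    """
    molar_density = pressure / (R * temperature)  # mol m-3, ideal gas
    flux = 0.0
    for c0, c1, dz in zip(chi_start, chi_end, layer_thickness):
        flux += molar_density * (c1 - c0) / dt * dz
    return flux


# Assumed example: CO2 rising by 2 umol mol-1 over 30 min in three layers
print(round(storage_flux([400.0, 405.0, 410.0], [402.0, 407.0, 412.0],
                         1800.0, [5.0, 10.0, 15.0], 101325.0, 293.15), 3))
```

A per-layer loop like this also makes it easy to weight unevenly spaced sampling points, which is why the guidelines on vertical point distribution matter.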
Award ID(s):
1724433
PAR ID:
10376516
Author(s) / Creator(s):
Date Published:
Journal Name:
International Agrophysics
Volume:
32
Issue:
4
ISSN:
2300-8725
Page Range / eLocation ID:
551 to 567
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract. Low-level flights over tundra wetlands in Alaska and Canada have been conducted during the Airborne Measurements of Methane Emissions (AirMeth) campaigns to measure turbulent methane fluxes in the atmosphere. In this paper we describe the instrumentation and new calibration procedures for the essential pressure parameters required for turbulence sensing by aircraft, which exploit suitable regular measurement flight legs without the need for dedicated calibration patterns. We estimate the accuracy of the mean wind and the turbulence measurements. We show that airborne measurements of turbulent fluxes of methane and carbon dioxide using cavity ring-down spectroscopy trace gas analysers together with established turbulence equipment achieve a relative accuracy similar to that of measurements of sensible heat flux if applied during low-level flights over natural area sources. The inertial subrange of the trace gas fluctuations cannot be resolved due to insufficient high-frequency precision of the analyser but, since this scatter is uncorrelated with the vertical wind velocity, the covariance and thus the flux are reproduced correctly. In the covariance spectra the −7/3 drop-off in the inertial subrange can be reproduced if sufficient data are available for averaging. For convective conditions and flight legs of several tens of kilometres we estimate the flux detection limit to be about 4 mg m−2 d−1 for the methane flux w′CH4′, 1.4 g m−2 d−1 for the CO2 flux w′CO2′, and 4.2 W m−2 for the sensible heat flux.
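The key statistical point in this abstract, that uncorrelated analyser noise inflates the scalar variance but leaves the flux (the covariance with vertical wind) unbiased, can be demonstrated with synthetic data. This is an assumed toy setup, not the AirMeth processing chain; all signal and noise magnitudes are invented for illustration.

```python
# Toy demonstration (assumed setup): white analyser noise added to the
# scalar is uncorrelated with w, so the covariance w'c' (the flux) is
# essentially unchanged even though the scalar variance is inflated.
import random

random.seed(0)
n = 200_000
w = [random.gauss(0.0, 0.5) for _ in range(n)]              # w' (m s-1)
c_true = [0.1 * wi + random.gauss(0.0, 0.02) for wi in w]   # correlated scalar
c_meas = [ci + random.gauss(0.0, 0.5) for ci in c_true]     # + analyser noise


def cov(x, y):
    """Sample covariance of two equal-length series."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    return sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / len(x)


flux_true = cov(w, c_true)   # ~0.1 * var(w)
flux_meas = cov(w, c_meas)   # nearly identical despite large scalar noise
print(abs(flux_meas - flux_true) < 0.005)
```

The scatter in `c_meas` is an order of magnitude larger than the true fluctuations, yet the covariance survives, which mirrors the paper's argument for why the flux is reproduced correctly.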
  2. Abstract The rise of exascale supercomputing has motivated an increase in high‐fidelity computational fluid dynamics (CFD) simulations. The detail in these simulations, often involving shape‐dependent, time‐variant flow domains and low‐speed, complex, turbulent flows, is essential for fueling innovations in fields like wind, civil, automotive, or aerospace engineering. However, the massive amount of data these simulations produce can overwhelm storage systems and negatively affect conventional data management and postprocessing workflows, including iterative procedures such as design space exploration, optimization, and uncertainty quantification. This study proposes a novel sampling method harnessing the signed distance function (SDF) concept, SDF‐biased flow importance sampling (BiFIS), and implicit compression based on implicit neural network representations for transforming large‐size, shape‐dependent flow fields into reduced‐size, shape‐agnostic images. Designed to alleviate the above‐mentioned problems, our approach achieves near‐lossless compression of a bridge aerodynamics forced‐vibration simulation while maintaining low reproduction errors, which is unachievable with other sampling approaches. Our approach also allows for real‐time analysis and visualization of these massive simulations and does not involve decompression preprocessing steps that yield full simulation data again. Given that image sampling is a fundamental step for any image‐based flow field prediction model, the proposed BiFIS method can significantly improve the accuracy and efficiency of such models, helping any application that relies on precise flow field predictions. The BiFIS code is available on GitHub.
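The signed-distance-function idea at the heart of BiFIS can be illustrated in a few lines: an SDF is negative inside a shape, zero on its boundary and positive outside, so its magnitude can bias sampling toward the flow-critical near-wall region. This is a hypothetical sketch, not the paper's implementation; the circle geometry and the exponential weighting are assumptions for illustration.

```python
# Minimal SDF illustration (assumed geometry, not the BiFIS code): the
# signed distance to a circle, and a sampling weight that favours points
# near the boundary where |SDF| is small.
import math


def sdf_circle(x, y, cx=0.0, cy=0.0, r=1.0):
    """Signed distance to a circle of radius r centred at (cx, cy)."""
    return math.hypot(x - cx, y - cy) - r


def sampling_weight(x, y, sharpness=4.0):
    """Higher weight close to the boundary, decaying away from it."""
    return math.exp(-sharpness * abs(sdf_circle(x, y)))


print(sdf_circle(2.0, 0.0))   # outside: positive distance to the boundary
print(sdf_circle(0.0, 0.0))   # centre: negative (inside the shape)
print(sampling_weight(1.0, 0.0) > sampling_weight(3.0, 0.0))  # near wall favoured
```

Because the SDF is defined for any shape, a weight like this yields a shape-agnostic sampling rule, which is the property the abstract exploits to produce shape-agnostic images.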
  3. Radiative transfer through clouds can be impacted not only by variations in particle number size distribution but also by variations in particle spatial distribution. Due to turbulent mixing and inertial effects, spatial correlations often exist, even on scales reaching the cloud droplet separation distance. The resulting clusters and voids within the droplet field can lead to deviations from exponential extinction. Prior work has numerically investigated these departures from exponential attenuation in absorptive and scattering media; this work takes a step towards determining the feasibility of detecting departures from exponential behavior due to spatial correlation in turbulent clouds generated in a laboratory setting. Large Eddy Simulation (LES) is used to mimic turbulent mixing clouds generated in a laboratory convection cloud chamber. Light propagation through the resulting polydisperse and spatially correlated particle fields is explored via Monte Carlo ray tracing simulations. The key finding is that both mean radiative flux and standard deviation about the mean differ when correlations exist, suggesting that an experiment using a laboratory convection cloud chamber could be designed to investigate non-exponential behavior. Total forward flux is largely unchanged (due to scattering being highly forward-dominant for the size parameters considered), allowing it to be used for conditional sampling based on optical thickness. Direct and diffuse forward flux means are modified by approximately one standard deviation. Standard deviations of diffuse forward and backward fluxes are strongly enhanced, suggesting that fluctuations in the scattered light are a more sensitive metric to consider. The results also suggest the possibility that measurements of radiative transfer could be used to infer the strength and scales of correlations in a turbulent cloud, indicating entrainment and mixing effects.
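Why clustering breaks exponential extinction can be shown with a toy model: for an uncorrelated medium the mean direct transmission follows Beer-Lambert, exp(−τ), while redistributing the same droplets into dense patches and voids (same mean optical depth) raises the mean transmission, because the average of exponentials exceeds the exponential of the average. The two-state cloud below is an assumed caricature, not the paper's LES/ray-tracing setup.

```python
# Toy model (assumed setup): mean direct transmission through a medium with
# mean optical depth tau_mean. Uncorrelated case: every ray sees tau_mean.
# Clustered case: half the rays hit a dense patch (2 * tau_mean), half hit
# a void (0), so the mean optical depth is unchanged but the mean
# transmission is enhanced (Jensen's inequality).
import math
import random

random.seed(1)
tau_mean = 2.0
trials = 100_000

t_uniform = math.exp(-tau_mean)  # Beer-Lambert prediction

t_clustered = sum(
    math.exp(-(2 * tau_mean if random.random() < 0.5 else 0.0))
    for _ in range(trials)
) / trials

print(round(t_uniform, 3))
print(t_clustered > t_uniform)   # clustering enhances mean transmission
```

The analytic value for the clustered case is 0.5·(1 + exp(−2τ)), far above exp(−τ), which is the kind of departure the proposed chamber experiment would try to detect.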
  4. Chechik, M.; Katoen, JP.; Leucker, M. (Ed.)
    Efficient verification algorithms for neural networks often depend on various abstract domains such as intervals, zonotopes, and linear star sets. The choice of abstract domain presents an expressiveness vs. scalability trade-off: simpler domains are less precise but yield faster algorithms. This paper investigates the octatope abstract domain in the context of neural network verification. Octatopes are affine transformations of n-dimensional octagons, i.e., of sets defined by unit-two-variable-per-inequality (UTVPI) constraints. Octatopes generalize zonotopes, which can be viewed as affine transformations of a box; on the other hand, octatopes can be considered a restriction of linear star sets, which are affine transformations of arbitrary H-polytopes. This places octatopes firmly between zonotopes and star sets in expressive power, but what about the efficiency of decision procedures? An important analysis problem for neural networks is exact range computation: computing the exact set of possible outputs given a set of possible inputs. This requires three operations: 1) optimization of a linear cost function; 2) affine mapping; and 3) over-approximation of the intersection with a half-space. Zonotopes admit efficient solutions for all three, whereas star sets handle them via linear programming. We show that these operations are faster for octatopes than for the more expressive linear star sets: for octatopes, we reduce them to min-cost flow problems, which can be solved in strongly polynomial time using the out-of-kilter algorithm. Evaluating exact range computation on several ACAS Xu neural network benchmarks, we find that octatopes show promise as a practical abstract domain for neural network verification.
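The baseline the octatope work improves on can be made concrete for the simplest of the three operations: optimizing a linear cost over a zonotope {c + Ga : a ∈ [−1, 1]^m} needs no linear program, since the maximum of w·x is w·c plus the sum of |w·g_i| over the generators. This is a standard zonotope identity sketched for illustration; the octatope min-cost-flow reduction itself is beyond this snippet.

```python
# Exact linear optimization over a zonotope (standard identity, shown as a
# sketch): max of w.x over {center + G a : a in [-1, 1]^m} is attained by
# choosing the sign of each coordinate a_i to match the sign of w.g_i.

def zonotope_max(weights, center, generators):
    """Exact maximum of sum(w_i * x_i) over the zonotope (center, generators)."""
    best = sum(w * c for w, c in zip(weights, center))
    for g in generators:
        best += abs(sum(w * gi for w, gi in zip(weights, g)))
    return best


# Assumed 2-D example: centre (1, 1), generators (1, 0) and (1, 1);
# maximising the x-coordinate.
print(zonotope_max([1.0, 0.0], [1.0, 1.0], [[1.0, 0.0], [1.0, 1.0]]))
```

Each generator contributes independently, so the cost is linear in the number of generators; for star sets the analogous query already requires an LP, which is the efficiency gap the octatope domain targets from the more expressive side.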
  5. Agriculture is being called upon to increase carbon (C) storage in soils to reduce greenhouse gas (GHG) accumulation in the atmosphere. Cropping systems research can be used to support GHG mitigation efforts, but we must quantify land management impacts using appropriate assumptions and unambiguous methods. Soil C sequestration is considered temporary because it can be re-emitted as carbon dioxide (CO2) if the practice responsible for it is not maintained and/or the soil–plant system is disturbed, for example, as the result of changing climate. Because of this, the climate benefit of soil C sequestration depends on the time that C is held out of the atmosphere. When assessing the net GHG impact of management practices, soil C storage is often aggregated with non-CO2 (N2O and CH4) emissions after converting all components to CO2 equivalents (CO2e) and assuming a given time horizon (TH), in what is known as stock change accounting. However, such analyses do not consider potential re-emission of soil C or apply consistent assumptions about time horizons. Here, we demonstrate that tonne-year accounting provides a more conservative estimate of the emissions offsetting potential of soil C storage compared to stock change accounting. Tonne-year accounting can be used to reconcile differences in the context and timeframes of soil C sequestration and non-CO2 GHG emissions. The approach can be applied post hoc to commonly observed cropping systems data to estimate GHG emissions offsets associated with agricultural land management over given THs and with more clearly defined assumptions.
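The contrast between the two accounting schemes can be sketched with one simple tonne-year variant (a Moura-Costa-style equivalence, assumed here for illustration; the paper's exact formulation may differ): a tonne of CO2 held out of the atmosphere for only part of the time horizon is credited proportionally, whereas stock change accounting counts the full tonne.

```python
# Hedged sketch of a simple tonne-year credit (assumed Moura-Costa-style
# proportionality, not necessarily the paper's formulation): storage for
# years_stored out of a time horizon TH earns years_stored / TH of the
# credit a permanent tonne would receive.

def tonne_year_offset(tonnes_co2, years_stored, time_horizon=100):
    """CO2e offset credited for temporary storage under tonne-year accounting."""
    return tonnes_co2 * min(years_stored, time_horizon) / time_horizon


stock_change = 10.0                        # stock change: 10 t CO2 in full
tonne_year = tonne_year_offset(10.0, 20)   # held only 20 of 100 years
print(tonne_year)
print(tonne_year < stock_change)           # tonne-year is more conservative
```

Under these assumed numbers the tonne-year credit is a fifth of the stock-change figure, which illustrates the paper's point that tonne-year accounting yields the more conservative offset estimate.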