

Title: Real-time quality control of optical backscattering data from Biogeochemical-Argo floats [version 1; peer review: awaiting peer review]
Background: Biogeochemical-Argo floats are collecting an unprecedented number of profiles of optical backscattering measurements in the global ocean. Backscattering (BBP) data are crucial to understanding ocean particle dynamics and the biological carbon pump. Yet, so far, no procedures have been agreed upon to quality control BBP data in real time. Methods: Here, we present a new suite of real-time quality-control tests and apply them to the current global BBP Argo dataset. The tests were developed by expert BBP users and Argo data managers and have been implemented on a snapshot of the entire Argo dataset. Results: The new tests are able to automatically flag most of the “bad” BBP profiles from the raw dataset. Conclusions: The proposed tests have been approved by the Biogeochemical-Argo Data Management Team and will be implemented by the Argo Data Assembly Centres to deliver real-time quality-controlled profiles of optical backscattering. Provided they reach a pressure of about 1000 dbar, these tests could also be applied to BBP profiles collected by other platforms.
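To illustrate the general shape of a real-time QC test of this kind, here is a minimal sketch of a global-range check on a BBP profile. The threshold values and the specific test logic below are assumptions for illustration only; the actual tests and limits are those defined by the authors and the Argo data-management documentation. Flags follow the Argo convention of 1 = good and 4 = bad.

```python
def global_range_test(bbp, min_bbp=-1e-4, max_bbp=0.1):
    """Return an Argo-style QC flag (1 = good, 4 = bad) for each BBP value.

    bbp: sequence of particulate backscattering values in m^-1.
    min_bbp / max_bbp: illustrative plausibility bounds, not the
    paper's actual thresholds.
    """
    return [1 if min_bbp <= v <= max_bbp else 4 for v in bbp]

# A toy profile with a negative spike and an implausibly high value:
profile = [0.0005, 0.0012, -0.02, 0.0008, 0.35]
flags = global_range_test(profile)
# The two out-of-range values receive flag 4; the rest receive flag 1.
```

In real-time operation, a check like this would run at the Data Assembly Centre as each profile arrives, before any delayed-mode human inspection.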
Award ID(s):
1946578 2110258
NSF-PAR ID:
10378033
Author(s) / Creator(s):
; ; ; ; ; ; ; ; ; ; ; ; ; ; ; ;
Date Published:
Journal Name:
Open Research Europe
Volume:
2
Issue:
118
ISSN:
2732-5121
Page Range / eLocation ID:
https://open-research-europe.ec.europa.eu/articles
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Millions of in situ ocean temperature profiles have been collected historically using various instrument types with varying sensor accuracy and then assembled into global databases. These are essential to our current understanding of the changing state of the oceans, sea level, Earth’s climate, marine ecosystems and fisheries, and for constraining model projections of future change that underpin mitigation and adaptation solutions. Profiles distributed shortly after collection are also widely used in operational applications such as real-time monitoring and forecasting of the ocean state and weather prediction. Before use in scientific or societal service applications, quality control (QC) procedures need to be applied to flag and ultimately remove erroneous data. Automatic QC (AQC) checks are vital to the timeliness of operational applications and for reducing the volume of dubious data which later require QC processing by a human for delayed-mode applications. Despite the large suite of evolving AQC checks developed by institutions worldwide, the most effective set of AQC checks was not known. We have developed a framework to assess the performance of AQC checks, under the auspices of the International Quality Controlled Ocean Database (IQuOD) project. The IQuOD-AQC framework is an open-source collaborative software infrastructure built in Python (available from https://github.com/IQuOD). Sixty AQC checks have been implemented in this framework. Their performance was benchmarked against three reference datasets which contained a spectrum of instrument types and error modes flagged in their profiles. One of these (a subset of the Quality-controlled Ocean Temperature Archive (QuOTA) dataset that had been manually inspected for quality issues by its creators) was also used to identify optimal sets of AQC checks. Results suggest that the AQC checks are effective for most historical data, but less so in the case of data from Mechanical Bathythermographs (MBTs), and much less effective for Argo data. The optimal AQC sets will be applied to generate quality flags for the next release of the IQuOD dataset. This will further elevate the quality and historical value of millions of temperature profile data which have already been improved by IQuOD intelligent metadata and observational uncertainty information (https://doi.org/10.7289/v51r6nsf).
  2. Global estimates of absolute velocities can be derived from Argo float trajectories during drift at parking depth. A new velocity dataset developed and maintained at Scripps Institution of Oceanography is presented based on all Core, Biogeochemical, and Deep Argo float trajectories collected between 2001 and 2020. Discrepancies between velocity estimates from the Scripps dataset and other existing products, including YoMaHa and ANDRO, are associated with quality control criteria, as well as selected parking depth and cycle time. In the Scripps product, over 1.3 million velocity estimates are used to reconstruct a time-mean velocity field for the 800–1200 dbar layer at 1° horizontal resolution. This dataset provides a benchmark to evaluate the veracity of the BRAN2020 reanalysis in representing the observed variability of absolute velocities and offers a compelling opportunity for improved characterization and representation in forecast and reanalysis systems. Significance Statement: The aim of this study is to provide observation-based estimates of the large-scale, subsurface ocean circulation. We exploit the drift of autonomous profiling floats to carefully isolate the inferred circulation at the parking depth, and combine observations from over 11 000 floats, sampling between 2001 and 2020, to deliver a new dataset with unprecedented accuracy. The new estimates of subsurface currents are suitable for assessing global models, reanalyses, and forecasts, and for constraining ocean circulation in data-assimilating models.
  3. Since the mid-2000s, the Argo oceanographic observational network has provided near-real-time four-dimensional data for the global ocean for the first time in history. Internet (i.e., the “web”) applications that handle the more than two million Argo profiles of ocean temperature, salinity, and pressure are an active area of development. This paper introduces a new and efficient interactive Argo data visualization and delivery web application named Argovis that is built on a classic three-tier design consisting of a front end, back end, and database. Together these components allow users to navigate 4D data on a world map of Argo floats, with the option to select a custom region, depth range, and time period. Argovis’s back end sends data to users in a simple format, and the front end quickly renders web-quality figures. More advanced applications query Argovis from other programming environments, such as Python, R, and MATLAB. Our Argovis architecture allows expert data users to build their own functionality for specific applications, such as the creation of spatially gridded data for a given time and advanced time–frequency analysis for a space–time selection. Argovis is aimed at both scientists and the public, with tutorials and examples available on the website, describing how to use the Argovis data delivery system—for example, how to plot profiles in a region over time or to monitor profile metadata.
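As a rough illustration of querying Argovis from a scripting environment, the sketch below builds a region/time selection URL. The endpoint path and parameter names (`/api/profiles`, `shape`, `startDate`, `endDate`) are assumptions for illustration, not the documented Argovis API; consult the Argovis tutorials for the actual query interface.

```python
from urllib.parse import urlencode

# Assumed base URL and endpoint; the real Argovis API may differ.
BASE = "https://argovis.colorado.edu/api"

def build_region_query(shape, start_date, end_date):
    """Build a hypothetical profile-selection URL for a lon/lat polygon
    and a date range. All parameter names here are illustrative."""
    params = {"shape": shape, "startDate": start_date, "endDate": end_date}
    return f"{BASE}/profiles?{urlencode(params)}"

# Select profiles in a small North Atlantic box during January 2020:
url = build_region_query("[[-30,40],[-20,40],[-20,50],[-30,50]]",
                         "2020-01-01", "2020-01-31")
```

The returned JSON could then be loaded with any HTTP client (e.g. `requests`) and plotted, which is the workflow the abstract describes for Python, R, and MATLAB users.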