-
Abstract: Biased, incomplete numerical models are often used to forecast the states of complex dynamical systems by mapping an estimate of the "true" initial state into model phase space, making a forecast, and then mapping back to the "true" space. While advances have been made in reducing errors associated with model initialization and model forecasts, we lack a general framework for discovering optimal mappings between "true" dynamical systems and model phase spaces. Here, we propose a data-driven approach to infer these maps. Our approach consistently reduces errors in the Lorenz-96 system when using an imperfect model constructed to produce significant model errors relative to a reference configuration. Optimal pre- and post-processing transforms leverage "shocks" and "drifts" in the imperfect model to make more skillful forecasts of the reference system. The machine-learning architecture, built from neural networks with a custom analog-adjoint layer, makes the approach generalizable across applications.
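A minimal sketch of the forecast-with-mappings idea described above, in Python with NumPy. The Lorenz-96 tendency and the Runge-Kutta integrator are standard; the `pre_map`/`post_map` callables, the reduced forcing used to mimic an imperfect model, and all parameter values are illustrative assumptions, not the paper's learned neural-network transforms or analog-adjoint layer.

```python
import numpy as np

def lorenz96_tendency(x, forcing=8.0):
    """Standard Lorenz-96 tendency: dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def rk4_step(x, dt, forcing):
    """One fourth-order Runge-Kutta step of the Lorenz-96 system."""
    k1 = lorenz96_tendency(x, forcing)
    k2 = lorenz96_tendency(x + 0.5 * dt * k1, forcing)
    k3 = lorenz96_tendency(x + 0.5 * dt * k2, forcing)
    k4 = lorenz96_tendency(x + dt * k3, forcing)
    return x + dt * (k1 + 2.0 * k2 + 2.0 * k3 + k4) / 6.0

def forecast_with_maps(x_true, pre_map, post_map, steps, dt=0.05, forcing=6.0):
    """Map a 'true' state into the imperfect model's phase space, forecast, map back.

    `pre_map` and `post_map` stand in for the learned transforms; here they are
    arbitrary callables. The reduced forcing (6 instead of the reference 8) is an
    assumed way to create model error for illustration.
    """
    z = pre_map(x_true)                  # true space -> model phase space
    for _ in range(steps):
        z = rk4_step(z, dt, forcing)     # imperfect-model forecast
    return post_map(z)                   # model phase space -> true space

# Usage with identity maps as placeholders for the learned transforms:
x0 = 8.0 + 0.1 * np.random.default_rng(0).standard_normal(40)
identity = lambda x: x
forecast = forecast_with_maps(x0, identity, identity, steps=20)
```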
-
Abstract: The ensemble forecast dominates the computational cost of many data assimilation methods, especially for high-resolution and coupled models. In situations where the cost is prohibitive, one can use a lower-cost model, a lower-cost data assimilation method, or both. Ensemble optimal interpolation (EnOI) is a classical example of a lower-cost ensemble data assimilation method that replaces the ensemble forecast with a single forecast and then constructs an ensemble about this single forecast by adding perturbations drawn from climatology. This research develops lower-cost ensemble data assimilation methods that add perturbations to a single forecast, where the perturbations are obtained from analogs of the single model forecast. These analogs can be found in a catalog of model states, constructed from linear combinations of catalog states, or produced by generative machine-learning methods. Four analog ensemble data assimilation methods, including two new ones, are compared with EnOI in the context of a coupled model of intermediate complexity: Q-GCM. Depending on the method and on the physical variable, analog methods can be up to 40% more accurate than EnOI.
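The analog-ensemble construction can be illustrated with a short sketch: perturbations about a single forecast are taken from its nearest neighbors in a catalog of model states. The distance metric, member count, and synthetic catalog below are assumptions for illustration; the linear-combination and generative variants, and EnOI itself, are not reproduced.

```python
import numpy as np

def analog_ensemble(forecast, catalog, n_members):
    """Construct an ensemble about a single forecast from catalog analogs.

    Finds the catalog states closest to the forecast (Euclidean distance,
    an assumed metric) and uses their deviations from the analog mean as
    perturbations added to the forecast.
    """
    dists = np.linalg.norm(catalog - forecast, axis=1)   # distance to every catalog state
    analog_idx = np.argsort(dists)[:n_members]           # indices of the closest analogs
    analogs = catalog[analog_idx]
    perturbations = analogs - analogs.mean(axis=0)       # centered analog deviations
    return forecast + perturbations                      # shape: (n_members, state_dim)

# Usage on synthetic data: a catalog of 500 model states of dimension 100.
rng = np.random.default_rng(1)
catalog = rng.standard_normal((500, 100))
single_forecast = catalog[0] + 0.1
ensemble = analog_ensemble(single_forecast, catalog, n_members=20)
```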
-
Ensemble Kalman filters are an efficient class of algorithms for large-scale ensemble data assimilation, but their performance is limited by their underlying Gaussian approximation. A two-step framework for ensemble data assimilation allows this approximation to be relaxed: the first step updates the ensemble in observation space, while the second step regresses the observation-space update onto the state variables. This paper develops a new quantile-conserving ensemble filter based on kernel-density estimation and quadrature for the scalar first step of the two-step framework. It is shown to perform well in idealized non-Gaussian problems, as well as in an idealized model of assimilating observations of sea-ice concentration.
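A rough sketch of the two-step idea, assuming a Gaussian KDE for the scalar observation-space prior, a Gaussian observation-error likelihood, cumulative sums as a crude quadrature, and ordinary least-squares regression for the second step; the paper's specific kernel, quadrature, and filter details are not reproduced.

```python
import numpy as np
from scipy.stats import gaussian_kde, norm

def quantile_conserving_update(prior_obs, y_obs, obs_err_sd, grid_pts=2000):
    """Scalar observation-space update that conserves each member's quantile.

    Builds prior and posterior CDFs on a grid and moves every ensemble member
    so that its posterior quantile equals its prior quantile.
    """
    kde = gaussian_kde(prior_obs)
    lo = prior_obs.min() - 4.0 * prior_obs.std()
    hi = prior_obs.max() + 4.0 * prior_obs.std()
    grid = np.linspace(lo, hi, grid_pts)

    prior_pdf = kde(grid)
    post_pdf = prior_pdf * norm.pdf(y_obs, loc=grid, scale=obs_err_sd)  # prior x likelihood

    # Discrete CDFs by cumulative sums on the grid (crude quadrature), normalized to 1.
    prior_cdf = np.cumsum(prior_pdf); prior_cdf /= prior_cdf[-1]
    post_cdf = np.cumsum(post_pdf); post_cdf /= post_cdf[-1]

    # Each member keeps its prior quantile under the posterior distribution.
    quantiles = np.interp(prior_obs, grid, prior_cdf)
    return np.interp(quantiles, post_cdf, grid)

def regress_increments(state_ens, prior_obs, posterior_obs):
    """Second step: linearly regress observation-space increments onto each state variable."""
    obs_inc = posterior_obs - prior_obs
    obs_anom = prior_obs - prior_obs.mean()
    state_anom = state_ens - state_ens.mean(axis=0)
    slope = state_anom.T @ obs_anom / (obs_anom @ obs_anom)   # regression coefficients
    return state_ens + np.outer(obs_inc, slope)

# Usage on a synthetic, skewed (non-Gaussian) observed quantity:
rng = np.random.default_rng(0)
state_ens = rng.standard_normal((40, 10))          # 40 members, 10 state variables
prior_obs = np.exp(state_ens[:, 0])                # skewed forward-operator output
post_obs = quantile_conserving_update(prior_obs, y_obs=1.5, obs_err_sd=0.3)
state_post = regress_increments(state_ens, prior_obs, post_obs)
```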
