Abstract
State estimation in multi-layer turbulent flow fields with only a single layer of partial observation remains a challenging yet practically important task. Applications include inferring the state of the deep ocean from surface observations. Directly implementing an ensemble Kalman filter based on the full forecast model is usually expensive. One widely used method in practice projects the information of the observed layer onto other layers via linear regression. However, large errors appear when nonlinearity in the highly turbulent flow field becomes dominant. In this paper, we develop a multi-step nonlinear data assimilation method that applies nonlinear assimilation steps sequentially across layers. Unlike traditional linear regression approaches, a conditional Gaussian nonlinear system is adopted as the approximate forecast model to characterize the nonlinear dependence between adjacent layers. At each step, samples drawn from the posterior of the current layer are treated as pseudo-observations for the next layer. Each sample is assimilated using analytic formulae for the posterior mean and covariance, and the resulting Gaussian posteriors are aggregated into a Gaussian mixture. The method can therefore capture strongly turbulent features, particularly intermittency and extreme events, and more accurately quantify the inherent uncertainty. Applications to the two-layer quasi-geostrophic system with Lagrangian data assimilation demonstrate that the multi-step method outperforms the one-step method, particularly as the tracer number and ensemble size increase. Results also show that the multi-step conditional Gaussian data assimilation (CGDA) method is particularly effective for assimilating frequent, high-accuracy observations, scenarios in which traditional EnKF methods may suffer from catastrophic filter divergence.
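The layer-by-layer idea above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes a linear-Gaussian conditional link between layers (matrices `H12`, `R12` below are illustrative placeholders), and it uses the standard analytic Kalman update for each pseudo-observation before aggregating the per-sample posteriors into an equally weighted Gaussian mixture.

```python
import numpy as np

def gaussian_update(mu, P, y, H, R):
    """Analytic Gaussian (Kalman) update: posterior mean and covariance."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    mu_post = mu + K @ (y - H @ mu)
    P_post = (np.eye(len(mu)) - K @ H) @ P
    return mu_post, P_post

rng = np.random.default_rng(0)

# Step 1: assimilate the actual observation into the observed layer.
mu1, P1 = np.zeros(2), np.eye(2)           # prior for the observed layer
H1, R1 = np.eye(2), 0.1 * np.eye(2)        # observation operator / noise
y = np.array([1.0, -0.5])
mu1_post, P1_post = gaussian_update(mu1, P1, y, H1, R1)

# Step 2: each posterior sample of layer 1 acts as a pseudo-observation
# for layer 2, linked through an assumed conditional map H12 with noise R12.
mu2, P2 = np.zeros(2), np.eye(2)           # prior for the unobserved layer
H12, R12 = 0.8 * np.eye(2), 0.2 * np.eye(2)
n_samples = 50
mixture = []                               # (mean, cov) pairs, equal weights
for y_pseudo in rng.multivariate_normal(mu1_post, P1_post, size=n_samples):
    m, C = gaussian_update(mu2, P2, y_pseudo, H12, R12)
    mixture.append((m, C))

# The Gaussian-mixture mean aggregates the per-sample posteriors.
mixture_mean = np.mean([m for m, _ in mixture], axis=0)
```

In a multi-layer setting the same step repeats downward: samples from the layer-2 mixture become pseudo-observations for layer 3, and so on.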
A Quantile-Conserving Ensemble Filter Based on Kernel-Density Estimation
Ensemble Kalman filters are an efficient class of algorithms for large-scale ensemble data assimilation, but their performance is limited by their underlying Gaussian approximation. A two-step framework for ensemble data assimilation allows this approximation to be relaxed: the first step updates the ensemble in observation space, while the second step regresses the observation-space update back to the state variables. This paper develops a new quantile-conserving ensemble filter based on kernel-density estimation and quadrature for the scalar first step of the two-step framework. It is shown to perform well in idealized non-Gaussian problems, as well as in an idealized model of assimilating observations of sea-ice concentration.
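The scalar first step described above can be sketched as follows. This is a simplified, assumed reading of the quantile-conserving idea rather than the paper's exact algorithm: the prior density is a Gaussian kernel-density estimate of the ensemble, the posterior is prior times likelihood normalized by grid quadrature, and each member is moved so that its posterior quantile equals its prior quantile.

```python
import numpy as np

def qcef_kde_update(ens, y_obs, obs_err_sd, n_grid=400):
    """Quantile-conserving scalar update via KDE and quadrature."""
    ens = np.asarray(ens, dtype=float)
    h = 1.06 * ens.std() * len(ens) ** (-0.2)      # Silverman bandwidth
    x = np.linspace(ens.min() - 4 * h - 4 * obs_err_sd,
                    ens.max() + 4 * h + 4 * obs_err_sd, n_grid)
    dx = x[1] - x[0]
    # KDE prior on the grid, normalized by quadrature.
    prior = np.exp(-0.5 * ((x[:, None] - ens[None, :]) / h) ** 2).sum(axis=1)
    prior /= prior.sum() * dx
    # Gaussian observation likelihood and normalized posterior.
    like = np.exp(-0.5 * ((x - y_obs) / obs_err_sd) ** 2)
    post = prior * like
    post /= post.sum() * dx
    # CDFs by cumulative quadrature; map members quantile-to-quantile.
    F_prior = np.cumsum(prior) * dx
    F_post = np.cumsum(post) * dx
    q = np.interp(ens, x, F_prior)                 # each member's prior quantile
    return np.interp(q, F_post, x)                 # same quantile under posterior

# Bimodal (non-Gaussian) prior ensemble pulled toward an observation at 2.0.
rng = np.random.default_rng(1)
prior_ens = np.concatenate([rng.normal(-2, 0.5, 40), rng.normal(2, 0.5, 40)])
post_ens = qcef_kde_update(prior_ens, y_obs=2.0, obs_err_sd=0.5)
```

Because the update is a monotone map through the two CDFs, the ensemble's rank order is preserved, which is what makes the second (regression) step of the two-step framework well behaved.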
- Award ID(s): 2152814
- PAR ID: 10529841
- Publisher / Repository: MDPI
- Date Published:
- Journal Name: Remote Sensing
- Volume: 16
- Issue: 13
- ISSN: 2072-4292
- Page Range / eLocation ID: 2377
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Abstract
For data assimilation to provide faithful state estimates for dynamical models, specifications of observation uncertainty need to be as accurate as possible. Innovation-based methods, such as Desroziers diagnostics, are commonly used to estimate observation uncertainty, but such methods can depend greatly on the prescribed background uncertainty. For ensemble data assimilation, this uncertainty comes from statistics calculated from ensemble forecasts, which require inflation and localization to address undersampling. In this work, we use an ensemble Kalman filter (EnKF) with a low-dimensional Lorenz model to investigate the interplay between the Desroziers method and inflation. Two inflation techniques are used for this purpose: 1) a rigorously tuned fixed multiplicative scheme and 2) an adaptive state-space scheme. We document how inaccuracies in observation uncertainty affect errors in EnKF posteriors and study the combined impacts of misspecified initial observation uncertainty, sampling error, and model error on Desroziers estimates. We find that whether observation uncertainty is over- or underestimated greatly affects the stability of data assimilation and the accuracy of Desroziers estimates, and that preference should be given to initial overestimates. Inline Desroziers estimates tend to remove the dependence between ensemble spread-skill and the initially prescribed observation error. In addition, we find that the inclusion of model error introduces spurious correlations in observation uncertainty estimates. Further, we note that the adaptive inflation scheme is less robust than fixed inflation at mitigating multiple sources of error. Last, sampling error strongly exacerbates existing sources of error and greatly degrades EnKF estimates, which translates into biased Desroziers estimates of observation error covariance.
Significance Statement
To generate accurate predictions of various components of the Earth system, numerical models require an accurate specification of state variables at the current time. This step adopts a probabilistic comparison of the current state estimate against information provided by environmental measurements of the true state. Various strategies exist for estimating uncertainty in observations within this framework, but they are sensitive to a host of assumptions, which are investigated in this study.
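The Desroziers diagnostic at the center of this abstract estimates the observation error covariance from the statistics of background innovations and analysis residuals, R ≈ E[(y − Hxₐ)(y − Hx_b)ᵀ]. A minimal synthetic check of that identity, assuming H = I and an optimal linear analysis:

```python
import numpy as np

rng = np.random.default_rng(2)
n_obs, n_cycles = 3, 5000
R_true = np.diag([0.5, 1.0, 2.0])       # true observation error covariance
B = np.eye(n_obs)                        # background error covariance (H = I)

# Optimal Kalman gain for this simple setting.
K = B @ np.linalg.inv(B + R_true)

# Synthetic cycles with truth = 0: draw background and observation errors,
# form the analysis, and collect innovation / residual pairs.
d_ob, d_oa = [], []
for _ in range(n_cycles):
    xb = rng.multivariate_normal(np.zeros(n_obs), B)
    y = rng.multivariate_normal(np.zeros(n_obs), R_true)
    xa = xb + K @ (y - xb)
    d_ob.append(y - xb)                  # background innovation
    d_oa.append(y - xa)                  # analysis residual
d_ob, d_oa = np.array(d_ob), np.array(d_oa)

# Desroziers estimate: R_hat = E[d_oa d_ob^T], here a sample average.
R_hat = (d_oa.T @ d_ob) / n_cycles
```

The identity holds exactly only when the gain (and hence the background covariance) is correct, which is precisely why the abstract finds the diagnostic so sensitive to inflation and misspecified priors.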
Abstract
Obtaining a faithful probabilistic depiction of moist convection is complicated by unknown errors in subgrid-scale physical parameterization schemes, invalid assumptions made by data assimilation (DA) techniques, and high system dimensionality. As an initial step toward untangling sources of uncertainty in convective weather regimes, we evaluate a novel Bayesian data assimilation methodology based on particle filtering within a WRF ensemble analysis and forecasting system. Unlike most geophysical DA methods, the particle filter (PF) represents prior and posterior error distributions nonparametrically rather than assuming a Gaussian distribution and can accept any type of likelihood function. This approach is known to reduce bias introduced by Gaussian approximations in low-dimensional and idealized contexts. The form of PF used in this research adopts a dimension-reduction strategy, making it affordable for typical weather applications. The present study examines posterior ensemble members and forecasts for select severe weather events between 2019 and 2020, comparing results from the PF with those from an ensemble Kalman filter (EnKF). We find that assimilating with a PF produces posterior quantities for microphysical variables that are more consistent with model climatology than comparable quantities from an EnKF, which we attribute to a reduction in DA bias. These differences are significant enough to impact the dynamic evolution of convective systems via cold pool strength and propagation, with impacts on forecast verification scores depending on the particular microphysics scheme. Our findings have broad implications for future approaches to the selection of physical parameterization schemes and parameter estimation within preexisting data assimilation frameworks.
Significance Statement
The accurate prediction of severe storms using numerical weather models depends on effective parameterization schemes for small-scale processes and the assimilation of incomplete observational data in a manner that faithfully represents the probabilistic state of the atmosphere. Current generation methods for data assimilation typically assume a standard form for the error distributions of relevant quantities, which can introduce bias that not only hinders numerical prediction but can also confound the characterization of errors from the model itself. The current study performs data assimilation using a novel method that does not make such assumptions and explores characteristics of resulting model fields and forecasts that might make such a method useful for improving model parameterization schemes.
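The core nonparametric update the abstract contrasts with the EnKF is the particle-filter analysis step: weight each prior particle by the observation likelihood, then resample. A minimal bootstrap sketch for a scalar state (the dimension-reduction and localization machinery of the actual method is omitted):

```python
import numpy as np

def pf_update(particles, y_obs, obs_err_sd, rng):
    """One bootstrap particle-filter analysis step: weight particles by
    a Gaussian observation likelihood, then systematically resample."""
    w = np.exp(-0.5 * ((particles - y_obs) / obs_err_sd) ** 2)
    w /= w.sum()
    n = len(particles)
    # Systematic resampling: one uniform draw, evenly spaced positions.
    positions = (rng.random() + np.arange(n)) / n
    idx = np.searchsorted(np.cumsum(w), positions)
    idx = np.minimum(idx, n - 1)       # guard against float round-off
    return particles[idx]

rng = np.random.default_rng(3)
prior = rng.normal(0.0, 2.0, size=2000)       # prior sample, truth unknown
posterior = pf_update(prior, y_obs=3.0, obs_err_sd=0.5, rng=rng)
```

No Gaussian assumption enters the update itself; the prior sample can be arbitrarily non-Gaussian, which is the property the study exploits for microphysical variables with strongly skewed climatologies.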
Abstract
Particle filters avoid parametric estimates for Bayesian posterior densities, which alleviates Gaussian assumptions in nonlinear regimes. These methods, however, are more sensitive to sampling errors than Gaussian-based techniques such as ensemble Kalman filters. A recent study by the authors introduced an iterative strategy for moment-matching particle filters, in which iterations improve the filter's ability to draw samples from non-Gaussian posterior densities. The iterations follow from a factorization of particle weights, providing a natural framework for combining particle filters with alternative filters to mitigate the impact of sampling errors. The current study introduces a novel approach to forming an adaptive hybrid data assimilation methodology, exploiting the theoretical strengths of nonparametric and parametric filters. At each data assimilation cycle, the iterative particle filter performs a sequence of updates while the prior sample distribution is non-Gaussian; an ensemble Kalman filter then provides the final adjustment when Gaussian distributions for marginal quantities are detected. The method employs the Shapiro-Wilk test, which has outstanding power for detecting departures from normality, to determine when to make the transition between filter algorithms. Experiments using low-dimensional models demonstrate that the approach has significant value, especially for nonhomogeneous observation networks and unknown model process errors. Moreover, hybrid factors are extended to consider marginals of more than one collocated variable using a test for multivariate normality. Findings from this study motivate the use of the proposed method for geophysical problems characterized by diverse observation networks and various dynamic instabilities, such as numerical weather prediction models.
Significance Statement
Data assimilation statistically processes observation errors and model forecast errors to provide optimal initial conditions for the forecast, playing a critical role in numerical weather forecasting. The ensemble Kalman filter, which has been widely adopted and developed in many operational centers, assumes Gaussianity of the prior distribution and solves a linear system of equations, leading to bias in strongly nonlinear regimes. On the other hand, particle filters avoid many of those assumptions but are sensitive to sampling errors and are computationally expensive. We propose an adaptive hybrid strategy that combines the advantages and minimizes the disadvantages of the two methods. The hybrid particle filter-ensemble Kalman filter uses the Shapiro-Wilk test to detect the Gaussianity of the ensemble members and determine the timing of the transition between these filter updates. Demonstrations in this study show that the proposed method is advantageous when observations are heterogeneous and when the model has an unknown bias. Furthermore, by extending the statistical hypothesis test to a test for multivariate normality, we consider marginals of more than one collocated variable. These results encourage further testing on real geophysical problems characterized by various dynamic instabilities, such as real numerical weather prediction models.
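The gating logic described above, switching between filter updates based on a normality test, can be sketched with SciPy's Shapiro-Wilk implementation. This is an assumed simplification: the function name `choose_update` and the scalar-marginal setting are illustrative, and the paper's method applies the test per marginal inside an iterative cycle rather than as a one-shot decision.

```python
import numpy as np
from scipy.stats import shapiro   # Shapiro-Wilk test for normality

def choose_update(ens, alpha=0.05):
    """Hypothetical gate between filter updates: if Shapiro-Wilk cannot
    reject normality at level alpha, hand off to the parametric EnKF;
    otherwise keep iterating the particle filter."""
    _, p_value = shapiro(ens)
    return "enkf" if p_value > alpha else "particle_filter"

rng = np.random.default_rng(4)
gaussian_ens = rng.normal(0.0, 1.0, size=200)   # Gaussian marginal
skewed_ens = rng.exponential(1.0, size=200)     # strongly non-Gaussian marginal
```

A higher `alpha` makes the hand-off to the EnKF more conservative, keeping the particle filter active longer; the multivariate extension mentioned in the abstract replaces `shapiro` with a multivariate normality test over collocated variables.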
Abstract
Iterative ensemble filters and smoothers are now commonly used for geophysical models. Some of these methods rely on a factorization of the observation likelihood function to sample from a posterior density through a set of "tempered" transitions to ensemble members. For Gaussian-based data assimilation methods, tangent linear versions of nonlinear operators can be relinearized between iterations, thus leading to a solution that is less biased than a single-step approach. This study adopts similar iterative strategies for a localized particle filter (PF) that relies on the estimation of moments to adjust unobserved variables based on importance weights. This approach builds on a "regularization" of the local PF, which forces weights to be more uniform through heuristic means. The regularization then leads to an adaptive tempering, which can also be combined with filter updates from parametric methods, such as ensemble Kalman filters. The role of iterations is analyzed by deriving the localized posterior probability density assumed by current local PF formulations and then examining how single-step and tempered PFs sample from this density. In experiments performed with a low-dimensional nonlinear system, the iterative and hybrid strategies show the largest benefits in observation-sparse regimes, where only a few particles contain high likelihoods and prior errors are non-Gaussian. This regime mimics specific applications in numerical weather prediction, where small ensemble sizes, unresolved model error, and highly nonlinear dynamics lead to prior uncertainty that is larger than measurement uncertainty.
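The tempered transitions described above factor the likelihood into fractional powers whose exponents sum to one, applying them over several stages with resampling in between so that no single stage collapses the weights. A minimal scalar sketch, with an assumed small jitter step standing in for the move step that would normally restore particle diversity (the paper's localization and adaptive choice of exponents are omitted):

```python
import numpy as np

def tempered_pf(particles, y_obs, obs_err_sd, gammas, rng):
    """Tempered particle-filter update: apply the Gaussian likelihood as a
    product of fractional powers (gammas sum to 1), resampling and
    jittering after each tempered stage."""
    assert abs(sum(gammas) - 1.0) < 1e-12
    for g in gammas:
        # Fractional-power likelihood weights, stabilized in log space.
        logw = -0.5 * g * ((particles - y_obs) / obs_err_sd) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Multinomial resampling, then a small jitter (assumed move step).
        idx = rng.choice(len(particles), size=len(particles), p=w)
        particles = particles[idx] + rng.normal(0.0, 0.05, size=len(particles))
    return particles

rng = np.random.default_rng(5)
prior = rng.normal(0.0, 2.0, size=3000)
posterior = tempered_pf(prior, y_obs=3.0, obs_err_sd=0.5,
                        gammas=[0.25, 0.25, 0.5], rng=rng)
```

With `gammas=[1.0]` this reduces to the single-step bootstrap update, which makes the comparison between single-step and tempered sampling in the abstract easy to reproduce in toy settings.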