Title: When is Growth at Risk?
This paper empirically evaluates the potentially non-linear nexus between financial indicators and the distribution of future GDP growth, using a rich set of macroeconomic and financial variables covering 13 advanced economies. We evaluate the out-of-sample forecast performance of financial variables for GDP growth, including a fully real-time exercise based on a flexible non-parametric model. We also use a parametric model to estimate the moments of the time-varying distribution of GDP and evaluate their in-sample estimation uncertainty. Our overall conclusion is pessimistic: Moments other than the conditional mean are poorly estimated, and no predictors we consider provide robust and precise advance warnings of tail risks or indeed about any features of the GDP growth distribution other than the mean. In particular, financial variables contribute little to such distributional forecasts, beyond the information contained in real indicators.
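A minimal sketch of the "growth-at-risk" idea the paper evaluates, using synthetic data (all numbers are illustrative, not the paper's): if financial conditions matter for tail risk, the lower conditional quantile of future GDP growth should deteriorate as a financial-conditions index (FCI) tightens, even when the conditional mean is unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data-generating process: downside dispersion of growth
# widens as the FCI tightens, while the conditional mean stays flat.
n = 20_000
fci = rng.normal(size=n)                        # higher FCI = tighter conditions
scale = 1.0 + 0.8 * np.clip(fci, 0.0, None)     # tail width grows with FCI
# Mean-preserving construction: E[growth | fci] = 2.0 for every fci.
growth = 2.0 + scale - rng.exponential(scale=scale)

def conditional_quantile(x, y, q, lo, hi):
    """Empirical q-quantile of y on the subsample with x in [lo, hi)."""
    mask = (x >= lo) & (x < hi)
    return float(np.quantile(y[mask], q))

q05_loose = conditional_quantile(fci, growth, 0.05, -2.0, -1.0)  # loose conditions
q05_tight = conditional_quantile(fci, growth, 0.05, 1.0, 2.0)    # tight conditions
print(q05_loose, q05_tight)  # the 5% quantile deteriorates as the FCI tightens
```

The paper's pessimistic finding is precisely that such lower-quantile shifts are hard to estimate and forecast reliably from financial indicators in real data, even though they are easy to read off in a simulation like this.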
Award ID(s):
1851665
PAR ID:
10177094
Author(s) / Creator(s):
Date Published:
Journal Name:
Brookings papers on economic activity
ISSN:
0007-2303
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Species distribution models (SDMs) are a commonly used tool that, when combined with earth system models (ESMs), can project changes in organismal occurrence, abundance, and phenology under climate change. An often untested assumption of SDMs is that relationships between organisms and the environment are stationary. To evaluate this assumption, we examined whether patterns of distribution among larvae of four small pelagic fishes (Pacific sardine Sardinops sagax, northern anchovy Engraulis mordax, jack mackerel Trachurus symmetricus, chub mackerel Scomber japonicus) in the California Current remained steady across time periods defined by climate regimes, changes in secondary productivity, and breakpoints in time series of spawning stock biomass (SSB). Generalized additive models (GAMs) were constructed separately for each period using temperature, salinity, dissolved oxygen (DO), and mesozooplankton volume as predictors of larval occurrence. We assessed non-stationarity based on changes in six metrics: 1) variables included in SDMs; 2) whether a variable exhibited a linear or non-linear form; 3) rank order of deviance explained by variables; 4) response curve shape; 5) degree of responsiveness of fishes to a variable; and 6) range of environmental variables associated with maximum larval occurrence. Across all species and time periods, non-stationarity was ubiquitous, affecting at least one of the six indicators. Rank order of environmental variables, response curve shape, and oceanic conditions associated with peak larval occurrence were the indicators most subject to change. Non-stationarity was most common among regimes defined by changes in fish SSB. Relationships between larvae and DO were somewhat more likely to change across periods, whereas relationships between fishes and temperature were more stable. Respectively, S. sagax, T. symmetricus, S. japonicus, and E. mordax exhibited non-stationarity across 89%, 67%, 50%, and 50% of indicators.
For all species except E. mordax, inter-model variability had a larger impact on projected habitat suitability for larval fishes than the differences between two climate change scenarios (SSP1-2.6 and SSP5-8.5), implying that subtle differences in model formulation could amplify projected future effects. These results suggest that widespread non-stationarity in how fishes utilize their environment could hamper our ability to reliably project how species will respond to climate change.
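A minimal sketch of one of the non-stationarity indicators above (indicator 6: the environmental range associated with maximum larval occurrence), using synthetic data rather than the study's surveys. Occurrence is simulated with a Gaussian temperature response whose peak shifts between two hypothetical periods, and a simple binned occurrence rate stands in for a fitted GAM response curve:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_period(peak_temp, n=50_000):
    """Synthetic larval occurrence with a Gaussian response to temperature."""
    temp = rng.uniform(8.0, 24.0, size=n)
    p = 0.6 * np.exp(-0.5 * ((temp - peak_temp) / 2.0) ** 2)
    occ = rng.random(n) < p
    return temp, occ

def peak_of_response(temp, occ, bins=16):
    """Bin-centre temperature at which the occurrence rate is highest."""
    edges = np.linspace(8.0, 24.0, bins + 1)
    centres = 0.5 * (edges[:-1] + edges[1:])
    rates = np.array([occ[(temp >= lo) & (temp < hi)].mean()
                      for lo, hi in zip(edges[:-1], edges[1:])])
    return float(centres[np.argmax(rates)])

t1, o1 = simulate_period(peak_temp=13.0)   # e.g. pre-regime-shift period
t2, o2 = simulate_period(peak_temp=16.0)   # e.g. post-regime-shift period
print(peak_of_response(t1, o1), peak_of_response(t2, o2))
```

A stationary SDM fitted to the first period would place peak suitability roughly 3 degrees too cold for the second, which is the kind of projection error the abstract warns about.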
  2. Real-time forecasting of non-stationary time series is a challenging problem, especially when the time series evolves rapidly. For such cases, it has been observed that ensemble models consisting of a diverse set of model classes can perform consistently better than individual models. To account for the non-stationarity of the data and the scarcity of training examples, the models are retrained in real time using the most recently observed data samples. Motivated by the robust performance properties of ensemble models, we developed a Bayesian model averaging ensemble technique consisting of statistical, deep learning, and compartmental models for forecasting epidemiological signals, specifically COVID-19 signals. We observed the epidemic dynamics go through several phases (waves), and in our ensemble model, different model classes performed differently during the various phases. Armed with this understanding, in this paper we propose a modification to the ensembling method that employs this phase information and uses different weighting schemes for each phase to produce improved forecasts. However, predicting the phases of such time series is a significant challenge, especially when behavioral and immunological adaptations govern the evolution of the time series. We explore multiple datasets that can serve as leading indicators of trend changes and employ transfer entropy techniques to identify the relevant indicators. We propose a phase prediction algorithm that estimates the phases using the leading indicators; using the estimated phase, we selectively sample the training data from similar phases. We evaluate the proposed methodology on our currently deployed COVID-19 forecasting model and the COVID-19 ForecastHub models. The overall performance of the proposed model is consistent across the pandemic. More importantly, it is ranked second during two critical rapid-growth phases in cases, regimes where the performance of most ForecastHub models dropped significantly.
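A minimal sketch of the phase-conditional weighting idea, not the authors' deployed system: each phase gets its own model-averaging weights derived from that phase's historical forecast errors, so the model class that did best in a given phase dominates the combined forecast there. All names and numbers below are illustrative.

```python
import numpy as np

def phase_weights(errors_by_phase):
    """errors_by_phase: {phase: per-model mean squared errors}.
    Returns softmax-style weights favouring low-error models in each phase."""
    weights = {}
    for phase, mse in errors_by_phase.items():
        score = np.exp(-np.asarray(mse, dtype=float))
        weights[phase] = score / score.sum()
    return weights

def ensemble_forecast(model_forecasts, weights, phase):
    """Weighted average of per-model forecasts using the current phase's weights."""
    return float(np.dot(weights[phase], model_forecasts))

# Hypothetical per-phase errors for [compartmental, statistical, deep] models:
errors = {"growth":  [0.5, 2.0, 4.0],   # compartmental model best in growth phase
          "decline": [3.0, 0.8, 1.0]}   # statistical model best in decline phase
w = phase_weights(errors)
forecasts = np.array([120.0, 95.0, 100.0])   # per-model case forecasts
print(ensemble_forecast(forecasts, w, "growth"))
print(ensemble_forecast(forecasts, w, "decline"))
```

The same forecasts combine very differently depending on the estimated phase, which is why the abstract's phase prediction step (via leading indicators and transfer entropy) matters.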
  3. Abstract: Particle filters avoid parametric estimates for Bayesian posterior densities, which alleviates Gaussian assumptions in nonlinear regimes. These methods, however, are more sensitive to sampling errors than Gaussian-based techniques such as ensemble Kalman filters. A recent study by the authors introduced an iterative strategy for particle filters that matches posterior moments, where iterations improve the filter's ability to draw samples from non-Gaussian posterior densities. The iterations follow from a factorization of particle weights, providing a natural framework for combining particle filters with alternative filters to mitigate the impact of sampling errors. The current study introduces a novel approach to forming an adaptive hybrid data assimilation methodology, exploiting the theoretical strengths of nonparametric and parametric filters. At each data assimilation cycle, the iterative particle filter performs a sequence of updates while the prior sample distribution is non-Gaussian, and an ensemble Kalman filter provides the final adjustment when Gaussian distributions for marginal quantities are detected. The method employs the Shapiro-Wilk test, which has outstanding power for detecting departures from normality, to determine when to transition between filter algorithms. Experiments using low-dimensional models demonstrate that the approach has significant value, especially for nonhomogeneous observation networks and unknown model process errors. Moreover, hybrid factors are extended to consider marginals of more than one collocated variable using a test for multivariate normality. Findings from this study motivate the use of the proposed method for geophysical problems characterized by diverse observation networks and various dynamic instabilities, such as numerical weather prediction models.
Significance Statement: Data assimilation statistically processes observation errors and model forecast errors to provide optimal initial conditions for a forecast, playing a critical role in numerical weather prediction. The ensemble Kalman filter, which has been widely adopted and developed at many operational centers, assumes Gaussianity of the prior distribution and solves a linear system of equations, leading to bias in strongly nonlinear regimes. Particle filters, on the other hand, avoid many of those assumptions but are sensitive to sampling errors and computationally expensive. We propose an adaptive hybrid strategy that combines the advantages and minimizes the disadvantages of the two methods. The hybrid particle filter-ensemble Kalman filter uses the Shapiro-Wilk test to detect the Gaussianity of the ensemble members and determine the timing of the transition between filter updates. Demonstrations in this study show that the proposed method is advantageous when observations are heterogeneous and when the model has an unknown bias. Furthermore, by extending the statistical hypothesis test to a test for multivariate normality, we consider marginals of more than one collocated variable. These results encourage further testing on real geophysical problems characterized by various dynamic instabilities, such as real numerical weather prediction models.
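A minimal sketch of the gating logic described above: apply the particle-filter update while the ensemble looks non-Gaussian, and hand off to an EnKF-style update once a normality test passes. The paper uses the Shapiro-Wilk test; to keep this sketch self-contained we substitute a simpler Jarque-Bera-style moment test, which checks skewness and excess kurtosis against a chi-squared threshold.

```python
import numpy as np

rng = np.random.default_rng(2)

def jarque_bera_stat(x):
    """Jarque-Bera normality statistic from sample skewness and excess kurtosis."""
    x = np.asarray(x, dtype=float)
    z = (x - x.mean()) / x.std()
    skew = np.mean(z ** 3)
    kurt = np.mean(z ** 4) - 3.0
    n = len(x)
    return n / 6.0 * (skew ** 2 + kurt ** 2 / 4.0)

def choose_update(ensemble, threshold=5.99):   # ~ chi^2(2) at the 5% level
    """Gate between filter updates based on an ensemble normality test."""
    return "enkf" if jarque_bera_stat(ensemble) < threshold else "particle"

gaussian_prior = rng.normal(size=400)                       # typically passes
bimodal_prior = np.concatenate([rng.normal(-3.0, 0.5, 200),
                                rng.normal(3.0, 0.5, 200)]) # clearly non-Gaussian
print(choose_update(gaussian_prior), choose_update(bimodal_prior))
```

In the paper's hybrid, this decision is made per cycle (and extended to multivariate marginals); the sketch only illustrates the univariate gate.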
  4. The dynamics of atmospheric water supply and demand determine a region's potential water resources. Hydrologic ratios, such as the aridity index, evaporation ratio, and runoff coefficient, are useful indicators for quantifying atmospheric water dynamics at watershed to regional scales. In this study, we developed a modeling framework using a machine learning approach to predict hydrologic ratios for watersheds in the contiguous United States (CONUS) from a set of climate, soil, vegetation, and topographic variables. Overall, the proposed framework simulates hydrologic ratios at the watershed scale with considerable accuracy. The concept of non-parametric elasticity was applied to study the potential influence of the estimated hydrologic ratios on various drought characteristics (resilience, vulnerability, and exposure) for river basins in CONUS. Spatial sensitivity of drought indicators to hydrologic ratios suggests that an increase in hydrologic ratios may augment the magnitude of drought indicators in the majority of river basins. The aridity index appears to have a greater influence on drought characteristics than the other hydrologic ratios. We observed that the machine learning approach, based on the random forests algorithm, can efficiently estimate the spatial distribution of hydrologic ratios provided sufficient data are available. In addition, the non-parametric elasticity approach can identify the potential influence of hydrologic ratios on spatial drought characteristics.
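A minimal sketch of a non-parametric elasticity calculation of the kind the abstract invokes, on synthetic data: the percent change in a drought indicator per percent change in a hydrologic ratio, estimated as the median of pointwise slopes scaled to elasticity units. The specific estimator form and all data below are illustrative assumptions, not the study's.

```python
import numpy as np

rng = np.random.default_rng(3)

def nonparametric_elasticity(x, y):
    """Median of pointwise slopes (y_i - ybar)/(x_i - xbar), rescaled by
    xbar/ybar so the result reads as an elasticity (% change per % change)."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    xbar, ybar = x.mean(), y.mean()
    slopes = (y - ybar) / (x - xbar) * (xbar / ybar)
    return float(np.median(slopes))

# Synthetic basins: drought exposure rises roughly in proportion to the
# aridity index (true elasticity ~ 1), with multiplicative noise.
aridity = rng.uniform(0.5, 3.0, size=500)
exposure = 10.0 * aridity * np.exp(rng.normal(0.0, 0.05, size=500))
print(nonparametric_elasticity(aridity, exposure))
```

The median makes the estimate robust to the few basins whose aridity sits near the mean, where pointwise slopes blow up.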
  5. We study the problem of efficiently estimating the effect of an intervention on a single variable using observational samples. Our goal is to give algorithms with polynomial time and sample complexity in a non-parametric setting. Tian and Pearl (AAAI '02) exactly characterized the class of causal graphs for which causal effects of atomic interventions can be identified from observational data; we make their result quantitative. Suppose 𝒫 is a causal model on a set V of n observable variables with respect to a given causal graph G, and let do(x) be an identifiable intervention on a variable X. Assuming that G has bounded in-degree and c-components of bounded size k, and that the observational distribution satisfies a strong positivity condition, we show: (i) [Evaluation] There is an algorithm that, with probability 2/3, outputs an evaluator for a distribution P^ satisfying TV(P(V | do(x)), P^(V)) < eps, using m = O(n/eps^2) samples from P and O(mn) time. The evaluator can return the probability P^(v) for any assignment v to V in O(n) time. (ii) [Sampling] There is an algorithm that, with probability 2/3, outputs a sampler for a distribution P^ satisfying TV(P(V | do(x)), P^(V)) < eps, using m = O(n/eps^2) samples from P and O(mn) time. The sampler returns an iid sample from P^ with probability 1 in O(n) time. We extend our techniques to estimate P(Y | do(x)) for a subset Y of variables of interest. We also show lower bounds for the sample complexity, demonstrating that our sample complexity has optimal dependence on the parameters n and eps, as well as, when k=1, on the strong positivity parameter.
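An illustrative special case of estimating P(V | do(x)) from observational samples, not the paper's general algorithm: for the three-node graph Z → X, Z → Y, X → Y, the identification formula reduces to the back-door adjustment P(y | do(x)) = Σ_z P(z) P(y | x, z), and strong positivity holds because every (x, z) cell has non-negligible probability. All distribution parameters below are made up.

```python
import numpy as np

rng = np.random.default_rng(4)

# Observational samples from a synthetic causal model with confounder Z.
n = 200_000
z = rng.random(n) < 0.4
x = rng.random(n) < np.where(z, 0.7, 0.2)                      # X depends on Z
y = rng.random(n) < np.where(x, 0.8, 0.3) * np.where(z, 1.0, 0.8)

def p_y_do_x(x_val):
    """Plug-in back-door estimate of P(Y=1 | do(X=x_val)):
    sum over z of P(z) * P(y=1 | x=x_val, z)."""
    est = 0.0
    for z_val in (False, True):
        pz = np.mean(z == z_val)
        mask = (x == x_val) & (z == z_val)
        est += pz * np.mean(y[mask])
    return float(est)

# The interventional quantity differs from the naive conditional,
# which is biased upward here because Z raises both X and Y.
print(p_y_do_x(True), float(np.mean(y[x])))
```

The paper's contribution is quantitative guarantees for this kind of plug-in estimation in the general identifiable case (bounded in-degree, bounded c-components), including near-optimal sample complexity; this sketch only shows the simplest identified graph.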