Title: Deep Learning for Climate Models of the Atlantic Ocean
A deep neural network is trained to predict sea surface temperature variations at two important regions of the Atlantic Ocean, using 800 years of simulated climate dynamics based on first-principles physics models. The model is then tested against 60 years of historical data. Our statistical model learns to approximate the physical laws governing the simulation, providing a significant improvement over simple statistical forecasts and performance comparable to most state-of-the-art dynamical/conventional forecast models at a fraction of the computational cost.
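The train-on-simulation, test-on-observations workflow can be sketched in miniature. Everything below is a hypothetical illustration, not the paper's actual data or architecture: a noisy seasonal series stands in for simulated SST anomalies, a 12-month input window feeds a tiny one-hidden-layer network, and a held-out tail of the series plays the role of the historical record.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for long simulated SST-anomaly records: a noisy
# seasonal series. Predict the next value from the previous 12 values.
T = 2000
series = np.sin(np.arange(T) * 2 * np.pi / 12) + 0.3 * rng.standard_normal(T)

lag = 12
X = np.stack([series[i:i + lag] for i in range(T - lag)])
y = series[lag:]

# Train on the "simulation", hold out the tail as "historical" test data.
split = int(0.8 * len(y))
Xtr, ytr, Xte, yte = X[:split], y[:split], X[split:], y[split:]

# A small one-hidden-layer network trained with full-batch gradient descent.
h = 16
W1 = rng.normal(0.0, 0.5, (lag, h))
b1 = np.zeros(h)
W2 = rng.normal(0.0, 0.1, h)
b2 = 0.0
lr = 0.05

for epoch in range(600):
    z = np.tanh(Xtr @ W1 + b1)              # hidden activations
    err = z @ W2 + b2 - ytr                 # prediction error
    dz = np.outer(err, W2) * (1 - z ** 2)   # backprop through tanh
    W2 -= lr * z.T @ err / len(ytr)
    b2 -= lr * err.mean()
    W1 -= lr * Xtr.T @ dz / len(ytr)
    b1 -= lr * dz.mean(axis=0)

test_pred = np.tanh(Xte @ W1 + b1) @ W2 + b2
skill = np.corrcoef(test_pred, yte)[0, 1]          # NN forecast correlation
persistence = np.corrcoef(Xte[:, -1], yte)[0, 1]   # simple statistical baseline
print(f"network: {skill:.2f}  persistence: {persistence:.2f}")
```

The persistence forecast (last observed value carried forward) plays the role of the simple statistical baseline mentioned in the abstract.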
Award ID(s):
1920304
PAR ID:
10273992
Author(s) / Creator(s):
; ;
Date Published:
Journal Name:
AAAI Spring Symposium: MLPS, 2020
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. As a promising approach to dealing with distributed data, Federated Learning (FL) has achieved major advances in recent years. FL enables collaborative model training by exploiting raw data dispersed across multiple edge devices. However, the data is generally not independent and identically distributed (statistical heterogeneity), and the edge devices differ significantly in both computation and communication capacity (system heterogeneity). Statistical heterogeneity leads to severe accuracy degradation, while system heterogeneity significantly prolongs the training process. To address these heterogeneity issues, we propose an Asynchronous Staleness-aware Model Update FL framework, FedASMU, with two novel methods. First, we propose an asynchronous FL system model with a dynamical model aggregation method between updated local models and the global model on the server, for superior accuracy and high efficiency. Second, we propose an adaptive local model adjustment method that aggregates the fresh global model with local models on devices to further improve accuracy. Extensive experimentation with 6 models and 5 public datasets demonstrates that FedASMU significantly outperforms baseline approaches in terms of accuracy (0.60% to 23.90% higher) and efficiency (3.54% to 97.98% faster).
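The staleness-aware side of asynchronous aggregation can be illustrated with a toy sketch. The mixing rule below (a weight that decays with how many global versions an update is behind) and the `AsyncServer` class are illustrative assumptions of this sketch, not FedASMU's actual algorithm:

```python
import numpy as np

def staleness_weight(tau, base=0.6):
    # Mixing weight that decays with staleness: fresher updates count more.
    return base / (1.0 + tau)

class AsyncServer:
    """Toy asynchronous FL server with staleness-aware aggregation."""

    def __init__(self, dim):
        self.w = np.zeros(dim)   # global model
        self.version = 0         # bumped on every absorbed update

    def dispatch(self):
        # A device records which global version its local training starts from.
        return self.w.copy(), self.version

    def receive(self, local_w, start_version):
        tau = self.version - start_version   # staleness of this update
        alpha = staleness_weight(tau)
        self.w = (1 - alpha) * self.w + alpha * local_w
        self.version += 1

# Toy run: devices pull the global model toward a shared optimum, each
# finishing its local work a few global versions late.
rng = np.random.default_rng(1)
true_w = np.array([1.0, -2.0])
server = AsyncServer(2)
inflight = []
for step in range(200):
    inflight.append(server.dispatch())   # a device starts local training
    if len(inflight) > 3:                # an older device finishes now
        w0, v0 = inflight.pop(0)
        local_w = w0 + 0.5 * (true_w + 0.1 * rng.standard_normal(2) - w0)
        server.receive(local_w, v0)

print("global model:", server.w)
```

Because stale updates were computed against an outdated global model, down-weighting them keeps the aggregation stable without forcing the server to wait for slow devices.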
  2. The standard model of cosmology has provided a good phenomenological description of a wide range of observations at both astrophysical and cosmological scales for several decades. This concordance model is built on a universal cosmological constant, supported by a matter sector described by the standard model of particle physics and a cold dark matter contribution, as well as very early-time inflationary physics, and underpinned by gravitation through general relativity. There have always been open questions about the soundness of the foundations of the standard model. However, recent years have shown that there may also be questions from the observational sector, with the emergence of differences between certain cosmological probes. In this White Paper, we identify the key objectives that need to be addressed over the coming decade, together with the core science projects that aim to meet these challenges. These discordances primarily rest on the divergence in the measurement of core cosmological parameters with varying levels of statistical confidence. These possible statistical tensions may be partially accounted for by systematics in various measurements or cosmological probes, but there is also a growing indication of potential new physics beyond the standard model. After reviewing the principal probes used in the measurement of cosmological parameters, as well as potential systematics, we discuss the most promising array of potential new physics that may be observable in upcoming surveys. We also discuss the growing set of novel data analysis approaches that go beyond traditional methods to test physical models. These new methods will become increasingly important in the coming years as the volume of survey data continues to increase and as the degeneracy between predictions of different physical models grows.
There are several perspectives on the divergences between the values of cosmological parameters, such as the model-independent probes of the late Universe and model-dependent measurements of the early Universe, which we cover at length. The White Paper closes with a number of recommendations for the community to focus on during the upcoming decade of observational cosmology, statistical data analysis, and fundamental physics developments.
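As a small worked example of how such parameter discordances are quantified, a common rule of thumb expresses the tension between two independent Gaussian measurements as |x1 - x2| / sqrt(s1^2 + s2^2). The Hubble-constant figures below are representative values used only for illustration:

```python
import math

def tension_sigma(x1, s1, x2, s2):
    # Gaussian estimate of the discordance between two independent measurements.
    return abs(x1 - x2) / math.sqrt(s1 ** 2 + s2 ** 2)

# Representative Hubble-constant values (km/s/Mpc), for illustration only:
early = (67.4, 0.5)   # CMB-inferred, early-Universe, model-dependent
late = (73.0, 1.0)    # distance-ladder, late-Universe, model-independent
print(f"H0 tension: {tension_sigma(*early, *late):.1f} sigma")
```

This simple Gaussian estimate is exactly the kind of "varying levels of statistical confidence" the text refers to; more careful assessments account for non-Gaussian posteriors and correlated systematics.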
  3. Abstract Fluctuations in the path of the Gulf Stream (GS) have previously been studied primarily by connecting them either to the wind‐driven subtropical gyre circulation or to buoyancy forcing via the subpolar gyre. Here we present a statistical model for 1-year predictions of the GS path (represented by the GS northern wall, GSNW) between °W and °W, incorporating both mechanisms in a combined framework. An existing model with multiple parameters, including the previous year's GSNW index, the center location and amplitude of the Icelandic Low, and the Southern Oscillation Index, was augmented with basin-wide Ekman drift over the Azores High. The addition of the wind is supported by a validation of the simpler two-layer Parsons-Veronis model of GS separation over the last 40 years. A multivariate analysis was carried out to compare 1-year-in-advance forecast correlations from four different models. The optimal predictors of the best-performing model are: (a) the GSNW index from the previous year, (b) gyre-scale integrated Ekman drift over the past 2 years, and (c) the longitude of the Icelandic Low center lagged by 3 years. The forecast correlation over the 27 years (1994–2020) is 0.65, an improvement on the previous multi-parameter model's forecast correlation of 0.52. The improvement is attributed to the addition of the wind-drift component. The sensitivity of forecasting the GS path after extreme atmospheric years is quantified. The results indicate the possibility of better understanding and enhanced predictability of the dominant wind-driven variability of the Atlantic Meridional Overturning Circulation, and of fisheries management models that use the GS path as a metric.
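A forecast model of this shape, a lagged multiple linear regression scored by cross-validated correlation, can be sketched as follows. The three synthetic predictor series below are random stand-ins named after the optimal predictors; the coefficients, sample size, and noise level are all invented for the illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 60  # hypothetical number of annual values

# Synthetic stand-ins for the three optimal predictors (only the names
# echo the real model; the series themselves are random):
gsnw_prev = rng.standard_normal(n)          # (a) previous year's GSNW index
ekman_2yr = rng.standard_normal(n)          # (b) 2-year integrated Ekman drift
iceland_lon_lag3 = rng.standard_normal(n)   # (c) Icelandic Low longitude, lag 3

# Target constructed so each predictor contributes, plus noise.
gsnw = (0.5 * gsnw_prev + 0.4 * ekman_2yr + 0.2 * iceland_lon_lag3
        + 0.6 * rng.standard_normal(n))

X = np.column_stack([np.ones(n), gsnw_prev, ekman_2yr, iceland_lon_lag3])

# Leave-one-out hindcast: fit on all other years, predict the held-out year.
preds = np.empty(n)
for i in range(n):
    mask = np.arange(n) != i
    beta, *_ = np.linalg.lstsq(X[mask], gsnw[mask], rcond=None)
    preds[i] = X[i] @ beta

r = np.corrcoef(preds, gsnw)[0, 1]
print(f"cross-validated forecast correlation: {r:.2f}")
```

Comparing such cross-validated correlations across candidate predictor sets is one standard way to select the best-performing model, analogous to the 0.65 vs. 0.52 comparison in the abstract.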
  4. Ruis, Andrew R.; Lee, Seung B. (Ed.)
    A key goal of quantitative ethnographic (QE) models, and of statistical models more generally, is to produce the most parsimonious model that adequately explains or predicts the phenomenon of interest. In epistemic network analysis (ENA), for example, this entails constructing network models with the fewest codes whose interaction structure provides sufficient explanatory power in a given context. Unlike most statistical models, however, modification of ENA models can affect not only their statistical properties but also the interpretive alignment between quantitative features and qualitative meaning that is a central goal of QE analyses. In this study, we propose a novel method, Parsimonious Removal with Interpretive Alignment, for systematically identifying more parsimonious ENA models that are likely to maintain interpretive alignment with an existing model. To test the efficacy of the method, we implemented it on a well-studied dataset for which there is a published, validated ENA model, and we show that the method successfully identifies reduced models likely to maintain explanatory power and interpretive alignment.
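One way to picture this kind of model reduction is a greedy backward-elimination loop that drops a code only when the reduced model's unit-level scores stay highly correlated with the full model's. The scoring function below is a crude co-occurrence proxy, not an actual ENA projection, and the code names, data, and threshold are all hypothetical:

```python
from itertools import combinations

import numpy as np

rng = np.random.default_rng(3)
codes = ["A", "B", "C", "D", "E"]
n_units = 40

# Hypothetical per-unit code intensities; code "E" barely varies, making it
# a natural removal candidate.
data = rng.random((n_units, len(codes)))
data[:, 4] *= 0.05

def model_scores(active):
    # Crude proxy for a network model: per-unit sum of pairwise co-occurrence
    # strengths among the active codes (real ENA projects networks into a
    # reduced dimensional space; this is only a stand-in).
    idx = [codes.index(c) for c in active]
    return sum(data[:, i] * data[:, j] for i, j in combinations(idx, 2))

full = model_scores(codes)
threshold = 0.95  # required alignment with the full model's scores

# Greedy backward elimination: drop a code only if the reduced model's
# scores stay highly correlated with the full model's.
active = list(codes)
for code in codes:
    trial = [c for c in active if c != code]
    if len(trial) < 2:
        break
    r = np.corrcoef(model_scores(trial), full)[0, 1]
    if r >= threshold:
        active = trial

print("retained codes:", active)
```

The correlation threshold stands in for the paper's joint criterion of explanatory power plus interpretive alignment; the published method also checks that the qualitative meaning of the reduced networks survives.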
  5. Goal 1 of the 2030 Agenda for Sustainable Development, adopted by all United Nations member States in 2015, is to end poverty in all its forms everywhere. The major indicator used to monitor the goal is the so-called headcount ratio or poverty rate, i.e., the proportion or percentage of people living in poverty. In India, where nearly a quarter of the population still lives below the poverty line, monitoring of poverty needs greater attention, at shorter intervals (e.g., every year), to evaluate the effectiveness of the planning, programs, and actions taken by governments to eradicate poverty. Poverty rate computation for India depends on two basic ingredients: rural and urban poverty lines for the different states and union territories, and average Monthly Per-capita Consumer Expenditure (MPCE). While MPCE can be obtained every year, usually from the Consumer Expenditure Survey on shorter schedules (with a few exceptions where the information is obtained from another survey), determination of poverty lines is a highly complex, costly, and time-consuming process. Poverty lines are essentially determined by a panel of experts who draw their conclusions partly from their subjective opinions and partly from data from multiple sources. The main data source the panel uses is the Consumer Expenditure Survey with its detailed schedule, which is usually available only every five years or so. In this paper, we undertake a feasibility study to explore whether estimates of headcount ratios, or poverty ratios, in intervening years can be provided in the absence of poverty lines by relating poverty ratios to average MPCE through a statistical model. We can then use the fitted model to predict poverty rates for intervening years based on average MPCE. We explore a few models in this work using Bayesian methodology. The term 'synthetic prediction' rests on the synthetic assumption of model invariance over years, often used in the small area estimation literature.
While the data-based assessment of our Bayesian synthetic prediction procedure is encouraging, there is great potential for improvements to the models presented in this paper, e.g., by incorporating more auxiliary data as they become available. In any case, we expect this preliminary work in an important area will encourage researchers to consider statistical modeling as a possible way to at least partially solve a problem for which no objective solution is currently available.
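The synthetic-prediction idea, fit a model relating poverty rate to MPCE in years where both are available and then predict in intervening years from MPCE alone, can be sketched with a conjugate Bayesian linear regression. All figures below are synthetic placeholders, not real Indian survey data, and the model is far simpler than those studied in the paper:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic placeholders: years with BOTH an expert-determined poverty rate
# and an average MPCE (rupees/month); none of these are real figures.
mpce = np.array([900.0, 1050.0, 1200.0, 1400.0, 1650.0, 1900.0])
poverty_rate = 80.0 - 14.0 * np.log(mpce / 100) + rng.normal(0.0, 1.0, 6)

X = np.column_stack([np.ones(6), np.log(mpce / 100)])
sigma2, tau2 = 1.0, 100.0  # assumed noise variance; broad Gaussian prior

# Conjugate Bayesian linear regression: Gaussian prior N(0, tau2*I) plus
# Gaussian likelihood gives a Gaussian posterior over the coefficients.
precision = X.T @ X / sigma2 + np.eye(2) / tau2
cov = np.linalg.inv(precision)
mean = cov @ X.T @ poverty_rate / sigma2

# Synthetic prediction: assume the fitted relation also holds (model
# invariance) in an intervening year where only MPCE is observed.
mpce_new = 1500.0
x_new = np.array([1.0, np.log(mpce_new / 100)])
pred = float(x_new @ mean)
pred_sd = float(np.sqrt(x_new @ cov @ x_new + sigma2))
print(f"predicted poverty rate: {pred:.1f}% (+/- {pred_sd:.1f})")
```

The posterior predictive standard deviation makes the cost of the invariance assumption visible: the further an intervening year's MPCE sits from the training years, the wider the interval becomes.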