

Title: Thermosphere modeling capabilities assessment: geomagnetic storms
The specification and prediction of density fluctuations in the thermosphere, especially during geomagnetic storms, is a key challenge for space weather observations and modeling, and it is of great operational importance for tracking objects orbiting in near-Earth space. For low-Earth orbit, variations in neutral density represent the largest source of uncertainty in the propagation and prediction of satellite orbits. An international conference in 2018, conducted under the auspices of the NASA Community Coordinated Modeling Center (CCMC), included a workshop on neutral density modeling using both empirical and numerical methods, and resulted in the organization of an initial model comparison and evaluation effort. Here, we present an updated metric for model assessment under geomagnetic storm conditions: a storm is divided into four phases with respect to the time of minimum Dst, and mean density ratios, standard deviations, and correlations are then calculated per phase. Comparisons of three empirical models (NRLMSISE-00, JB2008, and DTM2013) and two first-principles models (TIE-GCM and CTIPe) against neutral density data sets that include measurements by the CHAMP, GRACE, and GOCE satellites are presented for 13 storms. All models show reduced performance during storms, notably much larger standard deviations, but DTM2013, JB2008, and CTIPe did not on average reveal a significant bias in the four phases of our metric. DTM2013 and TIE-GCM driven with the Weimer model achieved the best results when the entire storm event is taken into account, while NRLMSISE-00 systematically and significantly underestimates the storm densities. Numerical models still lag empirical methods on a statistical basis, but as their drivers become more accurate and higher resolutions become available, they are expected to surpass them in the foreseeable future.
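The phase-based assessment described above can be sketched as follows: the storm epoch is split into four windows relative to the time of minimum Dst, and the mean observed-to-modeled density ratio, its standard deviation, and the observation-model correlation are computed in each window. This is a minimal illustration only; the phase boundaries used here (-12 h, 0 h, +24 h) and the function names are our assumptions, not necessarily those of the paper.

```python
import numpy as np

def storm_phase_metrics(t_hours, rho_obs, rho_model, t_dst_min):
    """Assess modeled density against observations in four storm phases.

    Phases are defined relative to the time of minimum Dst (t_dst_min);
    the boundaries below are illustrative, not those of the paper.
    """
    t = np.asarray(t_hours) - t_dst_min  # epoch time relative to Dst minimum
    obs = np.asarray(rho_obs)
    mod = np.asarray(rho_model)
    phases = {
        "pre-storm":      t < -12,
        "main":           (t >= -12) & (t < 0),
        "early recovery": (t >= 0) & (t < 24),
        "late recovery":  t >= 24,
    }
    results = {}
    for name, mask in phases.items():
        ratio = obs[mask] / mod[mask]
        results[name] = {
            "mean_ratio": ratio.mean(),                    # bias indicator
            "std_ratio": ratio.std(),                      # scatter
            "corr": np.corrcoef(obs[mask], mod[mask])[0, 1],
        }
    return results
```

A perfect model gives a mean ratio of 1, zero scatter, and a correlation of 1 in every phase; storm-time degradation shows up mainly as a growing standard deviation of the ratio.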
Award ID(s):
1651459
NSF-PAR ID:
10229766
Journal Name:
Journal of Space Weather and Space Climate
Volume:
11
ISSN:
2115-7251
Page Range / eLocation ID:
12
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract

    Prediction of ionospheric state is a critical space weather problem. We expand on our previous research of medium-range ionospheric forecasts and present new results on evaluating the prediction capabilities of three physics-based ionosphere-thermosphere models (Thermosphere Ionosphere Electrodynamics General Circulation Model, TIE-GCM; Coupled Thermosphere Ionosphere Plasmasphere Electrodynamics Model, CTIPe; and Global Ionosphere Thermosphere Model, GITM). The focus of our study is understanding how current modeling approaches may predict the global ionosphere for geomagnetic storms (as studied through 35 storms during 2000-2016). The prediction approach uses physics-based modeling without any manual model adjustment, quality control, or selection of the results. Our goal is to understand to what extent current physics-based modeling can be used in total electron content (TEC) prediction and to explore the uncertainties of these prediction efforts with multiday lead times. The ionosphere-thermosphere model runs are driven by actual interplanetary conditions, whether those data come from real-time measurements or are predicted values themselves. These model runs were performed by the Community Coordinated Modeling Center (CCMC). Jet Propulsion Laboratory (JPL)-produced global ionospheric maps (GIMs) were used to validate model TEC estimates. We utilize the True Skill Statistic (TSS) metric for the TEC prediction evaluation, noting that this is but one metric to assess predictive skill and that complete evaluations require combinations of such metrics. The meanings of contingency table elements for the prediction performance are analyzed in the context of ionosphere modeling. Prediction success is between about 0.2 and 0.5 for weak ionospheric disturbances and decreases for strong disturbances. We evaluate the prediction of both TEC decreases and increases. Our results indicate that physics-based modeling during storms shows promise in TEC prediction with multiday lead time.
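The True Skill Statistic used for the TEC evaluation above is computed from the four contingency-table elements: it is the probability of detection minus the probability of false detection. A minimal sketch (function and argument names are ours):

```python
def true_skill_statistic(hits, misses, false_alarms, correct_negatives):
    """TSS = probability of detection - probability of false detection.

    Ranges from -1 to 1: 1 is a perfect forecast, 0 is no skill,
    and negative values are worse than random.
    """
    pod = hits / (hits + misses)                            # probability of detection
    pofd = false_alarms / (false_alarms + correct_negatives)  # prob. of false detection
    return pod - pofd
```

For example, a forecast with 40 hits, 10 misses, 20 false alarms, and 80 correct negatives gives TSS = 0.8 - 0.2 = 0.6, in the upper range of the skill scores quoted above.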

     
  2. To improve Thermosphere-Ionosphere modeling during disturbed conditions, data assimilation schemes that can account for the large and fast-moving gradients moving through the modeled domain are necessary. We argue that this requires a physics-based background model with a non-stationary covariance. An added benefit of using physics-based models would be improved forecasting capability over the largely persistence-based forecasts of empirical models. As a reference implementation, we have developed an ensemble Kalman filter (EnKF) software called Thermosphere Ionosphere Data Assimilation (TIDA) using the physics-based Coupled Thermosphere Ionosphere Plasmasphere Electrodynamics (CTIPe) model as the background. In this paper, we present detailed results from experiments during the 2003 Halloween Storm, 27-31 October 2003, under very disturbed (Kp = 9) conditions while assimilating GRACE-A, GRACE-B, and CHAMP neutral density measurements. TIDA simulates this disturbed period without using the L1 solar wind measurements, which were contaminated by solar energetic protons, by estimating the model drivers from the density measurements. We also briefly present statistical results for two additional storms, September 27 - October 2, 2002, and July 26-30, 2004, to show that the improvement in assimilated neutral density specification is not an artifact of the corrupted forcing observations during the 2003 Halloween Storm. By showing statistical results from assimilating one satellite at a time, we show that TIDA produces a coherent global specification for neutral density throughout the storm - a critical capability in calculating satellite drag and debris collision avoidance for space traffic management.
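The kind of analysis step an ensemble Kalman filter such as TIDA performs can be sketched generically. The snippet below is a textbook stochastic (perturbed-observation) EnKF update, not TIDA's actual implementation: the state would be the model's density grid plus estimated drivers, and the observation operator would interpolate to the satellite track. All names and shapes are our assumptions.

```python
import numpy as np

def enkf_update(X, y, H, obs_std, rng=None):
    """One stochastic EnKF analysis step (perturbed-observation form).

    X       : (n_state, n_ens) prior ensemble of model states
    y       : (n_obs,) observations (e.g. neutral densities along track)
    H       : (n_obs, n_state) linear observation operator
    obs_std : observation error standard deviation (assumed uncorrelated)
    """
    if rng is None:
        rng = np.random.default_rng(0)
    n_state, n_ens = X.shape
    A = X - X.mean(axis=1, keepdims=True)          # ensemble anomalies
    P = A @ A.T / (n_ens - 1)                      # sample background covariance
    R = (obs_std ** 2) * np.eye(len(y))            # observation error covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
    # Perturb the observations so the analysis ensemble keeps correct spread.
    Y = y[:, None] + rng.normal(0.0, obs_std, size=(len(y), n_ens))
    return X + K @ (Y - H @ X)                     # analysis ensemble
```

Because the covariance P is recomputed from the flow-dependent ensemble at every step, the update has exactly the non-stationary covariance the abstract argues for, unlike a fixed-covariance scheme.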
  3. Abstract

    The geospace environment is volatile and highly driven. Space weather has effects on Earth's magnetosphere that cause a dynamic and enigmatic response in the thermosphere, particularly in the evolution of neutral mass density. Many models exist that use space weather drivers to produce a density response, but these models are typically computationally expensive or inaccurate for certain space weather conditions. In response, this work aims to employ a probabilistic machine learning (ML) method to create an efficient surrogate for the Thermosphere Ionosphere Electrodynamics General Circulation Model (TIE-GCM), a physics-based thermosphere model. Our method leverages principal component analysis to reduce the dimensionality of TIE-GCM and recurrent neural networks to model the dynamic behavior of the thermosphere far more quickly than the numerical model. The newly developed reduced order probabilistic emulator (ROPE) uses Long-Short Term Memory neural networks to perform time-series forecasting in the reduced state and provide distributions for future density. We show that across the available data, TIE-GCM ROPE has similar error to previous linear approaches while improving storm-time modeling. We also conduct a satellite propagation study for the significant November 2003 storm, which shows that TIE-GCM ROPE can capture the position resulting from TIE-GCM density with <5 km bias. By contrast, linear approaches provide point estimates that can result in biases of 7-18 km.
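The dimensionality-reduction half of this surrogate approach can be illustrated with a plain PCA-by-SVD sketch: snapshots of the gridded density field are projected onto a few spatial modes, and the recurrent network then only has to forecast the handful of mode coefficients rather than the full grid. This is a generic sketch under our own naming, not the TIE-GCM ROPE code.

```python
import numpy as np

def fit_pca(snapshots, n_modes):
    """Fit a PCA basis to density snapshots of shape (n_samples, n_grid)."""
    mean = snapshots.mean(axis=0)
    A = snapshots - mean
    # SVD of the centered snapshot matrix; rows of Vt are spatial modes.
    _, _, Vt = np.linalg.svd(A, full_matrices=False)
    return mean, Vt[:n_modes]

def encode(x, mean, modes):
    """Full grid -> reduced-state coefficients (what the LSTM would forecast)."""
    return modes @ (x - mean)

def decode(z, mean, modes):
    """Reduced-state coefficients -> reconstructed full grid."""
    return mean + modes.T @ z
```

In the emulator described above, the forecast step then runs entirely in the low-dimensional coefficient space, which is what makes the surrogate so much cheaper than the numerical model.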

     
  4. During geomagnetic storms a large amount of energy is transferred into the ionosphere-thermosphere (IT) system, leading to local and global changes in, for example, the dynamics, composition, and neutral density. The steadier energy input from the lower atmosphere into the IT system is in general much smaller than the energy input from the magnetosphere, especially during geomagnetic storms, and therefore details of the lower-atmosphere forcing are often neglected in storm-time simulations. In this study we compare the neutral density observed by Swarm-C during the moderate geomagnetic storm of 31 January to 3 February 2016 with the Thermosphere-Ionosphere-Electrodynamics GCM (TIEGCM), finding that the model captures the observed large-scale neutral density variations better in the southern than in the northern hemisphere. The importance of more realistic lower atmospheric (LB) variations, as specified by the Whole Atmosphere Community Climate Model eXtended (WACCM-X) with specified dynamics (SD), is demonstrated by an improvement of up to 15% in the northern-hemisphere neutral density in particular, compared with climatological LB forcing. Further analysis highlights the importance of the background atmospheric conditions in facilitating hemispherically different neutral density changes in response to the LB perturbations. In comparison, employing observationally based field-aligned currents (FAC) rather than an empirical model to describe magnetosphere-ionosphere (MI) coupling leads to a 7-20% improvement in northern-hemisphere neutral density. The results highlight the importance of lower atmospheric variations and high-latitude forcing in simulating the absolute large-scale neutral density, and especially its hemispheric differences. However, when focusing on the storm-time variation with respect to quiescent time, the lower atmospheric influence is reduced to a 1-1.5% improvement with respect to the total observed neutral density. The results provide some guidance on the importance of more realistic upper boundary forcing and lower atmospheric variations when modeling large-scale, absolute, and relative neutral density variations.
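One plausible way to quantify percentage improvements of the kind quoted above is to compare the mean absolute density error of two model runs against observations, normalized by the mean observed density; the paper's exact metric may differ, and all names here are our assumptions.

```python
import numpy as np

def percent_improvement(rho_obs, rho_run_a, rho_run_b):
    """Improvement of run B over run A, as a percentage of the mean observed density.

    Positive values mean run B (e.g. SD-WACCM-X lower-boundary forcing) is
    closer to the observations than run A (e.g. climatological forcing).
    """
    err_a = np.abs(np.asarray(rho_run_a) - np.asarray(rho_obs))
    err_b = np.abs(np.asarray(rho_run_b) - np.asarray(rho_obs))
    return 100.0 * (err_a - err_b).mean() / np.asarray(rho_obs).mean()
```

For instance, if run A overestimates the observed density by 20% everywhere and run B by only 5%, this metric reports a 15% improvement.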
  5. Abstract

    NRLMSIS® 2.0 is an empirical atmospheric model that extends from the ground to the exobase and describes the average observed behavior of temperature, eight species densities, and mass density via a parametric analytic formulation. The model inputs are location, day of year, time of day, solar activity, and geomagnetic activity. NRLMSIS 2.0 is a major, reformulated upgrade of the previous version, NRLMSISE-00. The model now couples thermospheric species densities to the entire column, via an effective mass profile that transitions each species from the fully mixed region below ~70 km altitude to the diffusively separated region above ~200 km. Other changes include the extension of atomic oxygen down to 50 km and the use of geopotential height as the internal vertical coordinate. We assimilated extensive new lower and middle atmosphere temperature, O, and H data, along with global average thermospheric mass density derived from satellite orbits, and we validated the model against independent samples of these data. In the mesosphere and below, residual biases and standard deviations are considerably lower than in NRLMSISE-00. The new model is warmer in the upper troposphere and cooler in the stratosphere and mesosphere. In the thermosphere, N2 and O densities are lower in NRLMSIS 2.0; otherwise, the NRLMSISE-00 thermosphere is largely retained. Future advances in thermospheric specification will likely require new in situ mass spectrometer measurements, new techniques for species density measurement between 100 and 200 km, and the reconciliation of systematic biases among thermospheric temperature and composition data sets, including biases attributable to long-term changes.
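In the diffusively separated region above ~200 km that this abstract refers to, each species approximately follows its own isothermal diffusive-equilibrium profile with scale height H = kT/(mg), which is why heavy species like N2 fall off much faster with altitude than atomic O or H. The sketch below shows that idealized profile, not NRLMSIS's actual parametric formulation; the constant-gravity simplification and function name are ours.

```python
import numpy as np

K_B = 1.380649e-23        # Boltzmann constant, J/K
G0 = 9.81                 # gravity, m/s^2 (held constant for this rough sketch)
AMU = 1.66053906660e-27   # atomic mass unit, kg

def diffusive_density(n0, z0_km, z_km, T_kelvin, mass_amu):
    """Isothermal diffusive-equilibrium number density:
        n(z) = n0 * exp(-(z - z0) / H),  H = k*T / (m*g)
    where n0 is the density at reference altitude z0_km."""
    H_m = K_B * T_kelvin / (mass_amu * AMU * G0)   # species scale height, m
    return n0 * np.exp(-(z_km - z0_km) * 1e3 / H_m)
```

At 1000 K, atomic O (16 amu) has a scale height of roughly 50 km, so its density drops by about a factor of e every ~50 km, while atomic H (1 amu) decays sixteen times more slowly, consistent with the light species dominating at the highest altitudes.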

     