Title: SATELLITE DRAG COEFFICIENT MODELING AND ORBIT UNCERTAINTY QUANTIFICATION USING STOCHASTIC MACHINE LEARNING TECHNIQUES
The rapidly increasing congestion of the low Earth orbit environment makes modeling the uncertainty in the atmospheric drag force a critical task, affecting space situational awareness (SSA) activities such as collision probability estimation. A key element of atmospheric drag modeling is assessing the uncertainty in the estimated atmospheric drag coefficient. Although atmospheric drag coefficients for space objects with known characteristics can be computed numerically, the computational cost is too large for practical applications. In this work, we use cost-effective, data-driven stochastic methods to model the drag coefficients of objects in the low Earth orbit (LEO) region. The training data are generated with the numerical Test Particle Monte Carlo (TPMC) method, using the Cercignani–Lampis–Lord (CLL) gas–surface interaction (GSI) model. Mehta et al. [1] used a Gaussian process regression (GPR) model to predict the satellite drag coefficient but did not estimate the predictive uncertainty. The first part of this research extends the work of Mehta et al. [1] by fitting a GPR model to the training data and estimating the predictive uncertainty. The results of the GPR fit are then compared against a deep neural network (DNN) model aided by the Monte Carlo dropout approach. To the best of our knowledge, this is the first study to use this stochastic deep learning algorithm to quantify the predictive uncertainty of the estimated satellite drag coefficient. Beyond model accuracy, we also calibrate the models. Simulations are carried out for a spherical satellite and then for the CHAMP satellite. Finally, the effect of drag coefficient uncertainty on orbit prediction is quantified for different solar and geomagnetic activity levels.
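As a concrete illustration of the surrogate-modeling step described above, the sketch below fits a GPR model to drag-coefficient training data and reports a predictive mean and standard deviation. It is only a minimal example: the input features (speed, atmospheric and wall temperatures, accommodation coefficient), their ranges, and the synthetic targets standing in for TPMC/CLL outputs are illustrative assumptions, not the configuration used in this work.

```python
# Minimal GPR sketch (not the authors' pipeline): fit a surrogate to
# drag-coefficient training data and report predictive uncertainty.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel, WhiteKernel

rng = np.random.default_rng(0)

# Hypothetical inputs: [speed (km/s), T_atm (K), T_wall (K), accommodation alpha]
X = rng.uniform([7.0, 600.0, 250.0, 0.7], [8.0, 1400.0, 350.0, 1.0], size=(200, 4))
# Placeholder targets standing in for TPMC/CLL drag-coefficient outputs
y = 2.2 + 0.3 * np.sin(X[:, 0]) + 1e-4 * (X[:, 1] - 1000.0) + 0.02 * rng.standard_normal(200)

kernel = ConstantKernel(1.0) * RBF(length_scale=[0.5, 200.0, 50.0, 0.1]) \
         + WhiteKernel(noise_level=1e-4)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

X_test = rng.uniform([7.0, 600.0, 250.0, 0.7], [8.0, 1400.0, 350.0, 1.0], size=(5, 4))
mean, std = gpr.predict(X_test, return_std=True)   # predictive mean and 1-sigma
for m, s in zip(mean, std):
    print(f"C_D = {m:.3f} +/- {1.96 * s:.3f}  (approx. 95% interval)")
```

The same test inputs can also be fed to a dropout-enabled DNN to compare the two predictive distributions, as done in the study.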
Award ID(s):
1726534
NSF-PAR ID:
10315463
Journal Name:
American Astronomical Society meeting
ISSN:
2152-887X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract

    Machine learning (ML) has been applied to space weather problems with increasing frequency in recent years, driven by an influx of in-situ measurements and a desire to improve modeling and forecasting capabilities throughout the field. Space weather originates from solar perturbations and comprises the complex variations these perturbations cause within the numerous systems between the Sun and Earth. These systems are often tightly coupled and not well understood. This creates a need for skillful models that also convey the confidence of their predictions. One example of a dynamical system highly impacted by space weather is the thermosphere, the neutral region of Earth's upper atmosphere. Our inability to forecast it has severe repercussions for satellite drag and for computing the probability of collision between two space objects in low Earth orbit (LEO), which informs decision making in space operations. Even with an (assumed) perfect forecast of model drivers, our incomplete knowledge of the system often results in inaccurate thermospheric neutral mass density predictions. Continuing efforts are being made to improve model accuracy, but density models rarely provide estimates of confidence in their predictions. In this work, we propose two techniques to develop nonlinear ML regression models that predict thermospheric density while providing robust and reliable uncertainty estimates: Monte Carlo (MC) dropout and direct prediction of the probability distribution, both using the negative logarithm of predictive density (NLPD) loss function. We demonstrate the performance of models trained on both local and global datasets and show that the NLPD loss provides similar results for both techniques, while the direct probability distribution prediction method has a much lower computational cost. For the global model regressed on the Space Environment Technologies High Accuracy Satellite Drag Model (HASDM) density database, we achieve errors of approximately 11% on independent test data with well-calibrated uncertainty estimates. Using an in-situ CHAllenging Minisatellite Payload (CHAMP) density dataset, models developed using both techniques provide test errors on the order of 13%. The CHAMP models, on both validation and test data, are within 2% of perfect calibration for the twenty prediction intervals tested. We show that this model can also be used to obtain global density predictions with uncertainties at a given epoch.
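As a rough illustration of the "direct prediction of the probability distribution" technique mentioned above, the sketch below shows a small network that outputs a mean and a log-variance and is trained with a Gaussian NLPD loss. The architecture, input drivers, and toy data are assumptions for illustration only, not the HASDM or CHAMP models described in the abstract.

```python
# Illustrative sketch: a network that directly predicts a Gaussian distribution
# over (log-)density and is trained with the NLPD loss.
import torch
import torch.nn as nn

class DensityNet(nn.Module):
    def __init__(self, n_inputs: int, n_hidden: int = 64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(n_inputs, n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, n_hidden), nn.ReLU(),
        )
        self.mean_head = nn.Linear(n_hidden, 1)     # predicted mean
        self.logvar_head = nn.Linear(n_hidden, 1)   # predicted log-variance

    def forward(self, x):
        h = self.body(x)
        return self.mean_head(h), self.logvar_head(h)

def nlpd_loss(mean, logvar, y):
    # Negative log predictive density of a Gaussian, up to an additive constant
    return 0.5 * (logvar + (y - mean) ** 2 / torch.exp(logvar)).mean()

# Toy usage with hypothetical drivers (e.g. F10.7, ap, local time, latitude)
x = torch.randn(32, 8)
y = torch.randn(32, 1)
model = DensityNet(n_inputs=8)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
mean, logvar = model(x)
loss = nlpd_loss(mean, logvar, y)
loss.backward()
opt.step()
```

The MC-dropout variant uses the same loss but requires many stochastic forward passes at inference, which is why the direct-prediction approach is cheaper.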

     
  2. Abstract. Mesoscale dynamics in the mesosphere and lower thermosphere (MLT) region have been difficult to study from either ground- or satellite-based observations. For understanding of atmospheric coupling processes, important spatial scales at these altitudes range between tens and hundreds of kilometers in the horizontal plane. To date, this scale size is challenging observationally, so structures are usually parameterized in global circulation models. The advent of multistatic specular meteor radar networks allows exploration of MLT mesoscale dynamics on these scales using an increased number of detections and a diversity of viewing angles inherent to multistatic networks. In this work, we introduce a four-dimensional wind field inversion method that makes use of Gaussian process regression (GPR), which is a nonparametric and Bayesian approach. The method takes measured projected wind velocities and prior distributions of the wind velocity as a function of space and time, specified by the user or estimated from the data, and produces posterior distributions for the wind velocity. The predictive posterior distribution is computed at sampled points of interest, which need not be regularly gridded. The main benefits of the GPR method include this non-gridded sampling, the built-in statistical uncertainty estimates, and the ability to horizontally resolve winds on relatively small scales. The performance of the GPR implementation has been evaluated on Monte Carlo simulations with known distributions using the same spatial and temporal sampling as 1 d of real meteor measurements. Based on the simulation results we find that the GPR implementation is robust, providing wind fields that are statistically unbiased with statistical variances that depend on the geometry and are proportional to the prior velocity variances. A conservative and fast approach can be straightforwardly implemented by employing overestimated prior variances and distances, while a more robust but computationally intensive approach can be implemented by employing training and fitting of model hyperparameters. The latter GPR approach has been applied to a 24 h dataset and shown to compare well to previously used homogeneous and gradient methods. Small-scale features have reasonably low statistical uncertainties, implying geophysical wind field horizontal structures on scales as small as 20–50 km. We suggest that this GPR approach forms a suitable method for MLT regional and weather studies.
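For reference, a minimal sketch of the underlying GPR computation, the posterior mean and variance at arbitrary (non-gridded) points given noisy wind observations, is given below. The squared-exponential kernel, length scales, noise level, and toy measurements are assumptions; the projection of winds onto per-meteor Bragg vectors used in the actual inversion is omitted.

```python
# Minimal sketch of a GP posterior over a single wind component in (x, y, z, t).
import numpy as np

def sq_exp_kernel(A, B, length_scales, sigma_f=10.0):
    # Squared-exponential covariance with per-axis length scales
    d = (A[:, None, :] - B[None, :, :]) / length_scales
    return sigma_f**2 * np.exp(-0.5 * np.sum(d**2, axis=-1))

rng = np.random.default_rng(1)
ells = np.array([50.0, 50.0, 3.0, 900.0])     # assumed prior scales: km, km, km, s
X_obs = rng.uniform([0, 0, 80, 0], [300, 300, 100, 3600], size=(150, 4))
y_obs = 20 * np.sin(X_obs[:, 0] / 100) + rng.normal(0, 3, 150)   # toy wind data

K = sq_exp_kernel(X_obs, X_obs, ells) + 3.0**2 * np.eye(len(X_obs))  # + noise var
X_star = rng.uniform([0, 0, 80, 0], [300, 300, 100, 3600], size=(10, 4))  # non-gridded
K_s = sq_exp_kernel(X_star, X_obs, ells)

post_mean = K_s @ np.linalg.solve(K, y_obs)
post_var = sq_exp_kernel(X_star, X_star, ells).diagonal() \
           - np.einsum('ij,ji->i', K_s, np.linalg.solve(K, K_s.T))
print(post_mean, np.sqrt(post_var))
```

The prior variances and length scales play the role of the user-specified (or fitted) hyperparameters discussed in the abstract.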
  3. Abstract

    The space weather research community relies heavily on thermospheric density data to understand long-term thermospheric variability, to construct assimilative, empirical, and semiempirical global atmospheric models, and to validate model performance. One of the challenges in resolving accurate thermospheric density data sets from satellite orbital drag measurements is modeling appropriate physical aerodynamic drag force coefficients. The drag coefficient may change throughout the thermosphere due to model dependencies on composition and altitude. As such, existing drag coefficient model errors and corresponding errors in orbit-derived density data sets and models may be altitude and solar cycle dependent, with greater errors at higher altitudes around 500 km near the oxygen-to-helium transition region. In this paper, inter-satellite observed-to-modeled density comparisons at ∼500 km are evaluated to constrain drag coefficient modeling assumptions. Observed densities are derived from accelerometer data for the Gravity Recovery and Climate Experiment (GRACE) satellites and Two-Line Element data for a set of compact satellites, while the NRLMSISE-00 atmospheric model is used to obtain modeled densities and composition information. Density consistency results indicate that drag coefficient models with incomplete energy and momentum accommodation produce the most consistent densities, while the standard diffuse modeling approach may not be appropriate at these altitudes. Models with momentum accommodation between 0.5 and 0.9 and energy accommodation between 0.83 and 0.96 may be most appropriate at upper thermospheric altitudes. Modeling drag coefficients with diffuse gas-surface interactions for the GRACE satellites could lead to derived-density errors of ∼25% and to in-track satellite orbit prediction uncertainties on the order of kilometers during solar maximum conditions.
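The ∼25% figure follows directly from how densities are derived from drag accelerations: the derived density scales inversely with the assumed drag coefficient, so a relative error in the modeled drag coefficient maps one-to-one into a relative error in the derived density. The back-of-the-envelope sketch below illustrates this with hypothetical numbers; the acceleration, area-to-mass ratio, and drag coefficients are illustrative, not GRACE-specific values.

```python
# Back-of-the-envelope sketch: density derived from a measured drag
# acceleration scales as 1/CD, from a = 0.5 * rho * CD * (A/m) * v^2.
a_drag = 2.0e-7        # measured along-track drag acceleration [m/s^2] (assumed)
A_over_m = 0.002       # projected area-to-mass ratio [m^2/kg] (assumed)
v_rel = 7600.0         # velocity relative to the atmosphere [m/s]

def derived_density(cd):
    # rho = 2 a / (CD * (A/m) * v^2)
    return 2.0 * a_drag / (cd * A_over_m * v_rel**2)

rho_diffuse = derived_density(cd=3.0)   # e.g. fully diffuse GSI assumption
rho_partial = derived_density(cd=2.4)   # e.g. incomplete accommodation
print(f"relative density difference: {(rho_partial - rho_diffuse) / rho_diffuse:.1%}")
```

With these illustrative drag coefficients the derived densities differ by 25%, the same order as the error quoted in the abstract.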

     
  4. Following an earthquake, ground motion time series are needed to carry out site-specific nonlinear response history analysis. However, recording instruments are sparsely distributed; thus, the ground motion time series at uninstrumented sites must be estimated. Tamhidi et al. developed a Gaussian process regression (GPR) model to generate ground motion time series given a set of recorded ground motions surrounding the target site. This GPR model interpolates the observed ground motions' Fourier transform coefficients to generate the target site's Fourier spectrum and the corresponding time series. The robustness of the model's optimized hyperparameter depends on the surrounding observation density. In this study, we carried out a sensitivity analysis and tuned the hyperparameter of the GPR model for various observation densities. The 2019 M7.1 Ridgecrest and 2020 M4.5 South El Monte earthquake data sets recorded by the Community Seismic Network and California Integrated Seismic Network in Southern California are used to demonstrate the process. To provide a tool for quantifying the uncertainty of the generated motions, a methodology to develop realizations of ground motion time series is also incorporated. The results illustrate that the uncertainty of the generated motions is lower at longer periods. It is shown that the observation density in the proximity of the target site plays a vital role in reducing both the error and the uncertainty of the generated time series. To demonstrate the concept, the effect of additional observations from combined recording networks is investigated.
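A conceptual sketch of the realization-generation step is shown below: given GPR posterior means and standard deviations for the target site's Fourier coefficients, time-series realizations are drawn by sampling the coefficients and inverse-transforming. The arrays, record length, and sampling scheme are placeholders, not the Tamhidi et al. implementation.

```python
# Conceptual sketch: sample Fourier coefficients from their GPR posterior and
# invert to obtain ground motion time-series realizations.
import numpy as np

rng = np.random.default_rng(0)
n_freq, n_samples = 513, 1024     # one-sided spectrum length for a 1024-point record

# Hypothetical GPR outputs at the target site (real/imag parts: mean and std)
coef_mean = rng.normal(0, 1, (n_freq, 2))
coef_std = 0.2 * np.abs(coef_mean) + 0.05

def sample_time_series():
    # Draw one realization of the Fourier coefficients, then invert to time domain
    c = rng.normal(coef_mean, coef_std)
    spectrum = c[:, 0] + 1j * c[:, 1]
    return np.fft.irfft(spectrum, n=n_samples)

realizations = np.array([sample_time_series() for _ in range(100)])
# The spread across realizations quantifies the uncertainty of the generated motion
print(realizations.mean(axis=0).shape, realizations.std(axis=0).max())
```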
  5. Abstract

    Recently, recurrent deep networks have shown promise in harnessing newly available satellite‐sensed data for long‐term soil moisture projections. However, to be useful in forecasting, deep networks must also provide uncertainty estimates. Here we evaluated Monte Carlo dropout with an input‐dependent data noise term (MCD+N), an efficient uncertainty estimation framework originally developed in computer vision, for hydrologic time series predictions. MCD+N simultaneously estimates a heteroscedastic input‐dependent data noise term (a trained error model attributable to observational noise) and a network weight uncertainty term (attributable to insufficiently constrained model parameters). Although MCD+N has appealing features, many heuristic approximations were employed during its derivation, and rigorous evaluations and evidence of its asserted capability to detect dissimilarity were lacking. To address this, we provided an in‐depth evaluation of the scheme's potential and limitations. We showed that for reproducing soil moisture dynamics recorded by the Soil Moisture Active Passive (SMAP) mission, MCD+N indeed gave a good estimate of predictive error, provided that we tuned a hyperparameter and used a representative training data set. The input‐dependent term responded strongly to observational noise, while the model term clearly acted as a detector for physiographic dissimilarity from the training data, behaving as intended. However, when the training and test data were characteristically different, the input‐dependent term could be misled, undermining its reliability. Additionally, due to the data‐driven nature of the model, data noise also influences network weight uncertainty, and therefore the two uncertainty terms are correlated. Overall, this approach has promise, but care is needed to interpret the results.
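For context, the sketch below outlines an MCD+N-style predictor of the kind the abstract evaluates: dropout stays active at inference, the network predicts both a mean and an input-dependent noise log-variance, and the predictive variance combines the spread of the MC means (weight uncertainty) with the mean predicted noise variance (data uncertainty). The architecture and sizes are illustrative assumptions, not the paper's soil moisture model.

```python
# Minimal MCD+N-style sketch: MC dropout plus an input-dependent noise term.
import torch
import torch.nn as nn

class MCDNet(nn.Module):
    def __init__(self, n_in, n_hidden=64, p_drop=0.2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(n_in, n_hidden), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(n_hidden, n_hidden), nn.ReLU(), nn.Dropout(p_drop),
        )
        self.mean = nn.Linear(n_hidden, 1)
        self.logvar = nn.Linear(n_hidden, 1)   # input-dependent data noise

    def forward(self, x):
        h = self.body(x)
        return self.mean(h), self.logvar(h)

@torch.no_grad()
def predict_with_uncertainty(model, x, n_mc=50):
    model.train()                      # keep dropout active for MC sampling
    means, noise_vars = [], []
    for _ in range(n_mc):
        m, lv = model(x)
        means.append(m)
        noise_vars.append(torch.exp(lv))
    means = torch.stack(means)                      # (n_mc, batch, 1)
    weight_var = means.var(dim=0)                   # network weight uncertainty
    data_var = torch.stack(noise_vars).mean(dim=0)  # input-dependent noise term
    return means.mean(dim=0), weight_var + data_var

model = MCDNet(n_in=10)
mu, total_var = predict_with_uncertainty(model, torch.randn(8, 10))
```

Because both terms are learned from the same data, the decomposition is only approximate, which is consistent with the correlation between the two uncertainty terms noted above.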

     