-
Zhou, Yu (Ed.)
A promising approach for scalable Gaussian processes (GPs) is the Karhunen-Loève (KL) decomposition, in which the GP kernel is represented by a set of basis functions that are the eigenfunctions of the kernel operator. Such decomposed kernels have the potential to be very fast and do not depend on the selection of a reduced set of inducing points. However, KL decompositions lead to high dimensionality, and variable selection thus becomes paramount. This paper reports a new method of forward variable selection, enabled by the ordered nature of the basis functions in the KL expansion of the Bayesian Smoothing Spline ANOVA (BSS-ANOVA) kernel, coupled with fast Gibbs sampling in a fully Bayesian approach. It quickly and effectively limits the number of terms, yielding a method with competitive accuracy and training and inference times for tabular datasets of low feature-set dimensionality. Theoretical computational complexities are given for training and for per-point inference, where N is the number of instances and P the number of expansion terms. The inference speed and accuracy make the method especially useful for dynamic systems identification: the dynamics are modeled in the tangent space as a static problem, and the learned dynamics are then integrated using a high-order scheme. The methods are demonstrated on two dynamic datasets: a ‘Susceptible, Infected, Recovered’ (SIR) toy problem and the experimental ‘Cascaded Tanks’ benchmark dataset. Comparisons on the static prediction of time derivatives are made with a random forest (RF), a residual neural network (ResNet), and the Orthogonal Additive Kernel (OAK) inducing-points scalable GP, while for the time-series prediction comparisons are made with LSTM and GRU recurrent neural networks (RNNs) and the SINDy package.
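The method summarized above rests on a fully Bayesian linear-in-the-basis model: once the kernel is expanded in ordered eigenfunctions, GP regression reduces to sampling the expansion weights and noise variance with conjugate Gibbs updates. The following is a minimal sketch of that idea under stated assumptions: a cosine basis, one-dimensional toy data, and generic hyperparameters stand in for the paper's actual BSS-ANOVA eigenfunctions and forward-selection rule, which are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D data: noisy sine on [0, 1]
N = 200
x = rng.uniform(0, 1, N)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(N)

# Ordered stand-in basis (cosines); the paper uses BSS-ANOVA eigenfunctions
P = 8
Phi = np.cos(np.pi * np.outer(x, np.arange(P)))  # N x P design matrix

# Gibbs sampling for the conjugate model y = Phi @ w + noise,
# with w ~ N(0, tau2 * I) and sigma2 ~ inverse-gamma(a0, b0)
tau2 = 10.0          # assumed prior variance on the weights
sigma2 = 0.1         # initial noise variance
a0, b0 = 1.0, 1.0    # assumed inverse-gamma hyperparameters
samples = []
for it in range(500):
    # w | sigma2, y: multivariate normal (conjugate update)
    A = Phi.T @ Phi / sigma2 + np.eye(P) / tau2
    cov = np.linalg.inv(A)
    mean = cov @ Phi.T @ y / sigma2
    w = rng.multivariate_normal(mean, cov)
    # sigma2 | w, y: inverse gamma, sampled as 1 / gamma
    resid = y - Phi @ w
    sigma2 = 1.0 / rng.gamma(a0 + N / 2, 1.0 / (b0 + 0.5 * resid @ resid))
    if it >= 250:  # discard burn-in
        samples.append(w)

w_post = np.mean(samples, axis=0)  # posterior-mean weights
```

Because the basis is ordered by decreasing eigenvalue, a forward-selection scheme of the kind the abstract describes can grow P term by term and stop when added terms no longer improve the fit; prediction at a new point costs only one P-term dot product.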
-
Density-functional theory (DFT) is widely used to predict the physical properties of materials. However, it usually fails for strongly correlated materials. A popular solution is to use the Hubbard correction to treat strongly correlated electronic states. Unfortunately, the values of the Hubbard U and J parameters are initially unknown, and they can vary from one material to another. In this semi-empirical study, we explore the U and J parameter space of a group of iron-based compounds to simultaneously improve the prediction of physical properties (volume, magnetic moment, and bandgap). We used a Bayesian calibration assisted by Markov chain Monte Carlo sampling for three different exchange-correlation functionals (LDA, PBE, and PBEsol). We found that LDA requires the largest U correction. PBE has the smallest standard deviation, and its U and J parameters are the most transferable to other iron-based compounds. Lastly, PBE predicts lattice parameters reasonably well without the Hubbard correction.
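The calibration described above amounts to sampling a posterior over (U, J) given measured material properties. The sketch below shows the shape of such a workflow with a random-walk Metropolis sampler; the `surrogate` function, target values, noise levels, and prior bounds are all hypothetical stand-ins (a real study would run a DFT+U calculation at each proposed (U, J) and compare against experimental volume, moment, and bandgap).

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical linear surrogate standing in for a DFT+U calculation:
# maps (U, J) to predicted (volume, magnetic moment, bandgap).
def surrogate(U, J):
    volume = 11.8 + 0.05 * U - 0.02 * J
    moment = 3.5 + 0.10 * U - 0.05 * J
    gap = 0.2 * U - 0.1 * J
    return np.array([volume, moment, gap])

target = np.array([11.98, 3.85, 0.7])  # illustrative "experimental" values
noise = np.array([0.1, 0.1, 0.2])      # assumed observation noise

def log_post(theta):
    """Gaussian likelihood on the three properties, uniform prior box."""
    U, J = theta
    if not (0.0 <= U <= 10.0 and 0.0 <= J <= 2.0):
        return -np.inf
    r = (surrogate(U, J) - target) / noise
    return -0.5 * np.sum(r ** 2)

# Random-walk Metropolis over (U, J)
theta = np.array([5.0, 1.0])
lp = log_post(theta)
chain = []
for it in range(5000):
    prop = theta + rng.normal(0.0, [0.3, 0.1])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # accept/reject step
        theta, lp = prop, lp_prop
    chain.append(theta.copy())

chain = np.array(chain[1000:])  # discard burn-in
U_mean, J_mean = chain.mean(axis=0)
```

The posterior spread in the chain plays the role of the per-functional standard deviation the abstract compares across LDA, PBE, and PBEsol; running the same calibration per functional and per compound is what makes the transferability comparison possible.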
