

Title: Seeking stability by being lazy and shallow: lazy and shallow instantiation is user friendly
Designing a language feature often requires a choice between several, similarly expressive possibilities. Given that user studies are generally impractical, we propose using stability as a way of making such decisions. Stability is a measure of whether the meaning of a program alters under small, seemingly innocuous changes in the code (e.g., inlining). Directly motivated by a need to pin down a feature in GHC/Haskell, we apply this notion of stability to analyse four approaches to the instantiation of polymorphic types, concluding that the most stable approach is lazy (instantiate a polytype only when absolutely necessary) and shallow (instantiate only top-level type variables, not variables that appear after explicit arguments).
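The shallow/lazy behaviour the abstract describes can be sketched in a few lines of Haskell. This is a minimal illustration, not an example from the paper: the names `pairUp`, `pairWithOne`, and `example` are invented here, and the behaviour assumes GHC 9.0 or later, where deep eager instantiation was replaced by the shallow scheme.

```haskell
{-# LANGUAGE RankNTypes #-}
{-# LANGUAGE TypeApplications #-}
module Main where

-- A type with a nested quantifier: 'forall b' appears *after*
-- the explicit argument of type 'a'.
pairUp :: forall a. a -> forall b. b -> (a, b)
pairUp x y = (x, y)

-- Shallow instantiation instantiates only the top-level 'a' when
-- 'pairUp' is applied; the partial application keeps its polytype.
pairWithOne :: forall b. b -> (Int, b)
pairWithOne = pairUp (1 :: Int)

-- Because the inner 'forall b' survives, it can still be fixed
-- explicitly with a visible type application.
example :: (Int, Bool)
example = pairWithOne @Bool True

main :: IO ()
main = print example
```

Under a deep instantiation scheme, the `forall b` in `pairUp`'s type would be instantiated along with `a` at the call site, so the inner quantifier would not be available for the later type application; shallow instantiation leaves it in place.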
Award ID(s):
1704041
PAR ID:
10299114
Author(s) / Creator(s):
;
Editor(s):
Hage, Jurriaan
Date Published:
Journal Name:
Haskell 2021: Proceedings of the 14th ACM SIGPLAN International Symposium on Haskell
Page Range / eLocation ID:
85 to 97
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract We develop a generalized interpolation material point method (GIMPM) for the shallow shelf approximation (SSA) of ice flow. The GIMPM, which can be viewed as a particle version of the finite element method, is used here to solve the shallow shelf approximations of the momentum balance and ice thickness evolution equations. We introduce novel numerical schemes for particle splitting and integration at domain boundaries to accurately simulate the spreading of an ice shelf. The advantages of the proposed GIMPM‐SSA framework include efficient advection of history or internal state variables without diffusion errors, automated tracking of the ice front and grounding line at sub‐element scales, and a weak formulation based on well‐established conventions of the finite element method with minimal additional computational cost. We demonstrate the numerical accuracy and stability of the GIMPM using 1‐D and 2‐D benchmark examples. We also compare the accuracy of the GIMPM with the standard material point method (sMPM) and a reweighted form of the sMPM. We find that the grid‐crossing error is very severe for SSA simulations with the sMPM, whereas the GIMPM successfully mitigates this error. While the grid‐crossing error can be reasonably reduced in the sMPM by implementing a simple material point reweighting scheme, this approach is not as accurate as the GIMPM. Thus, we illustrate that the GIMPM‐SSA framework is viable for the simulation of ice sheet‐shelf evolution and enables boundary tracking and error‐free advection of history or state variables, such as ice thickness or damage.
  2. As opaque predictive models increasingly impact many areas of modern life, interest in quantifying the importance of a given input variable for making a specific prediction has grown. Recently, there has been a proliferation of model-agnostic methods to measure variable importance (VI) that analyze the difference in predictive power between a full model trained on all variables and a reduced model that excludes the variable(s) of interest. A bottleneck common to these methods is the estimation of the reduced model for each variable (or subset of variables), which is an expensive process that often does not come with theoretical guarantees. In this work, we propose a fast and flexible method for approximating the reduced model with important inferential guarantees. We replace the need for fully retraining a wide neural network by a linearization initialized at the full model parameters. By adding a ridge-like penalty to make the problem convex, we prove that when the ridge penalty parameter is sufficiently large, our method estimates the variable importance measure with an error rate of O(1/n) where n is the number of training samples. We also show that our estimator is asymptotically normal, enabling us to provide confidence bounds for the VI estimates. We demonstrate through simulations that our method is fast and accurate under several data-generating regimes, and we demonstrate its real-world applicability on a seasonal climate forecasting example. 
  3. We focus on developing a theoretical understanding of meta-learning. Given multiple tasks drawn i.i.d. from some (unknown) task distribution, the goal is to find a good pre-trained model that can be adapted to a new, previously unseen, task with little computational and statistical overhead. We introduce a novel notion of stability for meta-learning algorithms, namely uniform meta-stability. We instantiate two uniformly meta-stable learning algorithms based on regularized empirical risk minimization and gradient descent and give explicit generalization bounds for convex learning problems with smooth losses and for weakly convex learning problems with non-smooth losses. Finally, we extend our results to stochastic and adversarially robust variants of our meta-learning algorithm. 
  4. The objective of this study is to develop data-driven predictive models for peak rotation and factor of safety for tipping-over failure of rocking shallow foundations during earthquake loading using multiple nonlinear machine learning (ML) algorithms and a supervised learning technique. Centrifuge and shaking table experimental results on rocking foundations have been used for the development of k-nearest neighbors regression (KNN), support vector regression (SVR), and random forest regression (RFR) models. The input features to ML models include critical contact area ratio of foundation; slenderness ratio and rocking coefficient of rocking system; peak ground acceleration and Arias intensity of earthquake motion; and a categorical binary feature that separates sandy soil foundations from clayey soil foundations. Based on repeated k-fold cross validation tests of models, we found that the overall average mean absolute percentage errors (MAPE) in predictions of all three nonlinear ML models varied between 0.46 and 0.60, outperforming a baseline multivariate linear regression ML model with corresponding MAPE of 0.68 to 0.75. The input feature importance analysis reveals that the peak rotation and tipping-over stability of rocking foundations are more sensitive to ground motion demand parameters than to rocking foundation capacity parameters or type of soil. 
  5. Abstract In this study, we carry out robust optimal design for the machining operations, one key process in wafer polishing in chip manufacturing, aiming to avoid the peculiar regenerative chatter and maximize the material removal rate (MRR) considering the inherent material and process uncertainty. More specifically, we characterize the cutting tool dynamics using a delay differential equation (DDE) and enlist the temporal finite element method (TFEM) to derive its approximate solution and stability index given process settings or design variables. To further quantify the inherent uncertainty, replications of TFEM under different realizations of random uncontrollable variables are performed, which however incurs extra computational burden. To eschew the deployment of such a crude Monte Carlo (MC) approach at each design setting, we integrate the stochastic TFEM with a stochastic surrogate model, stochastic kriging, in an active learning framework to sequentially approximate the stability boundary. The numerical result suggests that the nominal stability boundary attained from this method is on par with that from the crude MC, but only demands a fraction of the computational overhead. To further ensure the robustness of process stability, we adopt another surrogate, the Gaussian process, to predict the variance of the stability index at unexplored design points and identify the robust stability boundary per the conditional value at risk (CVaR) criterion. Therefrom, an optimal design in the robust stable region that maximizes the MRR can be identified. 