

Search for: All records

Award ID contains: 1835443

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).

Some links on this page may take you to non-federal websites. Their policies may differ from this site.

  1. Abstract

    Polynomial nonnegativity constraints can often be handled using the sum of squares condition. This can be efficiently enforced using semidefinite programming formulations or, as more recently proposed by Papp and Yildiz (SIAM J. Optim. 29:822–851, 2019), by using the sum of squares cone directly in an interior point algorithm. Beyond nonnegativity, more complicated polynomial constraints (in particular, generalizations of the positive semidefinite, second order, and $\ell_1$-norm cones) can also be modeled through structured sum of squares programs. We take a different approach and propose using more specialized cones instead. This can result in lower dimensional formulations, more efficient oracles for interior point methods, or self-concordant barriers with smaller parameters. (An illustrative sketch of the standard sum of squares SDP formulation appears after the record list.)

     
  2. Free, publicly-accessible full text available February 1, 2025
  3. We study the problem of decomposing a polynomial $p$ into a sum of $r$ squares by minimizing a quadratically penalized objective $f_p(u) = \left\| \sum_{i=1}^{r} u_i^2 - p \right\|^2$. This objective is nonconvex and is equivalent to the rank-$r$ Burer-Monteiro factorization of a semidefinite program (SDP) encoding the sum of squares decomposition. We show that for all univariate polynomials $p$, if $r \ge 2$ then $f_p(u)$ has no spurious second-order critical points, showing that all local optima are also global optima. This is in contrast to previous work showing that for general SDPs, in addition to genericity conditions, $r$ has to be roughly the square root of the number of constraints (the degree of $p$) for there to be no spurious second-order critical points. Our proof uses tools from computational algebraic geometry and can be interpreted as constructing a certificate using the first- and second-order necessary conditions. We also show that by choosing a norm based on sampling equally-spaced points on the circle, the gradient $\nabla f_p$ can be computed in nearly linear time using fast Fourier transforms. Experimentally we demonstrate that this method has very fast convergence using first-order optimization algorithms such as L-BFGS, with near-linear scaling to million-degree polynomials. (An illustrative optimization sketch appears after the record list.)
    Free, publicly-accessible full text available September 30, 2024
  4. JuMP is an algebraic modeling language embedded in the Julia programming language. JuMP allows users to model optimization problems of a variety of kinds, including linear programming, integer programming, conic optimization, semidefinite programming, and nonlinear programming, and handles the low-level details of communicating with solvers. After nearly 10 years in development, JuMP 1.0 was released in March 2022. In this short communication, we highlight the improvements to JuMP from recent releases up to and including 1.0. (A small JuMP modeling example appears after the record list.)
    Free, publicly-accessible full text available September 1, 2024
  5. Free, publicly-accessible full text available May 31, 2024
  6. Spectral functions on Euclidean Jordan algebras arise frequently in convex optimization models. Despite the success of primal-dual conic interior point solvers, there has been little work on enabling direct support for spectral cones, that is, proper nonsymmetric cones defined from epigraphs and perspectives of spectral functions. We propose simple logarithmically homogeneous barriers for spectral cones and we derive efficient, numerically stable procedures for evaluating barrier oracles such as inverse Hessian operators. For two useful classes of spectral cones, the root-determinant cones and the matrix monotone derivative cones, we show that the barriers are self-concordant, with nearly optimal parameters. We implement these cones and oracles in our open-source solver Hypatia, and we write simple, natural formulations for four applied problems. Our computational benchmarks demonstrate that Hypatia often solves the natural formulations more efficiently than advanced solvers such as MOSEK 9 solve equivalent extended formulations written using only the cones these solvers support. Funding: This work was supported by the Office of Naval Research [Grant N00014-18-1-2079] and the National Science Foundation [Grant OAC-1835443]. (An illustrative conic modeling sketch appears after the record list.)
  7. Automatic differentiation (AD), a technique for constructing new programs which compute the derivative of an original program, has become ubiquitous throughout scientific computing and deep learning due to the improved performance afforded by gradient-based optimization. However, AD systems have been restricted to the subset of programs that have a continuous dependence on parameters. Programs that have discrete stochastic behaviors governed by distribution parameters, such as flipping a coin with probability p of being heads, pose a challenge to these systems because the connection between the result (heads vs. tails) and the parameters (p) is fundamentally discrete. In this paper we develop a new reparameterization-based methodology that allows for generating programs whose expectation is the derivative of the expectation of the original program. We showcase how this method gives an unbiased and low-variance estimator which is as automated as traditional AD mechanisms. We demonstrate unbiased forward-mode AD of discrete-time Markov chains, agent-based models such as Conway's Game of Life, and unbiased reverse-mode AD of a particle filter. Our code package is available at https://github.com/gaurav-arya/StochasticAD.jl. (An illustrative StochasticAD.jl sketch appears after the record list.)
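
Illustrative sketch for record 1: the standard route that the abstract contrasts against is to enforce a sum of squares constraint through a semidefinite program with a positive semidefinite Gram matrix. A minimal sketch in Julia, assuming JuMP and the SCS solver are installed (the polynomial p(x) = x^4 - 2x^2 + 1 is chosen only for illustration, not taken from the paper):

    using JuMP, SCS

    # Certify that p(x) = x^4 - 2x^2 + 1 is a sum of squares by finding a PSD
    # Gram matrix Q with p(x) = [1, x, x^2] * Q * [1, x, x^2]'.
    model = Model(SCS.Optimizer)
    @variable(model, Q[1:3, 1:3], PSD)
    @constraint(model, Q[1, 1] == 1)              # constant term
    @constraint(model, 2Q[1, 2] == 0)             # coefficient of x
    @constraint(model, 2Q[1, 3] + Q[2, 2] == -2)  # coefficient of x^2
    @constraint(model, 2Q[2, 3] == 0)             # coefficient of x^3
    @constraint(model, Q[3, 3] == 1)              # coefficient of x^4
    optimize!(model)
    println(termination_status(model))            # OPTIMAL means a Gram certificate was found

Any feasible Q factors as V'V, which recovers explicit squares; the paper's proposal is to avoid this lift to a matrix variable by working with more specialized cones inside the interior point method.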
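Illustrative sketch for record 3: the decomposition can be attempted directly by minimizing the penalized objective with a first-order method. A minimal sketch in Julia using the plain coefficient norm and Optim.jl's L-BFGS (not the paper's FFT-based sampling norm; the polynomial and r are assumptions for illustration):

    using Optim

    p = [1.0, 0.0, 2.0, 0.0, 1.0]   # p(x) = 1 + 2x^2 + x^4, coefficients from degree 0 up
    r, half = 2, 3                  # r factors, each with 3 coefficients (degree 2)

    # Coefficient vector of the product of two polynomials.
    polymul(a, b) = [sum(a[j] * b[k - j + 1] for j in max(1, k - length(b) + 1):min(k, length(a)))
                     for k in 1:(length(a) + length(b) - 1)]

    # f_p(u) = || sum_i u_i^2 - p ||^2 in the coefficient norm.
    function fp(u)
        U = reshape(u, half, r)                       # column i holds the coefficients of u_i
        s = sum(polymul(U[:, i], U[:, i]) for i in 1:r)
        return sum(abs2, s - p)
    end

    result = optimize(fp, randn(half * r), LBFGS())   # gradients via finite differences
    println(Optim.minimum(result))                    # near zero when an exact decomposition is found

With r = 2, the abstract's result says there are no spurious second-order critical points for univariate p, so a local method like this can be expected to reach a global minimum.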
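Illustrative sketch for record 4: a small linear program written in the JuMP 1.0 style described in the abstract, assuming the HiGHS solver is installed (the data are made up for illustration):

    using JuMP, HiGHS

    model = Model(HiGHS.Optimizer)
    @variable(model, x >= 0)
    @variable(model, y >= 0)
    @constraint(model, 2x + y <= 10)
    @constraint(model, x + 3y <= 15)
    @objective(model, Max, 3x + 4y)
    optimize!(model)
    println("objective = ", objective_value(model), " at x = ", value(x), ", y = ", value(y))

The same Model/@variable/@constraint/@objective pattern extends to the integer, conic, semidefinite, and nonlinear problem classes the abstract lists, with the solver swapped accordingly.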
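Illustrative sketch for record 6: one of the spectral cones named in the abstract, the root-determinant cone, can be used in a natural formulation instead of an extended semidefinite formulation. A minimal sketch in Julia, assuming JuMP and Hypatia are installed (the data are made up, and depending on the installed versions the cone may be handled natively by Hypatia or rewritten by JuMP's bridges):

    using JuMP, Hypatia

    # Maximize det(X)^(1/2) over 2x2 symmetric PSD matrices with trace at most 1.
    model = Model(Hypatia.Optimizer)
    @variable(model, t)
    @variable(model, X[1:2, 1:2], Symmetric)
    @constraint(model, X[1, 1] + X[2, 2] <= 1)
    @constraint(model, [t, X[1, 1], X[1, 2], X[2, 2]] in MOI.RootDetConeTriangle(2))
    @objective(model, Max, t)
    optimize!(model)
    println(objective_value(model))   # about 0.5, attained at X = I/2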
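Illustrative sketch for record 7: the linked StochasticAD.jl package exposes derivative_estimate for programs with discrete randomness. A minimal sketch, assuming StochasticAD and Distributions are installed (the program and parameter value are made up for illustration):

    using StochasticAD, Distributions, Statistics

    # Discrete stochastic program: number of heads in 10 coin flips with bias p,
    # so E[f(p)] = 10p and d/dp E[f(p)] = 10.
    f(p) = sum(rand(Bernoulli(p)) for _ in 1:10)

    single = derivative_estimate(f, 0.3)                         # one unbiased estimate
    avg = mean(derivative_estimate(f, 0.3) for _ in 1:10_000)    # averaging reduces variance
    println("single estimate: ", single, "   averaged: ", avg)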