Title: Random trigonometric polynomials: Universality and non-universality of the variance for the number of real roots
Award ID(s): 1752345
PAR ID: 10413724
Author(s) / Creator(s):
Date Published:
Journal Name: Annales de l'Institut Henri Poincaré, Probabilités et Statistiques
Volume: 58
Issue: 3
ISSN: 0246-0203
Format(s): Medium: X
Sponsoring Org: National Science Foundation
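The record above carries only the paper's title. As a rough illustration of the quantity it concerns, the following is a minimal Monte Carlo sketch (my own illustration under stated assumptions, not the paper's method): it counts sign changes of a random trigonometric polynomial X_n(t) = Σ_{k=1}^n (a_k cos(kt) + b_k sin(kt)) on a fine grid over [0, 2π], and compares the empirical mean and variance of the root count for Gaussian versus Rademacher coefficients. Sign-change counting misses roots that fall closer together than the grid spacing, so the numbers are only indicative; the degree n, grid resolution, and repetition count are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def count_real_roots(coeff_sampler, n, grid):
    """Count sign changes of X_n(t) = sum_k a_k cos(kt) + b_k sin(kt) on the grid."""
    a, b = coeff_sampler(n), coeff_sampler(n)
    k = np.arange(1, n + 1)
    X = np.cos(np.outer(grid, k)) @ a + np.sin(np.outer(grid, k)) @ b
    return np.count_nonzero(np.diff(np.sign(X)))

n, reps = 50, 2000
grid = np.linspace(0, 2 * np.pi, 40 * n)  # fine enough to resolve most roots
for name, sampler in [("Gaussian", rng.standard_normal),
                      ("Rademacher", lambda m: rng.choice([-1.0, 1.0], size=m))]:
    roots = [count_real_roots(sampler, n, grid) for _ in range(reps)]
    print(f"{name:10s} mean = {np.mean(roots):7.2f}  var = {np.var(roots):7.2f}")
```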
More Like this
  1. A loss function measures the discrepancy between the true values (observations) and their estimated fits for a given instance of data. A loss function is said to be proper (unbiased, Fisher consistent) if the fits are defined over a unit simplex and the minimizer of the expected loss is the true underlying probability of the data. Typical examples are the zero-one loss, the quadratic loss, and the Bernoulli log-likelihood loss (log-loss). In this work we show that for binary classification problems, the divergence associated with smooth, proper, and convex loss functions is bounded from above by the Kullback-Leibler (KL) divergence, up to a multiplicative normalization constant. This implies that by minimizing the log-loss (associated with the KL divergence), we minimize an upper bound on the divergence of any loss function from this set. This property justifies the broad use of log-loss in regression, decision trees, deep neural networks, and many other applications. In addition, we show that the KL divergence bounds from above any separable Bregman divergence that is convex in its second argument (up to a multiplicative normalization constant). This result introduces a new set of divergence inequalities, similar to the well-known Pinsker inequality.
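As a concrete instance of the inequalities described in item 1 above, here is a minimal numeric sketch (my own, not taken from that paper) for the binary case: the divergence associated with the quadratic (Brier) loss is (p − q)², and the classical Pinsker inequality specializes for Bernoulli distributions to (p − q)² ≤ KL(p‖q)/2, so the KL divergence dominates the quadratic-loss divergence up to the multiplicative constant 1/2.

```python
import numpy as np

rng = np.random.default_rng(0)

def kl_binary(p, q):
    """KL divergence between Bernoulli(p) and Bernoulli(q), in nats."""
    return p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))

def sq_divergence(p, q):
    """Bregman divergence of the quadratic (Brier) loss: the excess expected
    squared loss of predicting q when the true probability is p."""
    return (p - q) ** 2

# Spot-check the Pinsker-type bound D_sq(p, q) <= KL(p, q) / 2.
p = rng.uniform(1e-3, 1 - 1e-3, size=100_000)
q = rng.uniform(1e-3, 1 - 1e-3, size=100_000)
assert np.all(sq_divergence(p, q) <= kl_binary(p, q) / 2 + 1e-12)
print("max ratio D_sq / KL:", np.max(sq_divergence(p, q) / kl_binary(p, q)))  # stays <= 0.5
```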
  2. Abstract: We study the universality of superconcentration for the free energy in the Sherrington–Kirkpatrick model. In [10], Chatterjee showed that when the system consists of N spins and Gaussian disorders, the variance of this quantity is superconcentrated by establishing an upper bound of order N/log N, in contrast to the bound of order N obtained from the Gaussian–Poincaré inequality. In this paper, we show that superconcentration indeed holds for any choice of centered disorders with finite third moment, where the upper bound is expressed in terms of an auxiliary nondecreasing function ψ that arises in the representation of the disorder as ψ(g) for g standard normal. Under an additional regularity assumption on ψ, we further show that the variance is of order at most N/log N.
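To make the object in item 2 concrete, here is a minimal sketch (an illustration under assumed conventions, not that paper's argument) that computes the SK free energy F_N = log Σ_σ exp(β H_N(σ)), with H_N(σ) = N^{-1/2} Σ_{i<j} g_ij σ_i σ_j, by exact enumeration over all 2^N spin configurations for small N, and estimates Var(F_N) for Gaussian versus Rademacher disorder. Rademacher variables are centered with finite third moment, so they fall under the universality result quoted above; the inverse temperature β = 1 and the small N are arbitrary choices for illustration.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)

def sk_free_energy(N, beta, disorder):
    """Free energy log Z_N of the SK model at inverse temperature beta;
    'disorder' is an N x N array of couplings, only entries with i < j are used."""
    spins = np.array(list(product([-1, 1], repeat=N)))  # all 2^N configurations
    iu = np.triu_indices(N, k=1)
    # H_N(sigma) = (1/sqrt(N)) * sum_{i<j} g_ij sigma_i sigma_j
    H = (spins[:, iu[0]] * spins[:, iu[1]]) @ disorder[iu] / np.sqrt(N)
    return np.log(np.sum(np.exp(beta * H)))

# Monte Carlo estimate of Var(F_N) for Gaussian vs. Rademacher disorder.
N, beta, reps = 10, 1.0, 200
F_gauss = [sk_free_energy(N, beta, rng.standard_normal((N, N))) for _ in range(reps)]
F_rade = [sk_free_energy(N, beta, rng.choice([-1.0, 1.0], size=(N, N))) for _ in range(reps)]
print("Var(F_N), Gaussian disorder:  ", np.var(F_gauss))
print("Var(F_N), Rademacher disorder:", np.var(F_rade))
```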