Title: On boundedness of singularities and minimal log discrepancies of Kollár components
Recent work in K-stability suggests that Kawamata log terminal (klt) singularities whose local volumes are bounded away from zero should be bounded up to special degeneration. We show that this is true in dimension three, or when the minimal log discrepancies of Kollár components are bounded from above. We conjecture that the minimal log discrepancies of Kollár components are always bounded from above, and verify this in dimension three when the local volumes are bounded away from zero. We also answer a question of Han, Liu, and Qi on the relation between log canonical thresholds and local volumes.
Award ID(s):
2240926 2234736
PAR ID:
10497727
Author(s) / Creator(s):
Publisher / Repository:
University Press, Inc.
Date Published:
Journal Name:
Journal of Algebraic Geometry
ISSN:
1056-3911
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. The ascending chain condition (ACC) conjecture for local volumes predicts that the set of local volumes of Kawamata log terminal (klt) singularities x ∈ (X, Δ) satisfies the ACC if the coefficients of Δ belong to a descending chain condition (DCC) set. In this paper, we prove the ACC conjecture for local volumes under the assumption that the ambient germ is analytically bounded. We introduce another related conjecture, which predicts the existence of δ-plt blow-ups of a klt singularity whose local volume has a positive lower bound. We show that the latter conjecture also holds when the ambient germ is analytically bounded. Moreover, we prove that both conjectures hold in dimension 2 as well as for 3-dimensional terminal singularities.
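     For orientation, here is a minimal recall of the quantity involved, following the standard definition of the normalized volume (the notation below is ours, not taken from the abstract): for an n-dimensional klt singularity x ∈ (X, Δ), the local volume is
     \[ \widehat{\mathrm{vol}}(x, X, \Delta) \;=\; \inf_{v} \, A_{X,\Delta}(v)^{n} \cdot \mathrm{vol}(v), \]
     where the infimum runs over real valuations v centered at x, A_{X,Δ}(v) denotes the log discrepancy of v with respect to (X, Δ), and vol(v) its volume.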
  2. We prove that every irreducible component of the coarse Kollár–Shepherd–Barron–Alexeev (KSBA) moduli space of stable log Calabi–Yau surfaces admits a finite cover by a projective toric variety. This verifies a conjecture of Hacking–Keel–Yu. The proof combines tools from log smooth deformation theory, the minimal model program, punctured log Gromov–Witten theory, and mirror symmetry.
  3. We consider the Ising perceptron model with N spins and M = αN patterns, with a general activation function U that is bounded above. For U bounded away from zero, or U a one-sided threshold function, it was shown by Talagrand (2000, 2011) that for small densities α, the free energy of the model converges in the large-N limit to the replica symmetric formula conjectured in the physics literature (Krauth–Mézard 1989; see also Gardner–Derrida 1988). We give a new proof of this result, which covers the more general class of all functions U that are bounded above and satisfy a certain variance bound. The proof uses the (first and second) moment method conditional on the approximate message passing iterates of the model. In order to deduce our main theorem, we also prove a new concentration result for the perceptron model in the case where U is not bounded away from zero.
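     To fix notation, a standard setup consistent with this description (the symbols g^a and Z_N are our own, and the precise form of the model is an assumption here): with i.i.d. Gaussian pattern vectors g^1, …, g^M in R^N, the free energy in question is
     \[ F_N \;=\; \frac{1}{N} \log Z_N, \qquad Z_N \;=\; \sum_{\sigma \in \{-1,+1\}^N} \; \prod_{a=1}^{M} U\!\left( \frac{\langle g^a, \sigma \rangle}{\sqrt{N}} \right), \qquad M = \alpha N, \]
     so that U weights each spin configuration σ according to its overlap with every pattern.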
  4. A loss function measures the discrepancy between the true values (observations) and their estimated fits for a given instance of data. A loss function is said to be proper (unbiased, Fisher consistent) if the fits are defined over a unit simplex and the minimizer of the expected loss is the true underlying probability of the data. Typical examples are the zero-one loss, the quadratic loss, and the Bernoulli log-likelihood loss (log-loss). In this work we show that for binary classification problems, the divergence associated with a smooth, proper, and convex loss function is bounded from above by the Kullback–Leibler (KL) divergence, up to a multiplicative normalization constant. This implies that by minimizing the log-loss (associated with the KL divergence), we minimize an upper bound on any loss function from this set. This property justifies the broad use of the log-loss in regression, decision trees, deep neural networks, and many other applications. In addition, we show that the KL divergence bounds from above any separable Bregman divergence that is convex in its second argument (up to a multiplicative normalization constant). This result introduces a new set of divergence inequalities, similar to the well-known Pinsker inequality.
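     Schematically (our notation; the divergence symbol and the constant are illustrative, not taken from the paper), the main bound has the form
     \[ D_{\ell}(p \,\|\, q) \;\le\; C_{\ell} \cdot D_{\mathrm{KL}}(p \,\|\, q), \]
     where D_ℓ is the divergence associated with a smooth, proper, convex binary loss ℓ and C_ℓ > 0 is a normalization constant depending only on ℓ, in the spirit of the Pinsker-type inequalities mentioned above.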
  5. The analytic aspects of multiplier ideals, log canonical thresholds, and log canonical centers played an important role in several papers of Demailly, including [DEL00, Dem01, DK01, Dem12, DP14, Dem16, CDM17, Dem18]. Log canonical centers are seminormal by [Amb03, Fuj17], and even Du Bois by [KK10, KK20]. This has important applications to birational geometry and moduli theory; see [KK10, KK20] or [Kol23, Sec.2.5]. We recall the concept of Du Bois singularities in Definition-Theorem 4. An unusual aspect is that this notion makes sense for complex spaces that have irreducible components of different dimensions; this is crucial even for the statement of our theorem. In this note we generalize the results of [KK10, KK20] by showing that if a closed subset is 'close enough' to being a union of log canonical centers, then it is Du Bois. The minimal discrepancy is a nonnegative rational number that measures the deviation from being a union of log canonical centers. The log canonical gap gives the precise notion of 'closeness.'