Title: Superconvexity of the heat kernel on hyperbolic space with applications to mean curvature flow
We prove a conjecture of Bernstein that the heat kernel on hyperbolic space of any dimension is superconvex in a suitable coordinate and, hence, that there is an analog of Huisken's monotonicity formula for mean curvature flow in hyperbolic space of all dimensions.
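For context, the Euclidean prototype referred to above is Huisken's monotonicity formula: the Gaussian-weighted area of an evolving hypersurface in $\mathbb{R}^{n+1}$ is non-increasing under mean curvature flow. A standard statement (a sketch of the well-known Euclidean case, with $\vec H$ the mean curvature vector; the hyperbolic-space weight itself is what the paper constructs) is:

```latex
% Backward heat kernel centered at (x_0, t_0):
\rho_{x_0,t_0}(x,t) = \bigl(4\pi(t_0-t)\bigr)^{-n/2}
  \exp\!\Bigl(-\tfrac{|x-x_0|^2}{4(t_0-t)}\Bigr), \qquad t < t_0.
% Huisken's monotonicity formula along the flow (M_t):
\frac{d}{dt}\int_{M_t} \rho_{x_0,t_0}\, d\mu_t
  = -\int_{M_t} \Bigl|\vec H + \tfrac{(x-x_0)^{\perp}}{2(t_0-t)}\Bigr|^2
    \rho_{x_0,t_0}\, d\mu_t \;\le\; 0.
```

The right-hand side vanishes exactly on self-shrinking solutions, which is what makes the formula so useful for singularity analysis.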
Award ID(s):
2018220 2018221
PAR ID:
10292336
Author(s) / Creator(s):
Date Published:
Journal Name:
Proceedings of the American Mathematical Society
Volume:
149
Issue:
743
ISSN:
0002-9939
Page Range / eLocation ID:
2161 to 2166
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like This
  1. Hyperbolic neural networks have been popular in the recent past due to their ability to represent hierarchical data sets effectively and efficiently. The challenge in developing these networks lies in the nonlinearity of the embedding space, namely the hyperbolic space. Hyperbolic space is a homogeneous Riemannian manifold of the Lorentz group, which is a semi-Riemannian manifold, i.e. a manifold equipped with an indefinite metric. Most existing methods (with some exceptions) use local linearization to define a variety of operations paralleling those used in traditional deep neural networks in Euclidean spaces. In this paper, we present a novel fully hyperbolic neural network which uses the concept of projections (embeddings) followed by an intrinsic aggregation and a nonlinearity, all within the hyperbolic space. The novelty here lies in the projection, which is designed to project data onto a lower-dimensional embedded hyperbolic space and hence leads to a nested hyperbolic space representation independently useful for dimensionality reduction. The main theoretical contribution is that the proposed embedding is proved to be isometric and equivariant under the Lorentz transformations, which are the natural isometric transformations in hyperbolic spaces. This projection is computationally efficient since it can be expressed by simple linear operations, and, due to the aforementioned equivariance property, it allows for weight sharing. The nested hyperbolic space representation is the core component of our network and, therefore, we first compare this representation, independent of the network, with other dimensionality reduction methods such as tangent PCA, principal geodesic analysis (PGA) and HoroPCA. Based on this equivariant embedding, we develop a novel fully hyperbolic graph convolutional neural network architecture to learn the parameters of the projection. Finally, we present experiments demonstrating comparative performance of our network on several publicly available data sets.
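The hyperboloid (Lorentz) model underlying abstracts like the one above can be made concrete in a few lines of NumPy. This is an illustrative sketch, not the paper's projection operator; the function names and the simple lift used here are assumptions for the example:

```python
import numpy as np

def lorentz_inner(x, y):
    """Minkowski (Lorentz) inner product <x, y>_L = -x0*y0 + x1*y1 + ... + xn*yn."""
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def lift_to_hyperboloid(u):
    """Lift a Euclidean vector u in R^n onto the hyperboloid model
    H^n = {x in R^{n+1} : <x, x>_L = -1, x0 > 0} by solving for x0."""
    x0 = np.sqrt(1.0 + np.dot(u, u))
    return np.concatenate(([x0], u))

x = lift_to_hyperboloid(np.array([0.3, -1.2, 0.5]))
# The lifted point satisfies the hyperboloid constraint <x, x>_L = -1.
```

Lorentz transformations (the isometries mentioned in the abstract) are exactly the linear maps preserving this inner product and the sheet x0 > 0.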
  2. We consider two manifestations of non-positive curvature: acylindrical actions (on hyperbolic spaces) and quasigeodesic stability. We study these properties for the class of hierarchically hyperbolic groups, which is a general framework for simultaneously studying many important families of groups, including mapping class groups, right-angled Coxeter groups, most 3-manifold groups, right-angled Artin groups, and many others. A group that admits an acylindrical action on a hyperbolic space may admit many such actions on different hyperbolic spaces. It is natural to try to develop an understanding of all such actions and to search for a “best” one. The set of all cobounded acylindrical actions on hyperbolic spaces admits a natural poset structure, and in this paper we prove that all hierarchically hyperbolic groups admit a unique action which is the largest in this poset. The action we construct is also universal in the sense that every element which acts loxodromically in some acylindrical action on a hyperbolic space does so in this one. Special cases of this result are themselves new and interesting. For instance, this is the first proof that right-angled Coxeter groups admit universal acylindrical actions. The notion of quasigeodesic stability of subgroups provides a natural analogue of quasiconvexity which can be considered outside the context of hyperbolic groups. In this paper, we provide a complete classification of stable subgroups of hierarchically hyperbolic groups, generalizing and extending results that are known in the context of mapping class groups and right-angled Artin groups. Along the way, we provide a characterization of contracting quasigeodesics; interestingly, in this generality the proof is much simpler than in the special cases where it was already known.
In the appendix, it is verified that any space satisfying the a priori weaker property of being an “almost hierarchically hyperbolic space” is actually a hierarchically hyperbolic space. The results of the appendix are used to streamline the proofs in the main text. 
  3.
    Abstract: The uniformization and hyperbolization transformations formulated by Bonk et al. in “Uniformizing Gromov Hyperbolic Spaces”, Astérisque, vol. 270 (2001), dealt with geometric properties of metric spaces. In this paper we consider metric measure spaces and construct a parallel transformation of measures under the uniformization and hyperbolization procedures. We show that if a locally compact roughly starlike Gromov hyperbolic space is equipped with a measure that is uniformly locally doubling and supports a uniformly local p-Poincaré inequality, then the transformed measure is globally doubling and supports a global p-Poincaré inequality on the corresponding uniformized space. In the opposite direction, we show that such global properties on bounded locally compact uniform spaces yield similar uniformly local properties for the transformed measures on the corresponding hyperbolized spaces. We use the above results on uniformization of measures to characterize when a Gromov hyperbolic space, equipped with a uniformly locally doubling measure supporting a uniformly local p-Poincaré inequality, carries nonconstant globally defined p-harmonic functions with finite p-energy. We also study some geometric properties of Gromov hyperbolic and uniform spaces. While the Cartesian product of two Gromov hyperbolic spaces need not be Gromov hyperbolic, we construct an indirect product of such spaces that does result in a Gromov hyperbolic space. This is done by first showing that the Cartesian product of two bounded uniform domains is a uniform domain.
  4. Graph convolutional neural networks (GCNs) embed nodes in a graph into Euclidean space, which has been shown to incur a large distortion when embedding real-world graphs with scale-free or hierarchical structure. Hyperbolic geometry offers an exciting alternative, as it enables embeddings with much smaller distortion. However, extending GCNs to hyperbolic geometry presents several unique challenges because it is not clear how to define neural network operations, such as feature transformation and aggregation, in hyperbolic space. Furthermore, since input features are often Euclidean, it is unclear how to transform the features into hyperbolic embeddings with the right amount of curvature. Here we propose Hyperbolic Graph Convolutional Neural Network (HGCN), the first inductive hyperbolic GCN that leverages both the expressiveness of GCNs and hyperbolic geometry to learn inductive node representations for hierarchical and scale-free graphs. We derive GCN operations in the hyperboloid model of hyperbolic space and map Euclidean input features to embeddings in hyperbolic spaces with different trainable curvature at each layer. Experiments demonstrate that HGCN learns embeddings that preserve hierarchical structure and lead to improved performance when compared to Euclidean analogs, even with very low dimensional embeddings: compared to state-of-the-art GCNs, HGCN achieves an error reduction of up to 63.1% in ROC AUC for link prediction and of up to 47.5% in F1 score for node classification, also improving state-of-the-art on the PubMed dataset.
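The map from Euclidean input features to the hyperboloid described above is typically realized by the exponential map at the model's origin. A minimal sketch of that standard construction (with a fixed curvature parameter c standing in for the trainable per-layer curvature, and illustrative function names; this is not the HGCN codebase):

```python
import numpy as np

def exp_map_origin(v, c=1.0):
    """Exponential map at the origin o = (1/sqrt(c), 0, ..., 0) of the
    hyperboloid of constant curvature -c: H = {x : <x, x>_L = -1/c}.
    A Euclidean feature v in R^n is treated as the tangent vector (0, v)."""
    sqrt_c = np.sqrt(c)
    norm_v = np.linalg.norm(v)
    o = np.zeros(len(v) + 1)
    o[0] = 1.0 / sqrt_c
    if norm_v == 0.0:
        return o  # the zero vector maps to the origin itself
    tangent = np.concatenate(([0.0], v))
    # Geodesic formula: cosh moves along o, sinh along the tangent direction.
    return (np.cosh(sqrt_c * norm_v) * o
            + np.sinh(sqrt_c * norm_v) * tangent / (sqrt_c * norm_v))

x = exp_map_origin(np.array([0.1, 0.2]), c=2.0)
# x lies on the hyperboloid: -x0^2 + x1^2 + x2^2 = -1/c.
```

Making c a learnable parameter per layer, as the abstract describes, lets each layer choose how "curved" its embedding space is.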
  5. Ranzato, M.; Beygelzimer, A.; Dauphin, Y.; Liang, P.S.; Wortman Vaughan, J. (Ed.)
    Hyperbolic space is particularly useful for embedding data with hierarchical structure; however, representing hyperbolic space with ordinary floating-point numbers greatly affects the performance due to its ineluctable numerical errors. Simply increasing the precision of floats fails to solve the problem and incurs a high computation cost for simulating greater-than-double-precision floats on hardware such as GPUs, which does not support them. In this paper, we propose a simple, feasible-on-GPUs, and easy-to-understand solution for numerically accurate learning on hyperbolic space. We do this with a new approach to represent hyperbolic space using multi-component floating-point (MCF) arithmetic in the Poincaré upper-half-space model. Theoretically and experimentally we show our model has small numerical error, and on embedding tasks across various datasets, models represented by multi-component floating-points gain more capacity and run significantly faster on GPUs than prior work.
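The basic building block of multi-component floating-point arithmetic is an error-free transformation such as Knuth's TwoSum, which recovers the rounding error of a floating-point addition exactly. A minimal Python sketch of that primitive (illustrative of the general MCF idea, not the paper's implementation):

```python
def two_sum(a, b):
    """Knuth's error-free TwoSum: returns (s, e) with s = fl(a + b)
    and a + b = s + e exactly, where e is the rounding error."""
    s = a + b
    bb = s - a
    e = (a - (s - bb)) + (b - bb)
    return s, e

s, e = two_sum(1.0, 1e-17)
# s == 1.0 (the tiny term is lost by the ordinary addition),
# e == 1e-17 (the lost term is recovered exactly as the error component).
```

An MCF number keeps the value as an unevaluated sum of several such components, so magnitudes that a single double would round away are preserved.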