Applications in data science, shape analysis, and object classification frequently require comparison of probability distributions defined on different ambient spaces. To accomplish this, one requires a notion of distance on a given class of metric measure spaces, that is, compact metric spaces endowed with probability measures. Such distances are typically defined as comparisons between metric measure space invariants, such as distance distributions (also referred to as shape distributions, distance histograms, or shape contexts in the literature). Generally, distances defined in terms of distance distributions are actually pseudometrics, in that they may vanish when comparing nonisomorphic spaces. The goal of this paper is to set up a formal framework for assessing the discriminative power of distance distributions, that is, the extent to which these pseudometrics fail to define proper metrics. We formulate several precise inverse problems in terms of these invariants and answer them in several categories of metric measure spaces: the category of plane curves, where we give a counterexample to the curve histogram conjecture of Brinkman and Olver; the categories of embedded and Riemannian manifolds, where we obtain sphere rigidity results; and the category of metric graphs, where we obtain a local injectivity result along the lines of classical work of Boutin and Kemper on point cloud configurations. The inverse problems are further contextualized by the introduction of a variant of the Gromov–Wasserstein distance on the space of metric measure spaces, which is inspired by the original Monge formulation of optimal transport.
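The distance-distribution invariant discussed above can be made concrete with a short numerical sketch. The code below is a minimal illustration, not the paper's construction; the helper names `distance_distribution` and `histogram_pseudometric` are our own. It builds the multiset of pairwise distances of a finite point cloud (the empirical distance distribution under the uniform measure) and compares two clouds via the 1-Wasserstein distance between those distributions.

```python
import numpy as np
from scipy.stats import wasserstein_distance

def distance_distribution(points):
    """All pairwise Euclidean distances of a finite point cloud,
    i.e. the empirical distance distribution under uniform measure."""
    diffs = points[:, None, :] - points[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))
    iu = np.triu_indices(len(points), k=1)   # each pair once
    return dists[iu]

def histogram_pseudometric(X, Y):
    """1-Wasserstein distance between the two distance distributions.
    Only a pseudometric: it vanishes on isometric clouds, but, as the
    abstract notes, it may also vanish on non-isomorphic spaces."""
    return wasserstein_distance(distance_distribution(X),
                                distance_distribution(Y))
```

Since rigid motions preserve all pairwise distances, a rotated and translated copy of a cloud is at pseudodistance zero from the original, which is exactly the invariance (and the source of the non-discrimination) that the inverse problems above quantify.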
Graphs in metric spaces appear in a wide range of data sets, and there is a large body of work focused on comparing, matching, or analyzing collections of graphs in different ambient spaces. In this survey, we provide an overview of a diverse collection of distance measures that can be defined on the set of finite graphs immersed (and in some cases, embedded) in a metric space. For each of the distance measures, we recall their definitions and investigate which of the properties of a metric they satisfy. Furthermore, we compare the distance measures based on these properties and discuss their computational complexity.
NSF-PAR ID: 10389784
Publisher / Repository: Springer Science + Business Media
Journal Name: La Matematica
Volume: 2
Issue: 1
ISSN: 2730-9657
Page Range / eLocation ID: pp. 197–222
Sponsoring Org: National Science Foundation

More Like this

Abstract To characterize the location (mean, median) of a set of graphs, one needs a notion of centrality adapted to metric spaces. A standard approach is to consider the Fréchet mean. In practice, computing the Fréchet mean for sets of large graphs presents many computational issues. In this work, we suggest a metric-independent method for computing the Fréchet mean of a set of graphs. We show that the proposed technique can be used to determine the Fréchet mean under the Hamming distance, or under a distance defined by the difference between the spectra of the adjacency matrices of the graphs.
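For very small graphs, the Fréchet mean under the Hamming distance can be computed by brute force directly from its definition, which makes the setup above concrete. The sketch below is illustrative only and is not the metric-independent method of the paper; it enumerates all simple graphs on n labeled vertices and returns a minimizer of the sum of squared Hamming distances to the sample.

```python
import itertools
import numpy as np

def hamming(A, B):
    """Hamming distance between two undirected simple graphs given
    as symmetric 0/1 adjacency matrices (each edge counted once)."""
    return int(np.sum(A != B)) // 2

def frechet_mean_graphs(graphs, n):
    """Brute-force Fréchet mean: minimize the sum of squared Hamming
    distances over all 2^(n choose 2) simple graphs on n labeled
    vertices. Feasible only for tiny n; the paper's method avoids
    this exhaustive search."""
    edges = list(itertools.combinations(range(n), 2))
    best, best_cost = None, float("inf")
    for bits in itertools.product([0, 1], repeat=len(edges)):
        A = np.zeros((n, n), dtype=int)
        for (i, j), b in zip(edges, bits):
            A[i, j] = A[j, i] = b
        cost = sum(hamming(A, G) ** 2 for G in graphs)
        if cost < best_cost:
            best, best_cost = A, cost
    return best
```

The exponential cost of this search is precisely the computational issue the abstract refers to, and it motivates methods that scale to sets of large graphs.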

Abstract There are many ways of measuring and modeling tail dependence in random vectors: from the general framework of multivariate regular variation and the flexible class of max-stable vectors down to simple and concise summary measures like the matrix of bivariate tail-dependence coefficients. This paper starts by providing a review of existing results from a unifying perspective, which highlights connections between extreme value theory and the theory of cuts and metrics. Our approach leads to some new findings in both areas, with some applications to current topics in risk management.
We begin by using the framework of multivariate regular variation to show that extremal coefficients, or equivalently, the higher-order tail-dependence coefficients of a random vector can simply be understood in terms of random exceedance sets, which allows us to extend the notion of Bernoulli compatibility. In the special but important case of bivariate tail dependence, we establish a correspondence between tail-dependence matrices and $L^1$- and $\ell_1$-embeddable finite metric spaces via the spectral distance, which is a metric on the space of jointly 1-Fréchet random variables. Namely, the coefficients of the cut decomposition of the spectral distance and of the Tawn–Molchanov max-stable model realizing the corresponding bivariate extremal dependence coincide. We show that line metrics are rigid, and if the spectral distance corresponds to a line metric, the higher-order tail dependence is determined by the bivariate tail-dependence matrix. Finally, the correspondence between $\ell_1$-embeddable metric spaces and tail-dependence matrices allows us to revisit the realizability problem, i.e. checking whether a given matrix is a valid tail-dependence matrix. We confirm a conjecture of Shyamalkumar and Tao (2020) that this problem is NP-complete.
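The bivariate tail-dependence coefficients summarized above can be estimated naively from data. The sketch below is a standard empirical estimator given as an illustration, not part of the paper's machinery, and the function name is our own: it approximates the upper tail-dependence coefficient, the limit as u tends to 1 of P(Y > F_Y^{-1}(u) | X > F_X^{-1}(u)), at a fixed finite threshold q.

```python
import numpy as np

def empirical_tail_dependence(x, y, q=0.95):
    """Naive empirical estimate of the upper tail-dependence
    coefficient at level q: the fraction of joint exceedances of
    the marginal q-quantiles, normalized by the marginal
    exceedance probability 1 - q."""
    xs, ys = np.quantile(x, q), np.quantile(y, q)
    joint = np.mean((x > xs) & (y > ys))
    return joint / (1.0 - q)
```

For comonotone data the estimate is close to 1, and for independent data it is close to 1 - q, i.e. near 0; a matrix of such pairwise estimates is an empirical version of the tail-dependence matrix whose realizability the abstract shows to be NP-complete to verify.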
We define a new family of similarity and distance measures on graphs, and explore their theoretical properties in comparison to conventional distance metrics. These measures are defined by the solution(s) to an optimization problem that attempts to find a map minimizing the discrepancy between two graph Laplacian exponential matrices, under norm-preserving and sparsity constraints. Variants of the distance metric are introduced to consider such optimized maps under sparsity constraints as well as fixed time-scaling between the two Laplacians. The objective function of this optimization is multimodal and has discontinuous slope, and is hence difficult for univariate optimizers to solve. We demonstrate a novel procedure for efficiently calculating these optima for two of our distance measure variants. We present numerical experiments demonstrating that (a) upper bounds of our distance metrics can be used to distinguish between lineages of related graphs; (b) our procedure is faster at finding the required optima, by as much as a factor of 10^3; and (c) the upper bounds satisfy the triangle inequality exactly under some assumptions and approximately under others. We also derive an upper bound for the distance between two graph products in terms of the distance between the two pairs of factors. Additionally, we present several possible applications, including the construction of infinite "graph limits" by means of Cauchy sequences of graphs related to one another by our distance measure.
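A minimal sketch of the kind of objective described above: with the correspondence map fixed to the identity (the paper's measures optimize over maps under norm-preserving and sparsity constraints, which this sketch omits), the discrepancy between two graph Laplacian exponentials reduces to a single Frobenius norm, computable with SciPy's `expm`. The function name and the time parameter `t` are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import expm

def laplacian(A):
    """Combinatorial graph Laplacian L = D - A of an adjacency matrix."""
    return np.diag(A.sum(axis=1)) - A

def heat_kernel_discrepancy(A1, A2, t=1.0):
    """Frobenius discrepancy between the two Laplacian exponentials
    exp(-t L) under the identity correspondence. The optimized-map
    versions of the paper would minimize this over admissible maps."""
    H1 = expm(-t * laplacian(A1))
    H2 = expm(-t * laplacian(A2))
    return np.linalg.norm(H1 - H2, "fro")
```

Even this fixed-map quantity separates structurally different graphs on the same vertex set, while the optimization over maps is what makes the full measures invariant to vertex relabeling.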

Abstract The uniformization and hyperbolization transformations formulated by Bonk et al. in “Uniformizing Gromov Hyperbolic Spaces”, Astérisque, vol. 270 (2001), dealt with geometric properties of metric spaces. In this paper we consider metric measure spaces and construct a parallel transformation of measures under the uniformization and hyperbolization procedures. We show that if a locally compact roughly starlike Gromov hyperbolic space is equipped with a measure that is uniformly locally doubling and supports a uniformly local p-Poincaré inequality, then the transformed measure is globally doubling and supports a global p-Poincaré inequality on the corresponding uniformized space. In the opposite direction, we show that such global properties on bounded locally compact uniform spaces yield similar uniformly local properties for the transformed measures on the corresponding hyperbolized spaces. We use the above results on uniformization of measures to characterize when a Gromov hyperbolic space, equipped with a uniformly locally doubling measure supporting a uniformly local p-Poincaré inequality, carries nonconstant globally defined p-harmonic functions with finite p-energy. We also study some geometric properties of Gromov hyperbolic and uniform spaces. While the Cartesian product of two Gromov hyperbolic spaces need not be Gromov hyperbolic, we construct an indirect product of such spaces that does result in a Gromov hyperbolic space. This is done by first showing that the Cartesian product of two bounded uniform domains is a uniform domain.