Abstract: We consider the tasks of representing, analysing and manipulating maps between shapes. We model maps as densities over the product manifold of the input shapes; these densities can be treated as scalar functions and therefore are manipulable using the language of signal processing on manifolds. Being a manifold itself, the product space endows the set of maps with a geometry of its own, which we exploit to define map operations in the spectral domain; we also derive relationships with other existing representations (soft maps and functional maps). To apply these ideas in practice, we discretize product manifolds and their Laplace–Beltrami operators, and we introduce localized spectral analysis of the product manifold as a novel tool for map processing. Our framework applies to maps defined between and across 2D and 3D shapes without requiring special adjustment, and it can be implemented efficiently with simple operations on sparse matrices.
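To make the spectral encoding alluded to above concrete, here is a minimal, hypothetical NumPy/SciPy sketch: a point-to-point map, viewed as a density on the product of two discretized shapes, is expanded in the product eigenbasis, whose eigenfunctions are the tensor products of the shapes' eigenfunctions and whose eigenvalues are sums of the two spectra. The function name, the use of plain graph Laplacians in place of properly discretized Laplace–Beltrami operators, and the uniform-vertex-area assumption are illustrative choices, not the paper's implementation.

```python
# Toy sketch (not the authors' pipeline): spectral coefficients of a map density on M x N.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

def spectral_map_coefficients(L_M, L_N, correspondences, k=30):
    """Expand a map density on M x N in the product eigenbasis.

    The Laplacian of the product manifold has eigenfunctions phi_i (x) psi_j with
    eigenvalues lam_i + mu_j, so a density P over M x N is encoded by the
    coefficient matrix C = Phi_M^T P Phi_N (uniform vertex areas assumed).
    """
    lam, Phi_M = eigsh(sp.csr_matrix(L_M, dtype=float), k=k, sigma=-1e-6)  # low-frequency basis on M
    mu, Phi_N = eigsh(sp.csr_matrix(L_N, dtype=float), k=k, sigma=-1e-6)   # low-frequency basis on N
    n_M, n_N = L_M.shape[0], L_N.shape[0]
    rows, cols = zip(*correspondences)                      # pairs (vertex on M, vertex on N)
    P = sp.coo_matrix((np.ones(len(rows)), (rows, cols)), shape=(n_M, n_N))
    C = Phi_M.T @ (P @ Phi_N)                               # spectral coefficients of the map density
    product_spectrum = lam[:, None] + mu[None, :]           # eigenvalues of the product Laplacian
    return C, product_spectrum
```

Keeping only low-frequency coefficients (small lam_i + mu_j) is one sparse, band-limited way to represent and smooth the map, in the spirit of the localized spectral analysis described above.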
Shape and structure preserving differential privacy.
It is common for data structures such as images and shapes of 2D objects to be represented as points on a manifold. The utility of a mechanism to produce sanitized differentially private estimates from such data is intimately linked to how compatible it is with the underlying structure and geometry of the space. In particular, as recently shown, the utility of the Laplace mechanism on a positively curved manifold, such as Kendall’s 2D shape space, is significantly influenced by the curvature. Focusing on the problem of sanitizing the Fréchet mean of a sample of points on a manifold, we exploit the characterization of the mean as the minimizer of an objective function comprising the sum of squared distances and develop a K-norm gradient mechanism on Riemannian manifolds that favors values producing gradients close to the zero of the objective function. For the case of positively curved manifolds, we describe how using the gradient of the squared distance function offers better control over sensitivity than the Laplace mechanism, and demonstrate this numerically on a dataset of shapes of corpora callosa. Further illustrations of the mechanism’s utility on a sphere and the manifold of symmetric positive definite matrices are also presented.
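As a hedged illustration of the objects the abstract refers to, the sketch below computes a Fréchet mean on the unit sphere by Riemannian gradient descent on the sum-of-squared-distances objective; its Riemannian gradient is the quantity a K-norm gradient mechanism would drive toward zero. The helper names and step sizes are illustrative assumptions, not the authors' code.

```python
# Hypothetical sketch on the unit sphere S^2: the Frechet mean minimizes
# F(p) = (1/n) * sum_i d(p, x_i)^2 with Riemannian gradient
# grad F(p) = -(2/n) * sum_i log_p(x_i).  A K-norm gradient mechanism would favor
# outputs theta with small ||grad F(theta)||, e.g. sampling with density
# proportional to exp(-eps * ||grad F(theta)|| / (2 * sensitivity)).
import numpy as np

def log_map_sphere(p, x):
    """Log map on the unit sphere: tangent vector at p pointing toward x, with length d(p, x)."""
    d = np.arccos(np.clip(p @ x, -1.0, 1.0))          # geodesic distance
    v = x - (p @ x) * p                                # project x onto the tangent plane at p
    n = np.linalg.norm(v)
    return np.zeros_like(p) if n < 1e-12 else (d / n) * v

def exp_map_sphere(p, v):
    """Exponential map on the unit sphere."""
    n = np.linalg.norm(v)
    return p if n < 1e-12 else np.cos(n) * p + np.sin(n) * (v / n)

def frechet_gradient(p, X):
    """Riemannian gradient of the mean squared-distance objective at p."""
    return -2.0 * np.mean([log_map_sphere(p, x) for x in X], axis=0)

def frechet_mean(X, steps=100, lr=0.25):
    """Riemannian gradient descent for the Frechet mean of points X on the sphere."""
    p = X[0] / np.linalg.norm(X[0])
    for _ in range(steps):
        p = exp_map_sphere(p, -lr * frechet_gradient(p, X))
    return p
```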
- Award ID(s): 1853209
- PAR ID: 10500042
- Publisher / Repository: Advances in Neural Information Processing Systems
- Date Published:
- Journal Name: Advances in Neural Information Processing Systems
- Format(s): Medium: X
- Location: https://openreview.net/forum?id=7WvNQz9SWH2
- Sponsoring Org: National Science Foundation
More Like this
In this paper, the authors propose a new dimension reduction method for level-set-based topology optimization of conforming thermal structures on free-form surfaces. Both the Hamilton-Jacobi equation and the Laplace equation, the two governing PDEs for boundary evolution and thermal conduction, are transformed from the 3D manifold to a 2D rectangular domain using conformal parameterization. The new method significantly simplifies the computation of topology optimization on a manifold without loss of accuracy. This is possible because, under a conformal mapping, the covariant derivatives on the manifold can be represented by the Euclidean gradient operators multiplied by a scalar. The original governing equations defined on the 3D manifold can therefore be properly modified and solved on a 2D domain. The objective function, constraint, and velocity field are likewise computed by FEA on the 2D parameter domain in the correspondingly modified form. In this sense, a 3D topology optimization problem is solved equivalently on the 2D parameter domain. This reduction in dimension greatly reduces the computing cost and complexity of the algorithm. The proposed concept is demonstrated through two examples of heat conduction on manifolds.
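The core reduction can be illustrated with a toy finite-difference sketch: under a conformal parameterization with factor lam(u, v), surface differential operators become planar ones scaled by powers of lam, so a surface heat problem can be relaxed entirely on the 2D parameter grid. The grid, the explicit relaxation scheme, and the zero Dirichlet boundary are illustrative assumptions; the paper itself uses FEA together with a level-set Hamilton-Jacobi update.

```python
# Illustrative sketch (not the paper's solver): with metric g = lam^2 (du^2 + dv^2),
# Laplace-Beltrami(f) = (1/lam^2) * planar_laplacian(f), so the surface problem is
# solved on the rectangular parameter domain.
import numpy as np

def planar_laplacian(f, h):
    """Standard 5-point Laplacian on a rectangular parameter grid (interior points only)."""
    lap = np.zeros_like(f)
    lap[1:-1, 1:-1] = (f[2:, 1:-1] + f[:-2, 1:-1] + f[1:-1, 2:] + f[1:-1, :-2]
                       - 4.0 * f[1:-1, 1:-1]) / h**2
    return lap

def solve_surface_heat(lam, source, h=1e-2, iters=5000, dt=None):
    """Relax (1/lam^2) * planar_laplacian(T) + source = 0 toward steady state."""
    dt = dt or 0.2 * h**2 * lam.min()**2                  # heuristic stable step for explicit relaxation
    T = np.zeros_like(source)
    for _ in range(iters):
        T += dt * (planar_laplacian(T, h) / lam**2 + source)
        T[0, :] = T[-1, :] = T[:, 0] = T[:, -1] = 0.0     # Dirichlet boundary on the parameter domain
    return T
```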
We prove that the Novikov conjecture holds for any discrete group admitting an isometric and metrically proper action on an admissible Hilbert-Hadamard space. Admissible Hilbert-Hadamard spaces are a class of (possibly infinite-dimensional) non-positively curved metric spaces that contain dense sequences of closed convex subsets isometric to Riemannian manifolds. Examples of admissible Hilbert-Hadamard spaces include Hilbert spaces, certain simply connected and non-positively curved Riemannian-Hilbertian manifolds, and infinite-dimensional symmetric spaces. Thus our main theorem can be considered an infinite-dimensional analogue of Kasparov’s theorem on the Novikov conjecture for groups acting properly and isometrically on complete, simply connected and non-positively curved manifolds. As a consequence, we show that the Novikov conjecture holds for geometrically discrete subgroups of the group of volume-preserving diffeomorphisms of a closed smooth manifold. This result is inspired by Connes’ theorem that the Novikov conjecture holds for higher signatures associated to the Gelfand-Fuchs classes of groups of diffeomorphisms.
We prove an obstruction at the level of rational cohomology to the existence of positively curved metrics with large symmetry rank. The symmetry rank bound is logarithmic in the dimension of the manifold. As one application, we provide evidence for a generalized conjecture of H. Hopf, which states that no symmetric space of rank at least two admits a metric with positive curvature. Other applications concern product manifolds, connected sums, and manifolds with nontrivial fundamental group.
We propose an extrinsic Bayesian optimization (eBO) framework for general optimization problems on manifolds. Bayesian optimization algorithms build a surrogate of the objective function by employing Gaussian processes and exploit the uncertainty in that surrogate through an acquisition function. This acquisition function represents the probability of improvement based on the kernel of the Gaussian process, which guides the search in the optimization process. The critical challenge in designing Bayesian optimization algorithms on manifolds lies in the difficulty of constructing valid covariance kernels for Gaussian processes on general manifolds. Our approach is to employ extrinsic Gaussian processes: first embed the manifold into a higher-dimensional Euclidean space via an equivariant embedding, and then construct a valid covariance kernel on the image manifold after the embedding. This leads to efficient and scalable algorithms for optimization over complex manifolds. A simulation study and real data analyses demonstrate the utility of the eBO framework on various optimization problems over manifolds such as the sphere, the Grassmannian, and the manifold of positive definite matrices.