The complex Green operator $$\mathcal{G}$$ on CR manifolds is the inverse of the Kohn Laplacian $$\square_b$$ on the orthogonal complement of its kernel. In this note, we prove Schatten and Sobolev estimates for $$\mathcal{G}$$ on the unit sphere $$\mathbb{S}^{2n-1}\subset \mathbb{C}^n$$. We obtain these estimates by using the spectrum of $$\square_b$$ and the asymptotics of the eigenvalues of the usual Laplace-Beltrami operator.
Radial Basis Approximation of Tensor Fields on Manifolds: From Operator Estimation to Manifold Learning
In this paper, we study the Radial Basis Function (RBF) approximation of differential operators on smooth tensor fields defined on closed Riemannian submanifolds of Euclidean space, identified by randomly sampled point cloud data. The formulation in this paper leverages a fundamental fact: the covariant derivative on a submanifold is the projection of the directional derivative in the ambient Euclidean space onto the tangent space of the submanifold. To differentiate a test function (or vector field) on the submanifold with respect to the Euclidean metric, RBF interpolation is applied to extend the function (or vector field) into the ambient Euclidean space. When the manifolds are unknown, we develop an improved second-order local SVD technique for estimating local tangent spaces on the manifold. When the classical pointwise non-symmetric RBF formulation is used to solve Laplacian eigenvalue problems, we find that while accurate estimation of the leading spectra can be obtained with large enough data, such an approximation often produces irrelevant complex-valued spectra (or pollution), even though the true spectra are real-valued and positive. To avoid this issue, we introduce a symmetric RBF discrete approximation of the Laplacians induced by a weak formulation on appropriate Hilbert spaces. Unlike the non-symmetric approximation, this formulation guarantees non-negative real-valued spectra and the orthogonality of the eigenvectors. Theoretically, we establish the convergence of the eigenpairs of both the Laplace-Beltrami operator and the Bochner Laplacian for the symmetric formulation in the limit of large data, with convergence rates. Numerically, we provide supporting examples for approximations of the Laplace-Beltrami operator and various vector Laplacians, including the Bochner, Hodge, and Lichnerowicz Laplacians.
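The tangent-space estimation step can be illustrated with a plain first-order local SVD (the paper's improved second-order variant corrects for curvature, which this sketch does not); the function name, the neighborhood size `k`, and the toy circle data are illustrative choices, not the paper's:

```python
import numpy as np

def local_svd_tangent(X, d, k=10):
    """Estimate a d-dimensional tangent basis at each point of a point
    cloud X (N x m) via local SVD of the k nearest neighbors."""
    N, m = X.shape
    bases = np.empty((N, m, d))
    for i in range(N):
        # k nearest neighbors in the ambient Euclidean metric (brute force)
        dist = np.linalg.norm(X - X[i], axis=1)
        nbr = X[np.argsort(dist)[1:k + 1]]
        # center the neighborhood; the leading right singular vectors
        # span the estimated tangent space
        _, _, Vt = np.linalg.svd(nbr - nbr.mean(axis=0), full_matrices=False)
        bases[i] = Vt[:d].T
    return bases

# usage: points sampled on the unit circle in R^2 (intrinsic dimension d = 1)
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
X = np.c_[np.cos(t), np.sin(t)]
B = local_svd_tangent(X, d=1, k=8)
# at the point (1, 0) the true tangent direction is (0, ±1)
print(np.abs(B[0][:, 0]))  # first component ≈ 0, second ≈ 1
```

The brute-force neighbor search is O(N^2); for large point clouds one would swap in a k-d tree.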
- Award ID(s):
- 2207328
- PAR ID:
- 10522931
- Editor(s):
- Mahoney, Michael
- Publisher / Repository:
- Microtome Publishing
- Date Published:
- Journal Name:
- Journal of Machine Learning Research
- ISSN:
- 1532-4435
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
Abstract: We present a mixed finite element method for approximating a fourth-order elliptic partial differential equation (PDE), the Kirchhoff plate equation, on a surface embedded in $${\mathbb {R}}^{3}$$, with or without boundary. Error estimates are given in mesh-dependent norms that account for the surface approximation and the approximation of the surface PDE. The method is built on the classic Hellan–Herrmann–Johnson method (for flat domains), and convergence is established for $$C^{k+1}$$ surfaces, with degree $$k$$ (Lagrangian, parametrically curved) approximation of the surface, for any $$k \geqslant 1$$. Mixed boundary conditions are allowed, including clamped, simply-supported and free conditions; if free conditions are present then the surface must be at least $$C^{2,1}$$. The framework uses tools from differential geometry and is directly related to the seminal work of Dziuk, G. (1988) Finite elements for the Beltrami operator on arbitrary surfaces. In: Partial Differential Equations and Calculus of Variations, vol. 1357 (S. Hildebrandt & R. Leis, eds). Berlin, Heidelberg: Springer, pp. 142–155, for approximating the Laplace–Beltrami equation. The analysis here is the first to handle the full surface Hessian operator directly. Numerical examples are given on nontrivial surfaces that demonstrate our convergence estimates. In addition, we show how the surface biharmonic equation can be solved with this method.
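A one-dimensional analogue conveys the order-reduction idea behind mixed formulations: the fourth-order problem is split into two second-order solves via an auxiliary moment variable. This sketch uses finite differences and simply-supported conditions, not the authors' Hellan–Herrmann–Johnson surface method; the grid size and right-hand side are illustrative choices:

```python
import numpy as np

def poisson(g, h):
    """Second-order finite-difference solve of u'' = g on a uniform
    interior grid, with u = 0 at both endpoints."""
    n = g.size
    A = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
         + np.diag(np.ones(n - 1), -1)) / h ** 2
    return np.linalg.solve(A, g)

# mixed solve of the 1D plate analogue u'''' = f on [0, 1], simply
# supported: introduce the moment m = u'' and solve two second-order
# problems in sequence
n = 199
h = 1.0 / (n + 1)
x = np.linspace(h, 1 - h, n)
f = np.ones(n)
m = poisson(f, h)   # m'' = f,  m(0) = m(1) = 0
u = poisson(m, h)   # u'' = m,  u(0) = u(1) = 0
# exact solution is (x^4 - 2x^3 + x)/24, whose midpoint value is ~0.01302
print(round(u[n // 2], 5))  # → 0.01302
```

For clamped conditions the split is no longer this clean (both boundary conditions fall on u), which is part of why genuinely mixed methods such as HHJ are needed.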
-
Carreira-Perpinan, Miguel (Ed.): In this work we study statistical properties of graph-based algorithms for multi-manifold clustering (MMC). In MMC the goal is to retrieve the multi-manifold structure underlying a given Euclidean data set when the data are assumed to be sampled from a distribution on a union of manifolds M = M1 ∪ · · · ∪ MN that may intersect with each other and that may have different dimensions. We investigate sufficient conditions that similarity graphs on data sets must satisfy in order for their corresponding graph Laplacians to capture the right geometric information to solve the MMC problem. Precisely, we provide high-probability error bounds for the spectral approximation of a tensorized Laplacian on M with a suitable graph Laplacian built from the observations; the recovered tensorized Laplacian contains all geometric information of all the individual underlying manifolds. We provide an example of a family of similarity graphs, which we call annular proximity graphs with angle constraints, satisfying these sufficient conditions. We contrast our family of graphs with other constructions in the literature based on the alignment of tangent planes. Extensive numerical experiments expand the insights that our theory provides on the MMC problem.
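A minimal sketch of how a proximity-graph Laplacian encodes geometric structure, assuming well-separated manifolds (a plain ε-graph, not the paper's annular proximity graphs with angle constraints, which are needed near intersections); the data and threshold are illustrative choices:

```python
import numpy as np

def graph_laplacian(X, eps):
    """Unnormalized graph Laplacian L = D - W of an epsilon-proximity
    graph on the rows of X."""
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = (D2 < eps ** 2).astype(float)
    np.fill_diagonal(W, 0.0)
    return np.diag(W.sum(axis=1)) - W

# two disjoint circles: the multiplicity of the zero eigenvalue of L
# equals the number of connected components (here, one per manifold)
t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
X = np.vstack([np.c_[np.cos(t), np.sin(t)],
               np.c_[3 + np.cos(t), np.sin(t)]])
L = graph_laplacian(X, eps=0.3)
evals = np.linalg.eigvalsh(L)  # sorted ascending
print((evals < 1e-8).sum())  # → 2
```

The corresponding zero-eigenvalue eigenvectors are indicators of the components, which is the basis of spectral clustering; intersecting manifolds break this picture and motivate the graph constructions studied in the paper.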
-
We establish the convergence of the forward-backward splitting algorithm based on Bregman distances for the sum of two monotone operators in reflexive Banach spaces. Even in Euclidean spaces, the convergence of this algorithm has so far been proved only in the case of minimization problems. The proposed framework features Bregman distances that vary over the iterations and a novel assumption on the single-valued operator that captures various properties scattered in the literature. In the minimization setting, we obtain rates that are sharper than existing ones.
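In the minimization setting, a classical special case is entropic mirror descent: forward-backward splitting with the Kullback-Leibler Bregman distance, where g is the indicator of the probability simplex and the backward (proximal) step reduces to a renormalization. The step size, iteration count, and quadratic objective below are illustrative choices, not taken from the paper:

```python
import numpy as np

def bregman_forward_backward(grad_f, x0, step=0.5, iters=1000):
    """Entropic mirror descent on the probability simplex: a forward
    (explicit) gradient step on f in mirror coordinates, then a backward
    step on the simplex indicator, which is just renormalization."""
    x = x0.copy()
    for _ in range(iters):
        x = x * np.exp(-step * grad_f(x))  # multiplicative update
        x /= x.sum()                       # Bregman projection onto simplex
    return x

# minimize f(x) = 0.5 * ||x - c||^2 over the simplex; since c already
# lies on the simplex, the minimizer is c itself
c = np.array([0.7, 0.2, 0.1])
x = bregman_forward_backward(lambda x: x - c, np.ones(3) / 3)
print(np.round(x, 3))  # → [0.7 0.2 0.1]
```

The multiplicative form keeps the iterates strictly positive and on the simplex at every step, which is exactly what the Bregman geometry buys over the Euclidean projection.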
-
Supervised operator learning centers on the use of training data, in the form of input-output pairs, to estimate maps between infinite-dimensional spaces. It is emerging as a powerful tool to complement traditional scientific computing, which may often be framed in terms of operators mapping between spaces of functions. Building on the classical random features methodology for scalar regression, this paper introduces the function-valued random features method. This leads to a supervised operator learning architecture that is practical for nonlinear problems yet is structured enough to facilitate efficient training through the optimization of a convex, quadratic cost. Due to the quadratic structure, the trained model is equipped with convergence guarantees and error and complexity bounds, properties that are not readily available for most other operator learning architectures. At its core, the proposed approach builds a linear combination of random operators. This turns out to be a low-rank approximation of an operator-valued kernel ridge regression algorithm, and hence the method also has strong connections to Gaussian process regression. The paper designs function-valued random features that are tailored to the structure of two nonlinear operator learning benchmark problems arising from parametric partial differential equations. Numerical results demonstrate the scalability, discretization invariance, and transferability of the function-valued random features method.
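The classical scalar random features methodology that this builds on can be sketched as follows. This is plain random Fourier features ridge regression, not the function-valued extension; the frequency scale, feature count, ridge parameter, and target function are all illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_fourier_features(x, W, b):
    """Random Fourier feature map whose inner products approximate a
    Gaussian kernel (Rahimi-Recht construction)."""
    return np.sqrt(2.0 / W.shape[0]) * np.cos(x @ W.T + b)

# fit f(x) = sin(3x) on [0, 1]; because the model is linear in the
# random features, the ridge cost is convex and quadratic, and training
# reduces to one linear solve
n, m = 200, 300                      # samples, random features
x = rng.uniform(0.0, 1.0, (n, 1))
y = np.sin(3 * x[:, 0])
W = rng.normal(0.0, 5.0, (m, 1))    # frequencies ~ N(0, sigma^2)
b = rng.uniform(0.0, 2 * np.pi, m)  # random phases
Phi = random_fourier_features(x, W, b)
theta = np.linalg.solve(Phi.T @ Phi + 1e-6 * np.eye(m), Phi.T @ y)

x_test = np.array([[0.5]])
pred = random_fourier_features(x_test, W, b) @ theta
print(float(pred))  # close to the true value sin(1.5)
```

In the function-valued setting the weights become operators and the outputs functions, but the training step remains a convex quadratic problem, which is the source of the guarantees described above.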