Title: Distance-preserving manifold denoising for data-driven mechanics
This article introduces an isometric manifold embedding data-driven paradigm designed to enable model-free simulations with noisy data sampled from a constitutive manifold. The proposed data-driven approach iterates between a global optimization problem that seeks admissible solutions for the balance principle and a local optimization problem that finds the closest-point projection in the Euclidean space that isometrically embeds a nonlinear constitutive manifold. To de-noise the database, a geometric autoencoder is introduced such that the encoder first learns an approximate embedding that maps the underlying low-dimensional structure of the high-dimensional constitutive manifold onto a flattened manifold with less curvature. We then obtain the noise-free constitutive responses by projecting the data onto a completely flat, de-noised latent space, under the assumption that the noise and the underlying constitutive signal are orthogonal to each other. Consequently, a projection from the conservative manifold onto this de-noised constitutive latent space enables us to complete the local optimization step of the data-driven paradigm. Finally, to decode the data expressed in the latent space without reintroducing noise, we impose a set of isometry constraints while training the autoencoder such that the nonlinear mapping from the latent space to the reconstructed constitutive manifold is distance-preserving. Numerical examples are used both to validate the implementation and to demonstrate the accuracy, robustness, and limitations of the proposed paradigm.
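As a rough illustration of the distance-preserving training idea described above, the following is a minimal sketch, assuming a PyTorch setup, of an autoencoder regularized so that pairwise distances are preserved between the input space and the latent space. The class name IsometricAE, the placeholder data tensor Z, and all hyperparameters are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): an autoencoder trained with a
# pairwise distance-preservation (isometry) penalty so that projecting noisy
# data in the flat latent space does not distort constitutive distances.
import torch
import torch.nn as nn

class IsometricAE(nn.Module):
    def __init__(self, dim_in=6, dim_latent=2, width=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(dim_in, width), nn.Tanh(), nn.Linear(width, dim_latent))
        self.decoder = nn.Sequential(
            nn.Linear(dim_latent, width), nn.Tanh(), nn.Linear(width, dim_in))

    def forward(self, z):
        h = self.encoder(z)
        return h, self.decoder(h)

def isometry_penalty(x, h):
    # Penalize mismatch between pairwise distances in the ambient space and
    # in the latent space (a discrete form of the distance-preserving constraint).
    return ((torch.cdist(x, x) - torch.cdist(h, h)) ** 2).mean()

model = IsometricAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
Z = torch.randn(256, 6)  # placeholder for sampled (noisy) constitutive data
for _ in range(2000):
    h, z_hat = model(Z)
    loss = ((z_hat - Z) ** 2).mean() + isometry_penalty(Z, h)
    opt.zero_grad(); loss.backward(); opt.step()
```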
Award ID(s):
1846875
NSF-PAR ID:
10487110
Author(s) / Creator(s):
;
Publisher / Repository:
ScienceDirect
Date Published:
Journal Name:
Computer Methods in Applied Mechanics and Engineering
Volume:
405
Issue:
C
ISSN:
0045-7825
Page Range / eLocation ID:
115857
Subject(s) / Keyword(s):
["Data-driven mechanics","Manifold","de-noising","Geodesic","Constitutive manifold","Autoencoder","Isometry"]
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Generative adversarial networks (GANs) have attracted huge attention due to their capability to generate visually realistic images. However, most existing models suffer from mode collapse or mode mixture. In this work, we give a theoretical explanation of both problems using Figalli's regularity theory of optimal transportation maps. In essence, the generator computes the transportation maps between the white-noise distribution and the data distribution, and these maps are in general discontinuous. However, DNNs can only represent continuous maps. This intrinsic conflict induces mode collapse and mode mixture. To tackle both problems, we explicitly separate the manifold embedding from the optimal transportation: the first part is carried out by an autoencoder that maps the images onto the latent space; the second part is accomplished with a GPU-based convex optimization that finds the discontinuous transportation maps. Composing the extended OT map and the decoder, we can then generate new images from white noise. This AE-OT model avoids representing discontinuous maps with DNNs and therefore effectively prevents mode collapse and mode mixture.
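As a rough sketch of the semi-discrete optimal transport step (not the paper's GPU implementation), the following fits a dual potential h by stochastic ascent so that each fixed latent code receives equal mass from the noise source; fit_ot_potential and all parameters are hypothetical names.

```python
# Illustrative sketch: semi-discrete optimal transport from a Gaussian noise
# source to fixed latent codes via stochastic ascent on the dual potential h.
import numpy as np

def fit_ot_potential(codes, n_iter=5000, batch=1024, lr=0.1):
    n, d = codes.shape
    h = np.zeros(n)                    # one dual variable per latent code
    target = np.full(n, 1.0 / n)       # each code should receive equal mass
    for _ in range(n_iter):
        z = np.random.randn(batch, d)  # samples from the noise source
        # each noise sample is sent to the code maximizing <z, y_i> + h_i
        idx = (z @ codes.T + h).argmax(axis=1)
        hist = np.bincount(idx, minlength=n) / batch
        h += lr * (target - hist)      # raise potentials of under-served codes
    return h

# Generation (schematic): decode codes[(z @ codes.T + h).argmax()] for fresh z.
```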
  2. Most applications of multispectral imaging are explicitly or implicitly dependent on the dimensionality and topology of the spectral mixing space. Mixing space characterization refers to the identification of salient properties of the set of pixel reflectance spectra comprising an image (or compilation of images). The underlying premise is that this set of spectra may be described as a low dimensional manifold embedded in a high dimensional vector space. Traditional mixing space characterization uses the linear dimensionality reduction offered by Principal Component Analysis to find projections of pixel spectra onto orthogonal linear subspaces, prioritized by variance. Here, we consider the potential for recent advances in nonlinear dimensionality reduction (specifically, manifold learning) to contribute additional useful information for multispectral mixing space characterization. We integrate linear and nonlinear methods through a novel approach called Joint Characterization (JC). JC comprises two components. First, spectral mixture analysis (SMA) linearly projects the high-dimensional reflectance vectors onto a 2D subspace comprising the primary mixing continuum of substrates, vegetation, and dark features (e.g., shadow and water). Second, manifold learning nonlinearly maps the high-dimensional reflectance vectors into a low-dimensional embedding space while preserving manifold topology. The SMA output is physically interpretable in terms of material abundances. The manifold learning output is not generally physically interpretable, but more faithfully preserves high dimensional connectivity and clustering within the mixing space. Used together, the strengths of SMA may compensate for the limitations of manifold learning, and vice versa. Here, we illustrate JC through application to thematic compilations of 90 Sentinel-2 reflectance images selected from a diverse set of biomes and land cover categories. Specifically, we use globally standardized Substrate, Vegetation, and Dark (S, V, D) endmembers (EMs) for SMA, and Uniform Manifold Approximation and Projection (UMAP) for manifold learning. The value of each (SVD and UMAP) model is illustrated, both separately and jointly. JC is shown to successfully characterize both continuous gradations (spectral mixing trends) and discrete clusters (land cover class distinctions) within the spectral mixing space of each land cover category. These features are not clearly identifiable from SVD fractions alone, and not physically interpretable from UMAP alone. Implications are discussed for the design of models which can reliably extract and explainably use high-dimensional spectral information in spatially mixed pixels, a principal challenge in optical remote sensing.
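A minimal sketch of the two JC components, assuming refl is an (n_pixels, n_bands) reflectance array, E is an (n_bands, 3) matrix of Substrate/Vegetation/Dark endmember spectra, and the umap-learn package is available; both arrays and the function names are placeholders.

```python
# Illustrative sketch of Joint Characterization: linear unmixing (SMA) plus a
# nonlinear UMAP embedding, examined side by side. refl and E are placeholders.
import numpy as np
from scipy.optimize import nnls
import umap  # umap-learn, assumed installed

def sma_fractions(refl, E):
    # Nonnegative least-squares unmixing of each pixel spectrum onto the
    # Substrate, Vegetation, and Dark endmembers (columns of E).
    return np.stack([nnls(E, pix)[0] for pix in refl])

def joint_characterization(refl, E):
    fractions = sma_fractions(refl, E)                         # physically interpretable
    embedding = umap.UMAP(n_components=2).fit_transform(refl)  # topology-preserving
    return fractions, embedding
```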

     
  3. De Lorenzis, Laura; Papadrakakis, Manolis; Zohdi, Tarek I. (Eds.)
    This paper presents a graph-manifold iterative algorithm to predict the configurations of geometrically exact shells subjected to external loading. The finite element solutions are first stored in a weighted graph where each graph node stores the nodal displacement and nodal director. This collection of solutions is embedded onto a low-dimensional latent space through a graph isomorphism encoder. This graph embedding step reduces the dimensionality of the nonlinear data and makes it easier for the response surface to be constructed. The decoder, in turn, converts an element in the latent space back to a weighted graph that represents a finite element solution. As such, the deformed configuration of the shell can be obtained by decoding the predictions in the latent space without running extra finite element simulations. For engineering applications where the shell is often subjected to concentrated loads or a local portion of the shell structure is of particular interest, we use the solutions stored in a graph to reconstruct a smooth manifold where the balance laws are enforced to control the curvature of the shell. The resultant computer algorithm enjoys both the speed of the nonlinear reduced-order solver and the fidelity of the solutions at locations where it matters.
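A minimal sketch, under stated assumptions, of the encode-pool-decode structure this abstract describes: one round of neighborhood aggregation over a fixed mesh, mean pooling to a latent point, and decoding back to a full nodal solution. The class GraphSolutionAE, the normalized adjacency a_hat, and the feature layout (displacement plus director per node) are illustrative assumptions, not the paper's graph isomorphism encoder.

```python
# Illustrative sketch: encode a finite element solution stored on a fixed mesh
# (node features x = [displacement, director], normalized adjacency a_hat) to a
# latent point, then decode the latent point back to a full nodal solution.
import torch
import torch.nn as nn

class GraphSolutionAE(nn.Module):
    def __init__(self, n_nodes, dim_node=6, dim_latent=4, width=64):
        super().__init__()
        self.enc = nn.Linear(dim_node, width)
        self.to_latent = nn.Linear(width, dim_latent)
        self.dec = nn.Sequential(nn.Linear(dim_latent, width), nn.ReLU(),
                                 nn.Linear(width, n_nodes * dim_node))
        self.n_nodes, self.dim_node = n_nodes, dim_node

    def forward(self, x, a_hat):
        h = torch.relu(a_hat @ self.enc(x))   # one round of neighbor aggregation
        z = self.to_latent(h.mean(dim=0))     # pool nodes to one latent point
        x_hat = self.dec(z).view(self.n_nodes, self.dim_node)  # decoded solution
        return z, x_hat
```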
  4. Hyperbolic neural networks have been popular in the recent past due to their ability to represent hierarchical data sets effectively and efficiently. The challenge in developing these networks lies in the nonlinearity of the embedding space, namely the hyperbolic space. Hyperbolic space is a homogeneous Riemannian manifold of the Lorentz group, which is a semi-Riemannian manifold, i.e., a manifold equipped with an indefinite metric. Most existing methods (with some exceptions) use local linearization to define a variety of operations paralleling those used in traditional deep neural networks in Euclidean spaces. In this paper, we present a novel fully hyperbolic neural network which uses the concept of projections (embeddings) followed by an intrinsic aggregation and a nonlinearity, all within the hyperbolic space. The novelty here lies in the projection, which is designed to project data onto a lower-dimensional embedded hyperbolic space and hence leads to a nested hyperbolic space representation that is independently useful for dimensionality reduction. The main theoretical contribution is that the proposed embedding is proved to be isometric and equivariant under the Lorentz transformations, which are the natural isometric transformations in hyperbolic spaces. This projection is computationally efficient, since it can be expressed by simple linear operations, and, due to the aforementioned equivariance property, it allows for weight sharing. The nested hyperbolic space representation is the core component of our network and therefore we first compare this representation, independent of the network, with other dimensionality reduction methods such as tangent PCA, principal geodesic analysis (PGA) and HoroPCA. Based on this equivariant embedding, we develop a novel fully hyperbolic graph convolutional neural network architecture to learn the parameters of the projection. Finally, we present experiments demonstrating comparative performance of our network on several publicly available data sets.
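For orientation, here is a small numpy sketch of the Lorentz (hyperboloid) model that such networks operate in. The coordinate-dropping "projection" below is only a crude illustrative surrogate; the paper's projection is learned and Lorentz-equivariant.

```python
# Small numpy sketch of the Lorentz (hyperboloid) model. The coordinate-dropping
# projection is a crude surrogate for illustration, not the learned projection.
import numpy as np

def lorentz_inner(x, y):
    # Indefinite Minkowski inner product <x, y>_L = -x0*y0 + sum_i xi*yi
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def lift_to_hyperboloid(v):
    # Embed a Euclidean vector v as a point of H^n = {x : <x,x>_L = -1, x0 > 0}
    return np.concatenate([[np.sqrt(1.0 + v @ v)], v])

def project_to_nested(x, k):
    # Keep the first k spatial coordinates and re-lift, landing on a nested H^k
    return lift_to_hyperboloid(x[1:k + 1])

x = lift_to_hyperboloid(np.random.randn(5))  # a point on H^5
assert np.isclose(lorentz_inner(x, x), -1.0)
y = project_to_nested(x, 2)                  # its image on the nested H^2
assert np.isclose(lorentz_inner(y, y), -1.0)
```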
  5. Kernel dimensionality reduction (KDR) algorithms find a low dimensional representation of the original data by optimizing kernel dependency measures that are capable of capturing nonlinear relationships. The standard strategy is to first map the data into a high dimensional feature space using kernels prior to a projection onto a low dimensional space. While KDR methods can be easily solved by keeping the most dominant eigenvectors of the kernel matrix, their features are no longer easy to interpret. Alternatively, Interpretable KDR (IKDR) is different in that it projects onto a subspace before the kernel feature mapping; therefore, the projection matrix can indicate how the original features linearly combine to form the new features. Unfortunately, the IKDR objective requires a non-convex manifold optimization that is difficult to solve and can no longer be solved by eigendecomposition. Recently, an efficient iterative spectral (eigendecomposition) method (ISM) has been proposed for this objective in the context of alternative clustering. However, ISM only provides theoretical guarantees for the Gaussian kernel. This greatly constrains ISM's usage, since any kernel method using ISM is limited to a single kernel. This work extends the theoretical guarantees of ISM to an entire family of kernels, thereby empowering ISM to solve any kernel method of the same objective. In identifying this family, we prove that each kernel within the family has a surrogate Φ matrix and the optimal projection is formed by its most dominant eigenvectors. With this extension, we establish how a wide range of IKDR applications across different learning paradigms can be solved by ISM. To support reproducible results, the source code is made publicly available at https://github.com/ANONYMIZED
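A schematic sketch of an ISM-style fixed-point iteration for the Gaussian kernel, not the released implementation: Psi stands in for the objective's fixed dependency-weight matrix, and the choice of extremal eigenvectors depends on whether the dependency measure is maximized or minimized.

```python
# Schematic ISM-style fixed-point iteration for the Gaussian kernel (illustrative).
# Psi is a placeholder for the objective's fixed dependency-weight matrix.
import numpy as np

def ism(X, Psi, q, sigma=1.0, n_iter=50, tol=1e-6):
    n, d = X.shape
    W = np.linalg.qr(np.random.randn(d, q))[0]   # random orthonormal start
    for _ in range(n_iter):
        Y = X @ W                                # project before the kernel map
        sq = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        K = np.exp(-sq / (2 * sigma ** 2))       # Gaussian kernel on X @ W
        Gamma = Psi * K                          # elementwise dependency weights
        L = np.diag(Gamma.sum(axis=1)) - Gamma   # graph Laplacian of the weights
        Phi = X.T @ L @ X                        # surrogate Phi matrix
        vals, vecs = np.linalg.eigh(Phi)
        W_new = vecs[:, :q]  # extremal eigenvectors; which end depends on the objective
        if np.linalg.norm(W_new @ W_new.T - W @ W.T) < tol:
            return W_new
        W = W_new
    return W
```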