Title: Decoupling Simulation Accuracy from Mesh Quality
For a given PDE problem, three main factors affect the accuracy of FEM solutions: basis order, mesh resolution, and mesh element quality. The first two factors are easy to control, while controlling element shape quality is a challenge, with fundamental limitations on what can be achieved. We propose to use p-refinement (increasing element degree) to decouple the approximation error of the finite element method from the domain mesh quality for elliptic PDEs. Our technique produces an accurate solution even on meshes with badly shaped elements, at a slightly higher running time due to the higher cost of high-order elements. We demonstrate that it automatically adapts the basis to badly shaped elements, ensuring an error consistent with high-quality meshing, without any per-mesh parameter tuning. On high-quality meshes, our construction reduces to traditional fixed-degree FEM with identical performance. Our construction decreases the burden on meshing algorithms, reducing the need for often expensive mesh optimization, and automatically compensates for badly shaped elements, which are present due to boundary constraints or limitations of current meshing methods. By tackling mesh generation and finite element simulation jointly, we obtain a pipeline that is both more efficient and more robust than combinations of existing state-of-the-art meshing and FEM algorithms.
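To make the idea concrete, here is a minimal sketch (not the paper's implementation) of a quality-driven p-refinement rule: measure each element's shape quality and assign a higher polynomial degree to badly shaped elements. The quality measure, thresholds, and degree range below are illustrative assumptions.

```python
import numpy as np

def tet_quality(vertices):
    """Shape measure for a tetrahedron, normalized so a regular tet scores 1
    and slivers score near 0. (One of several common quality measures.)"""
    v0, v1, v2, v3 = vertices
    e1, e2, e3 = v1 - v0, v2 - v0, v3 - v0
    vol6 = np.dot(e1, np.cross(e2, e3))  # 6 * signed volume
    lmax = max(np.linalg.norm(e) for e in
               (e1, e2, e3, v2 - v1, v3 - v1, v3 - v2))
    return np.sqrt(2.0) * abs(vol6) / lmax**3

def assign_degrees(mesh_tets, mesh_verts, p_min=1, p_max=4,
                   thresholds=(0.5, 0.2, 0.05)):
    """Map element quality to polynomial degree: worse shape -> higher p.

    Well-shaped elements keep the base degree, so the scheme reduces to
    fixed-degree FEM on a high-quality mesh."""
    degrees = []
    for tet in mesh_tets:
        q = tet_quality(mesh_verts[tet])
        p = p_min
        for t in thresholds:  # each threshold crossed bumps the degree
            if q < t:
                p += 1
        degrees.append(min(p, p_max))
    return degrees
```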
Award ID(s):
1835712 1652515
NSF-PAR ID:
10080686
Author(s) / Creator(s):
Date Published:
Journal Name:
ACM Transactions on Graphics
ISSN:
0730-0301
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. The Finite Element Method (FEM) is widely used to solve discrete Partial Differential Equations (PDEs) in engineering and graphics applications. The popularity of FEM led to the development of a large family of variants, most of which require a tetrahedral or hexahedral mesh to construct the basis. While the theoretical properties of FEM bases (such as convergence rate, stability, etc.) are well understood under specific assumptions on mesh quality, their practical performance, influenced both by the choice of basis construction and by the quality of mesh generation, has not been systematically documented for large collections of automatically meshed 3D geometries. We introduce a set of benchmark problems involving the most commonly solved elliptic PDEs, starting from simple cases with an analytical solution, moving to commonly used test problem setups, and using manufactured solutions for thousands of real-world, automatically meshed geometries. For all these cases, we use state-of-the-art meshing tools to create both tetrahedral and hexahedral meshes, and compare the performance of different element types for common elliptic PDEs. The goal of this benchmark is to enable comparison of complete FEM pipelines, from mesh generation to algebraic solver, and exploration of the relative impact of different factors on overall system performance. As a specific application of our geometry and benchmark dataset, we explore the relative advantages of unstructured (triangular/tetrahedral) and structured (quadrilateral/hexahedral) discretizations. We observe that for Lagrange-type elements, while linear tetrahedral elements perform poorly, quadratic tetrahedral elements perform as well as or better than hexahedral elements for our set of problems and currently available mesh generation algorithms. This observation suggests that for common problems in structural analysis, thermal analysis, and low-Reynolds-number flows, high-quality results can be obtained with unstructured tetrahedral meshes, which can be created robustly and automatically. We release the description of the benchmark problems, meshes, and a reference implementation of our testing infrastructure to enable statistically significant comparisons between different FE methods, which we hope will be helpful in the development of new meshing and FEA techniques.
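As one illustration of the manufactured-solution methodology such a benchmark relies on, the SymPy sketch below derives the source term and boundary data for a chosen exact solution of the Poisson problem; the particular solution field is an arbitrary example, not one taken from the benchmark.

```python
import sympy as sp

x, y, z = sp.symbols("x y z")

# Pick any smooth field as the "manufactured" exact solution.
u_exact = sp.sin(sp.pi * x) * sp.cos(sp.pi * y) * sp.exp(z)

# For -Laplace(u) = f, the matching source term follows by differentiation;
# Dirichlet data is u_exact restricted to the boundary.
f = -sp.diff(u_exact, x, 2) - sp.diff(u_exact, y, 2) - sp.diff(u_exact, z, 2)

# Numeric callables for assembling the RHS and evaluating error norms, e.g.
#   err_L2 = sqrt( sum_q w_q * (u_h(p_q) - u_fn(*p_q))**2 )
u_fn = sp.lambdify((x, y, z), u_exact, "numpy")
f_fn = sp.lambdify((x, y, z), f, "numpy")
```

Because the exact solution is known everywhere, the same error norm can be evaluated on any tetrahedral or hexahedral discretization, which is what makes cross-element comparisons possible.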
  2. Osteoarthritis of the knee is increasingly prevalent as our population ages, representing a growing financial burden and severely impacting quality of life. The invasiveness of in vivo procedures and the high cost of cadaveric studies have left computational tools uniquely suited to study knee biomechanics. Developments in deep learning have great potential for efficiently generating large-scale datasets to enable researchers to perform population-sized investigations, but the time and effort associated with producing robust hexahedral meshes has been a limiting factor in expanding finite element studies to encompass a population. Here we developed a fully automated pipeline capable of taking magnetic resonance knee images and producing a working finite element simulation. We trained an encoder-decoder convolutional neural network to perform semantic image segmentation on the Imorphics dataset provided through the Osteoarthritis Initiative. The Imorphics dataset contains 176 image sequences with varying levels of cartilage degradation. Starting from an open-source swept-extrusion meshing algorithm, we extended it until it could produce high-quality meshes for every sequence, and we applied a template-mapping procedure to automatically place soft-tissue attachment points. The meshing algorithm produced simulation-ready meshes for all 176 sequences, whether provided (manually reconstructed) or predicted (automatically generated) segmentation labels were used. The average time to mesh all bones and cartilage tissues was less than 2 min per knee on an AMD Ryzen 5600X processor, using a parallel pool of three workers for bone meshing, followed by a pool of four workers meshing the four cartilage tissues. Of the 176 sequences with provided segmentation labels, 86% of the resulting meshes completed a simulated flexion-extension activity. We used a reserved testing dataset of 28 sequences unseen during network training to produce simulations derived from predicted labels. We compared tibiofemoral contact mechanics between manual and automated reconstructions for the 24 pairs of successful finite element simulations from this set, obtaining mean root-mean-squared differences under 20% of their respective min-max norms (see the sketch below). In combination with further advancements in deep learning, this framework represents a feasible pipeline for producing population-sized finite element studies of the natural knee from subject-specific models.
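One plausible reading of the reported error metric, an RMS difference normalized by the reference curve's min-max range, is sketched below; the exact normalization used in the study may differ.

```python
import numpy as np

def rms_diff_percent(manual, automated):
    """RMS difference between two contact-mechanics time series, expressed
    as a percentage of the manual (reference) curve's min-max range."""
    manual = np.asarray(manual, dtype=float)
    automated = np.asarray(automated, dtype=float)
    rmsd = np.sqrt(np.mean((manual - automated) ** 2))
    span = manual.max() - manual.min()  # the min-max norm of the reference
    return 100.0 * rmsd / span
```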
  3. We combine theoretical results from polytope domain meshing, generalized barycentric coordinates, and finite element exterior calculus to construct scalar- and vector-valued basis functions for conforming finite element methods on generic convex polytope meshes in dimensions 2 and 3. Our construction recovers well-known bases for the lowest-order Nédélec, Raviart–Thomas, and Brezzi–Douglas–Marini elements on simplicial meshes and generalizes the notion of Whitney forms to non-simplicial convex polygons and polyhedra. We show that our basis functions lie in the correct function space with regard to global continuity and that they reproduce the requisite polynomial differential forms described by finite element exterior calculus. We present a method to count the number of basis functions required to ensure these two key properties.
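For reference, the classical lowest-order Whitney 1-forms on a simplex, which this construction generalizes to convex polytopes, can be written directly in terms of barycentric coordinates as W_ij = lam_i grad(lam_j) - lam_j grad(lam_i); the sketch below evaluates them on a triangle.

```python
import numpy as np

def whitney_edge_basis(tri, point):
    """Evaluate the three lowest-order Whitney 1-forms (edge functions)
    on a triangle:  W_ij = lam_i * grad(lam_j) - lam_j * grad(lam_i).

    tri:   (3, 2) array of vertex coordinates
    point: evaluation point in the plane
    Returns a list of three 2-vectors, one per edge (0,1), (1,2), (2,0)."""
    A = np.hstack([np.ones((3, 1)), np.asarray(tri, float)])  # rows [1, x_i, y_i]
    C = np.linalg.inv(A)                 # lam_i(p) = C[0,i] + C[1:,i] . p
    lam = C.T @ np.array([1.0, *point])  # barycentric coordinates at `point`
    grad = C[1:, :].T                    # row i is the constant grad(lam_i)
    edges = [(0, 1), (1, 2), (2, 0)]
    return [lam[i] * grad[j] - lam[j] * grad[i] for i, j in edges]
```

On a simplex these functions have constant tangential trace along their edge and vanishing tangential trace on the other edges, which is the conformity property the paper extends to general convex polytopes.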
  4. Faithful, accurate, and successful cardiac biomechanics and electrophysiological simulations require patient-specific geometric models of the heart. Since the cardiac geometry consists of highly curved boundaries, the use of high-order meshes with curved elements ensures that the various curves and features present in the cardiac geometry are well captured and preserved in the corresponding mesh. Most existing mesh generation techniques require computer-aided design files to represent the geometric boundary, which are often not available for biomedical applications. Unlike such methods, our technique takes a high-order surface mesh, generated from patient medical images, as input and generates a high-order volume mesh directly from the curved surface mesh. In this paper, we use our direct high-order curvilinear tetrahedral mesh generation method [1] to generate several second-order cardiac meshes. Our meshes include the left ventricle myocardia of a healthy heart and of hearts with dilated and hypertrophic cardiomyopathy. We show that our high-order cardiac meshes contain no inverted elements and are of sufficiently high quality for use in cardiac finite element simulations.
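A common way to detect inverted curved elements, consistent with the validity claim above though not necessarily the authors' procedure, is to sample the determinant of the Jacobian of the reference-to-physical map. The sketch below does this for a 10-node quadratic tetrahedron; the node ordering and random sampling are assumptions, and a certified check would need, e.g., Bézier-coefficient bounds.

```python
import numpy as np

# Assumed node ordering: 4 corners, then midpoints of edges
# (0,1), (1,2), (2,0), (0,3), (1,3), (2,3) -- orderings vary between codes.
EDGES = [(0, 1), (1, 2), (2, 0), (0, 3), (1, 3), (2, 3)]

def shape_grads(bary):
    """Reference-space gradients of the 10 quadratic tet shape functions.

    bary: barycentric coordinates (lam0..lam3) of the evaluation point;
    reference coordinates are (xi, eta, zeta) = (lam1, lam2, lam3)."""
    dl = np.array([[-1, -1, -1], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
    g = [(4 * bary[i] - 1) * dl[i] for i in range(4)]                 # corners
    g += [4 * (bary[i] * dl[j] + bary[j] * dl[i]) for i, j in EDGES]  # edges
    return np.array(g)                                                # (10, 3)

def min_sampled_jacobian(nodes, n_samples=200, seed=0):
    """Smallest sampled det(J) of the curved map; <= 0 flags inversion.

    nodes: (10, 3) physical coordinates of a quadratic tetrahedron.
    Random sampling is only a spot check, not a certified bound."""
    rng = np.random.default_rng(seed)
    best = np.inf
    for _ in range(n_samples):
        bary = rng.dirichlet(np.ones(4))             # random interior point
        J = np.asarray(nodes).T @ shape_grads(bary)  # J[i,j] = dx_i / dxi_j
        best = min(best, np.linalg.det(J))
    return best
```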
  5. Purpose: The purpose of this paper is twofold: to reduce the computation time by a factor of 1,000 or more compared to known numerical techniques for real-world problems with complex interfaces, and to simplify the solution by using trivial unfitted Cartesian meshes (no need for complicated mesh generators for complex geometry). Design/methodology/approach: This study extends the recently developed optimal local truncation error method (OLTEM) for the Poisson equation with constant coefficients to the much more general case of discontinuous coefficients, which applies to domains with different material properties (e.g. different inclusions, multi-material structural components, etc.). The study develops OLTEM using compact 9-point and 25-point stencils that are similar to those for linear and quadratic finite elements. In contrast to finite elements and other known numerical techniques for interface problems with conformed and unfitted meshes, OLTEM with 9-point and 25-point stencils on unfitted Cartesian meshes provides 3rd- and 11th-order accuracy for irregular interfaces, respectively; i.e. an increase in accuracy of eight orders for the new 'quadratic' elements compared to known techniques at similar computational cost. There are no unknowns on interfaces between different materials, and the structure of the global discrete system is the same for homogeneous and heterogeneous materials (only the values of the stencil coefficients differ). The calculation of the unknown stencil coefficients is based on the minimization of the local truncation error of the stencil equations and yields the optimal order of accuracy of OLTEM at a given stencil width. Numerical results with irregular interfaces show that at the same number of degrees of freedom, OLTEM with 9-point stencils is even more accurate than 4th-order finite elements, and OLTEM with 25-point stencils is much more accurate than 7th-order finite elements with much wider stencils and conformed meshes. Findings: OLTEM increases accuracy by one order for 'linear' elements and by eight orders for 'quadratic' elements compared to known techniques. This leads to a large reduction in computation time for problems with complex irregular interfaces. The use of trivial unfitted Cartesian meshes significantly simplifies the solution and reduces data-preparation time (no need for complicated mesh generators for complex geometry). Originality/value: Such a large increase in accuracy over existing methods has not previously been reported in the literature. Due to its high accuracy, the proposed technique allows the direct solution of multiscale problems without scale separation.
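To give a flavor of the stencil-coefficient idea in its simplest constant-coefficient, uniform-grid setting (not the published OLTEM formulation for discontinuous coefficients), the sketch below solves for 9-point stencil weights that reproduce the Laplacian exactly on low-degree monomials, which is the exactness condition that minimizing the local truncation error enforces.

```python
import numpy as np
from itertools import product

def stencil_coeffs(h=1.0, degree=3):
    """Weights for a compact 9-point stencil that reproduce the Laplacian
    exactly on all monomials x^i y^j of total degree <= `degree`.

    A simplified stand-in for truncation-error minimization, not the
    published OLTEM construction."""
    pts = [(i * h, j * h) for i, j in product((-1, 0, 1), repeat=2)]
    monos = [(i, j) for i in range(degree + 1) for j in range(degree + 1 - i)]
    # One row per monomial: sum_k c_k * p(x_k, y_k) must equal Laplacian(p)(0).
    A = np.array([[x**i * y**j for (x, y) in pts] for (i, j) in monos])
    # Laplacian of x^i y^j at the origin is nonzero (= 2) only for x^2 and y^2.
    b = np.array([2.0 if (i, j) in ((2, 0), (0, 2)) else 0.0
                  for (i, j) in monos])
    c, *_ = np.linalg.lstsq(A, b, rcond=None)
    return dict(zip(pts, c))
```

With degree=3 this yields a second-order-accurate stencil family (the classical five-point weights are one solution); raising the enforced exactness degree, as OLTEM does with wider stencils and careful minimization of the remaining truncation-error terms, is what increases the order of accuracy.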