- NSF-PAR ID:
- 10354701
- Date Published:
- Journal Name:
- Numerische Mathematik
- Volume:
- 150
- Issue:
- 4
- ISSN:
- 0029-599X
- Page Range / eLocation ID:
- 929 to 974
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
Abstract We construct new families of direct serendipity and direct mixed finite elements on general planar, strictly convex polygons that are H^1 and H(div) conforming, respectively, and possess optimal order of accuracy for any order. They have a minimal number of degrees of freedom subject to the conformity and accuracy constraints. The name arises because the shape functions are defined directly on the physical elements, i.e., without using a mapping from a reference element. The finite element shape functions are defined to be the full spaces of scalar or vector polynomials plus a space of supplemental functions. The direct serendipity elements are the precursors of the direct mixed elements in a de Rham complex. The convergence properties of the finite elements are shown under a regularity assumption on the shapes of the polygons in the mesh, as well as some mild restrictions on the choices one can make in the construction of the supplemental functions. Numerical experiments on various meshes exhibit the performance of these new families of finite elements.
-
We present an implementation of the trimmed serendipity finite element family, using the open-source finite element package Firedrake. The new elements can be used seamlessly within the software suite for problems requiring H^1, H(curl), or H(div)-conforming elements on meshes of squares or cubes. To test how well trimmed serendipity elements perform in comparison to traditional tensor product elements, we perform a sequence of numerical experiments including the primal Poisson, mixed Poisson, and Maxwell cavity eigenvalue problems. Overall, we find that the trimmed serendipity elements converge, as expected, at the same rate as the respective tensor product elements, while being able to offer significant savings in the time or memory required to solve certain problems.
-
We introduce the family of trimmed serendipity finite element differential form spaces, defined on cubical meshes in any number of dimensions, for any polynomial degree, and for any form order. The relation between the trimmed serendipity family and the (non-trimmed) serendipity family developed by Arnold and Awanou [Math. Comp. 83(288) 2014] is analogous to the relation between the trimmed and (non-trimmed) polynomial finite element differential form families on simplicial meshes from finite element exterior calculus. We provide degrees of freedom in the general setting and prove that they are unisolvent for the trimmed serendipity spaces. The sequence of trimmed serendipity spaces with a fixed polynomial order r provides an explicit example of a system described by Christiansen and Gillette [ESAIM:M2AN 50(3) 2016], namely, a minimal compatible finite element system on squares or cubes containing order r-1 polynomial differential forms.
-
Finite element methods for electromagnetic problems modeled by Maxwell-type equations are highly sensitive to the conformity of approximation spaces, and non-conforming methods may cause loss of convergence. This fact leads to an essential obstacle for almost all the interface-unfitted mesh methods in the literature regarding the application to electromagnetic interface problems, as they are based on non-conforming spaces. In this work, a novel immersed virtual element method for solving a three-dimensional (3D) H(curl) interface problem is developed, and the motivation is to combine the conformity of virtual element spaces and robust approximation capabilities of immersed finite element spaces. The proposed method is able to achieve optimal convergence. To develop a systematic framework, the H^1, H(curl) and H(div) interface problems and their corresponding problem-orientated immersed virtual element spaces are considered all together. In addition, the de Rham complex will be established based on which the Hiptmair–Xu (HX) preconditioner can be used to develop a fast solver for the H(curl) interface problem.
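For reference, the three-dimensional de Rham complex underlying the H(curl) interface problem and the HX preconditioner mentioned above is the standard sequence of Sobolev spaces linked by the gradient, curl, and divergence operators (the same sequence whose discrete analogue the conforming spaces must reproduce):

```latex
\begin{equation*}
H^1(\Omega) \xrightarrow{\ \nabla\ }
H(\mathrm{curl};\Omega) \xrightarrow{\ \nabla\times\ }
H(\mathrm{div};\Omega) \xrightarrow{\ \nabla\cdot\ }
L^2(\Omega)
\end{equation*}
```

Each operator maps its space into the next, and the composition of consecutive operators vanishes (curl of a gradient and divergence of a curl are both zero), which is the exactness structure that conforming discretizations preserve.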
-
Abstract We consider the problem of covering multiple submodular constraints. Given a finite ground set N, a weight function $w: N \rightarrow \mathbb{R}_+$, r monotone submodular functions $f_1, f_2, \ldots, f_r$ over N, and requirements $k_1, k_2, \ldots, k_r$, the goal is to find a minimum weight subset $S \subseteq N$ such that $f_i(S) \ge k_i$ for $1 \le i \le r$. We refer to this problem as Multi-Submod-Cover; it was recently considered by Har-Peled and Jones (Few cuts meet many point sets, CoRR, arXiv:1808.03260, 2018), who were motivated by an application in geometry. Even with $r = 1$, Multi-Submod-Cover generalizes the well-known Submodular Set Cover problem (Submod-SC), and it can also be easily reduced to Submod-SC. A simple greedy algorithm gives an $O(\log (kr))$ approximation, where $k = \sum_i k_i$, and this ratio cannot be improved in the general case. In this paper, motivated by several concrete applications, we consider two ways to improve upon the approximation given by the greedy algorithm. First, we give a bicriteria approximation algorithm for Multi-Submod-Cover that covers each constraint to within a factor of $(1 - 1/e - \varepsilon)$ while incurring an approximation of $O(\frac{1}{\varepsilon} \log r)$ in the cost. Second, we consider the special case when each $f_i$ is obtained from a truncated coverage function and obtain an algorithm that generalizes previous work on partial set cover (Partial-SC), covering integer programs (CIPs), and multiple vertex cover constraints (Bera et al., Theoret Comput Sci 555:2–8, 2014). Both these algorithms are based on mathematical programming relaxations that avoid the limitations of the greedy algorithm. We demonstrate the implications of our algorithms and related ideas to several applications ranging from geometric covering problems to clustering with outliers. Our work highlights the utility of the high-level model and the lens of submodularity in addressing this class of covering problems.
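The greedy algorithm the abstract refers to can be sketched concretely for the $r = 1$ case (Submodular Set Cover): repeatedly pick the element with the best marginal gain per unit of weight until the requirement is met. The sketch below uses a truncated coverage function as the submodular $f$; the element names, covered-point sets, and weights are hypothetical example data, not from the paper.

```python
def greedy_submod_cover(ground_set, w, f, k):
    """Greedy sketch for Submod-SC: pick S with f(S) >= k.

    At each step, choose the element maximizing marginal gain in f
    per unit of weight -- the rule behind the O(log k) guarantee.
    """
    S = set()
    while f(S) < k:
        best = max(
            (e for e in ground_set - S if f(S | {e}) > f(S)),
            key=lambda e: (f(S | {e}) - f(S)) / w[e],
        )
        S.add(best)
    return S


# Hypothetical instance: each element covers a set of points.
points_covered = {
    "a": {1, 2, 3},
    "b": {3, 4},
    "c": {5, 6, 7, 8},
}
w = {"a": 1.0, "b": 1.0, "c": 2.0}
k = 6


def f(S):
    """Truncated coverage function: points covered by S, capped at k."""
    covered = set().union(*(points_covered[e] for e in S)) if S else set()
    return min(len(covered), k)


S = greedy_submod_cover(set(points_covered), w, f, k)
print(sorted(S), f(S))  # a cover meeting the requirement f(S) >= k
```

On this instance the greedy rule first takes "a" (gain 3 per unit weight), then "c" (gain 1.5 per unit weight, beating "b"), reaching the requirement. Truncating the coverage count at k is what makes the function match the Multi-Submod-Cover formulation, where covering a constraint beyond its requirement $k_i$ earns no extra credit.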