

This content will become publicly available on December 9, 2025

Title: Comparison of variational discretizations for a convection-diffusion problem
For a model convection-diffusion problem, we obtain new error estimates for a general upwinding finite element discretization based on bubble modification of the test space. The key analysis tool is finding representations of the optimal norms on the trial spaces at the continuous and discrete levels. We analyze and compare three methods: the standard linear discretization, the saddle point least squares method, and the upwinding Petrov-Galerkin method. We conclude that the bubble upwinding Petrov-Galerkin method is the best-performing discretization for the one-dimensional model. Our results for the model convection-diffusion problem can be extended to create new and efficient discretizations for the multidimensional case.
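As a hedged illustration of the upwinding idea (not the paper's bubble construction, which modifies the finite element test space), the effect of stabilization on a one-dimensional model -eps*u'' + u' = 1 with homogeneous Dirichlet conditions can be sketched with the classical artificial-diffusion finite difference scheme; the function name, right-hand side, and parameters below are illustrative assumptions:

```python
import numpy as np

def solve_convection_diffusion(eps, n, upwind=True):
    """Solve -eps*u'' + u' = 1 on (0,1), u(0)=u(1)=0, on a uniform mesh.

    upwind=True adds the classical artificial-diffusion (full upwind)
    stabilization; upwind=False is the standard centered scheme, which
    corresponds to the linear Galerkin discretization on uniform meshes.
    """
    h = 1.0 / n
    # Full upwinding of the convection term is equivalent to centered
    # differencing with the effective diffusion eps + h/2.
    d = eps + (h / 2.0 if upwind else 0.0)
    # Tridiagonal system for the interior nodes x_1 .. x_{n-1}.
    main = (2.0 * d / h**2) * np.ones(n - 1)
    lower = (-d / h**2 - 1.0 / (2.0 * h)) * np.ones(n - 2)
    upper = (-d / h**2 + 1.0 / (2.0 * h)) * np.ones(n - 2)
    A = np.diag(main) + np.diag(lower, -1) + np.diag(upper, 1)
    u = np.linalg.solve(A, np.ones(n - 1))
    return np.concatenate(([0.0], u, [0.0]))
```

When the mesh Peclet number h/(2*eps) exceeds 1, the centered scheme oscillates near the outflow layer while the upwinded scheme stays monotone, at the price of extra numerical diffusion.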
Award ID(s):
2011615
PAR ID:
10560479
Author(s) / Creator(s):
; ;
Corporate Creator(s):
Editor(s):
Beznea, Lucian; Putinar, Mihai
Publisher / Repository:
Romanian Academy, Publishing House of the Romanian Academy
Date Published:
Journal Name:
Revue Roumaine de Mathématiques Pures et Appliquées
Volume:
Tome LXIX
Issue:
3-4
ISSN:
0035-3965
Page Range / eLocation ID:
327-351
Subject(s) / Keyword(s):
Petrov-Galerkin; upwinding; convection dominated problem; singularly perturbed problems
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. We consider a model convection-diffusion problem and present useful connections between the finite difference and finite element discretization methods. We introduce a general upwinding Petrov-Galerkin discretization based on bubble modification of the test space and connect the method with the general upwinding approach used in finite difference discretization. We write the finite difference and finite element systems so that the two corresponding linear systems have the same stiffness matrices, and compare the right-hand side load vectors of the two methods. This new approach allows for improving well-known upwinding finite difference methods and for obtaining new error estimates. We prove that the exponential bubble Petrov-Galerkin discretization can recover the interpolant of the exact solution. As a consequence, we estimate the closeness of the related finite difference solutions to the interpolant. The ideas we present in this work can lead to building efficient new discretization methods for multidimensional convection dominated problems.
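The claim that an exponential-bubble Petrov-Galerkin discretization recovers the interpolant has a classical finite difference counterpart: the exponentially fitted (Il'in-Allen-Southwell) scheme, whose fitted diffusion makes the discrete operator exact on span{1, x, exp(x/eps)}. The sketch below, with illustrative data b = 1 and f = 1 (the precise correspondence to the bubble construction is an assumption based on the abstract's description), shows nodal exactness for the model problem:

```python
import numpy as np

def solve_fitted(eps, n):
    """Exponentially fitted (Il'in-Allen-Southwell) scheme for
    -eps*u'' + u' = 1 on (0,1), u(0)=u(1)=0, on a uniform mesh.

    The fitted diffusion d = (h/2)*coth(h/(2*eps)) annihilates the
    discrete residual of 1, x, and exp(x/eps), so the nodal values
    reproduce the exact solution (its interpolant) up to roundoff.
    """
    h = 1.0 / n
    d = (h / 2.0) / np.tanh(h / (2.0 * eps))
    main = (2.0 * d / h**2) * np.ones(n - 1)
    off_lo = (-d / h**2 - 1.0 / (2.0 * h)) * np.ones(n - 2)
    off_hi = (-d / h**2 + 1.0 / (2.0 * h)) * np.ones(n - 2)
    A = np.diag(main) + np.diag(off_lo, -1) + np.diag(off_hi, 1)
    u = np.linalg.solve(A, np.ones(n - 1))
    return np.concatenate(([0.0], u, [0.0]))

def exact(x, eps):
    # u(x) = x - (exp(x/eps)-1)/(exp(1/eps)-1), rewritten in an
    # overflow-safe form using exp((x-1)/eps).
    return x - (np.exp((x - 1.0) / eps) - np.exp(-1.0 / eps)) / (1.0 - np.exp(-1.0 / eps))
```

Even on a coarse mesh with a strong boundary layer, the fitted scheme hits the exact solution at every node, which is the discrete analogue of recovering the interpolant.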
  2. We consider a model convection-diffusion problem and present our recent analysis and numerical results regarding the mixed finite element formulation and discretization in the singularly perturbed case, when the convection term dominates the problem. Using the concepts of optimal norm and saddle point reformulation, we find new error estimates for the case of uniform meshes. We compare the standard linear Galerkin discretization to a saddle point least squares discretization that uses quadratic test functions, and explain the non-physical oscillations of the discrete solutions. We also relate a known upwinding Petrov-Galerkin method and the streamline diffusion discretization method by emphasizing the resulting linear systems and by comparing appropriate error norms. The results can be extended to the multidimensional case in order to find efficient approximations for more general singularly perturbed problems, including convection dominated models.
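One way to see where the standard linear Galerkin discretization's non-physical oscillations come from: on a uniform mesh its linear system, after dividing by h, coincides with the centered finite difference scheme, which loses its M-matrix property once h/(2*eps) > 1. A minimal assembly sketch (the coefficient b = 1 and f = 1 are illustrative assumptions):

```python
import numpy as np

def assemble_p1_galerkin(eps, n, b=1.0):
    """Assemble the P1 Galerkin system for -eps*u'' + b*u' = 1 on (0,1)
    with homogeneous Dirichlet conditions, on a uniform mesh of n cells.
    Returns the (n-1)x(n-1) interior system matrix and load vector.
    """
    h = 1.0 / n
    m = n - 1
    A = np.zeros((m, m))
    # Assembled entries for hat functions: diffusion gives the usual
    # eps/h * tridiag(-1, 2, -1); convection gives b/2 * tridiag(-1, 0, 1)
    # (these element integrals are exact for piecewise linears).
    for i in range(m):
        A[i, i] = 2.0 * eps / h
        if i + 1 < m:
            A[i, i + 1] = -eps / h + b / 2.0
            A[i + 1, i] = -eps / h - b / 2.0
    f = h * np.ones(m)  # load for f(x) = 1: each hat function integrates to h
    return A, f
```

Dividing the assembled system by h reproduces the centered difference equations exactly, so the Galerkin solution inherits the same oscillatory behavior in the convection dominated regime.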
  3. This paper is a continuation of Melenk et al., "Stability analysis for electromagnetic waveguides. Part 1: acoustic and homogeneous electromagnetic waveguides" (2023), extending the stability results for homogeneous electromagnetic (EM) waveguides to the non-homogeneous case. The analysis is done using perturbation techniques for self-adjoint operator eigenproblems. We show that the non-homogeneous EM waveguide problem is well-posed, with the stability constant scaling linearly with the waveguide length L. The results provide a basis for proving convergence of a Discontinuous Petrov-Galerkin (DPG) discretization based on a full envelope ansatz and the ultraweak variational formulation for the resulting modified system of Maxwell equations; see Part 1.
  4. Following Muga and van der Zee (2015), we generalize the standard Discontinuous Petrov-Galerkin (DPG) method, based on Hilbert spaces, to Banach spaces. Numerical experiments using a model 1D convection-dominated diffusion problem are performed and compared with the Hilbert setting. It is shown that the Banach-based method gives solutions less susceptible to the Gibbs phenomenon. h-adaptivity is implemented with the help of the error representation function as an error indicator.
  5. In this paper, we apply the self-attention mechanism from the state-of-the-art Transformer of "Attention Is All You Need" for the first time to a data-driven operator learning problem related to partial differential equations. We explain the heuristics of the attention mechanism and work to improve its efficacy. By employing operator approximation theory in Hilbert spaces, it is demonstrated for the first time that the softmax normalization in the scaled dot-product attention is sufficient but not necessary. Without softmax, the approximation capacity of a linearized Transformer variant can be proved to be comparable, layer-wise, to a Petrov-Galerkin projection, and the estimate is independent of the sequence length. A new layer normalization scheme mimicking the Petrov-Galerkin projection is proposed to allow a scaling to propagate through attention layers, which helps the model achieve remarkable accuracy in operator learning tasks with unnormalized data. Finally, we present three operator learning experiments: the viscid Burgers' equation, an interface Darcy flow, and an inverse interface coefficient identification problem. The newly proposed, simple attention-based operator learner, the Galerkin Transformer, shows significant improvements in both training cost and evaluation accuracy over its softmax-normalized counterparts.
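A hedged sketch of the softmax-free, Galerkin-style attention the abstract describes (function names and normalization placement are simplified assumptions; the published layer differs in details such as learnable projections and multi-head structure):

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Per-row (per-token) normalization to zero mean, unit variance.
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def softmax_attention(Q, K, V):
    # Standard scaled dot-product attention: O(n^2 * d) in sequence length n.
    n, d = Q.shape
    scores = Q @ K.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    w = np.exp(scores)
    w /= w.sum(axis=-1, keepdims=True)
    return w @ V

def galerkin_attention(Q, K, V):
    # Softmax-free variant: normalize K and V, then contract K^T V first.
    # The d x d contraction makes the cost O(n * d^2), linear in n.
    n, d = Q.shape
    return Q @ (layer_norm(K).T @ layer_norm(V)) / n
```

Because Q enters linearly and the K^T V contraction is only d x d, the layer scales linearly with sequence length, which is one source of the training-cost improvement the abstract reports.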