We verify that a large portion of the theory of complex operator spaces and operator algebras (as represented by the 2004 book by the author and Le Merdy, for specificity) transfers to the real case. We point out some of the results that do not work in the real case. We also discuss how the theory and standard constructions interact with the complexification, which is often as important, but sometimes much less obvious. For example, we develop the real case of the theory of operator space multipliers and the operator space centralizer algebra, and discuss how these topics connect with complexification. This turns out to differ in some important details from the complex case. We also characterize real structure in complex operator spaces and give ‘real’ characterizations of some of the most important objects in the subject.
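For orientation, the following is a minimal sketch, in generic notation not taken from the paper, of the standard notion of a reasonable complexification referred to above. A real operator space $$X$$ sits inside a complex operator space $$X_c = X + iX$$, and the complexification is called reasonable when the canonical conjugation $$\theta_X(x+iy) = x - iy$$ is completely isometric, that is,
$$\big\|[x_{ij} + i\,y_{ij}]\big\|_{M_n(X_c)} \;=\; \big\|[x_{ij} - i\,y_{ij}]\big\|_{M_n(X_c)} \quad \text{for all } n \text{ and all } x_{ij}, y_{ij} \in X.$$
By a theorem of Ruan, such a reasonable operator space complexification is unique up to complete isometry.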
Operator space entangling power of quantum dynamics and local operator entanglement growth in dual-unitary circuits
- Award ID(s): 2310227
- PAR ID: 10554776
- Publisher / Repository: American Physical Society
- Date Published:
- Journal Name: Physical Review A
- Volume: 110
- Issue: 5
- ISSN: 2469-9926; PLRAAN
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Abstract: We establish a theory of noncommutative (NC) functions on a class of von Neumann algebras with a particular direct sum property, e.g., $$B({\mathcal H})$$. In contrast to the theory’s origins, we do not rely on appealing to results from the matricial case. We prove that the $$k{\mathrm{th}}$$ directional derivative of any NC function at a scalar point is a $$k$$-linear homogeneous polynomial in its directions. Consequences include the fact that NC functions defined on domains containing scalar points can be uniformly approximated by free polynomials, as well as realization formulas for NC functions bounded on particular sets, e.g., the NC polydisk and NC row ball.
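To fix notation for that statement (the symbols below are generic illustrations, and the paper’s own definitions and normalization may differ), one common way to write the $$k$$th directional derivative of a sufficiently regular NC function $$f$$ at a scalar point $$\lambda 1$$ in a direction $$h$$ is
$$\Delta^k f(\lambda 1)[h,\dots,h] \;=\; \frac{1}{k!}\,\frac{d^k}{dt^k}\Big|_{t=0} f(\lambda 1 + t\,h),$$
and the quoted result says that, as a function of the direction, this expression is a homogeneous free polynomial of degree $$k$$, whose polarization gives the $$k$$-linear dependence on separate directions $$h_1,\dots,h_k$$.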
Operator learning has become a powerful tool in machine learning for modeling complex physical systems governed by partial differential equations (PDEs). Although Deep Operator Networks (DeepONet) show promise, they require extensive data acquisition. Physics-informed DeepONets (PI-DeepONet) mitigate data scarcity but suffer from inefficient training processes. We introduce Separable Operator Networks (SepONet), a novel framework that significantly enhances the efficiency of physics-informed operator learning. SepONet uses independent trunk networks to learn basis functions separately for different coordinate axes, enabling faster and more memory-efficient training via forward-mode automatic differentiation. We provide a universal approximation theorem for SepONet, proving the existence of a separable approximation to any nonlinear continuous operator. Then, we comprehensively benchmark its representational capacity and computational performance against PI-DeepONet. Our results demonstrate SepONet's superior performance across various nonlinear and inseparable PDEs, with SepONet's advantages increasing with problem complexity, dimension, and scale. For 1D time-dependent PDEs, SepONet achieves up to 112× faster training and an 82× reduction in GPU memory usage compared to PI-DeepONet, while maintaining comparable accuracy. For the 2D time-dependent nonlinear diffusion equation, SepONet efficiently handles the complexity, achieving a 6.44% mean relative $$\ell_2$$ test error, while PI-DeepONet fails due to memory constraints. This work paves the way for extreme-scale learning of continuous mappings between infinite-dimensional function spaces.
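As an illustration of the separable-trunk idea described in this abstract, here is a small, hypothetical JAX sketch: one tiny trunk MLP per coordinate axis, a branch MLP for the sampled input function, and a rank-$$r$$ combination of the per-axis basis functions, with a forward-mode (jax.jvp) derivative along $$t$$. All names, network sizes, and details below are assumptions made for illustration, not the SepONet reference implementation.

```python
import jax
import jax.numpy as jnp

def init_mlp(key, sizes):
    """Initialize a plain tanh MLP as a list of (W, b) layer pairs."""
    params = []
    for d_in, d_out in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        params.append((jax.random.normal(sub, (d_in, d_out)) / jnp.sqrt(d_in),
                       jnp.zeros(d_out)))
    return params

def mlp(params, x):
    for W, b in params[:-1]:
        x = jnp.tanh(x @ W + b)
    W, b = params[-1]
    return x @ W + b

r = 16                                   # rank of the separable expansion (illustrative)
k1, k2, k3 = jax.random.split(jax.random.PRNGKey(0), 3)
branch  = init_mlp(k1, [32, 64, r])      # input function sampled at 32 sensor points
trunk_x = init_mlp(k2, [1, 64, r])       # basis functions along the x axis only
trunk_t = init_mlp(k3, [1, 64, r])       # basis functions along the t axis only

def seponet(u_sensors, x, t):
    """u(x, t) ~ sum_i b_i(u) * phi_i(x) * psi_i(t), with separable trunks."""
    b   = mlp(branch,  u_sensors)            # (r,) coefficients from the input function
    phi = mlp(trunk_x, jnp.atleast_1d(x))    # (r,) x-basis values
    psi = mlp(trunk_t, jnp.atleast_1d(t))    # (r,) t-basis values
    return jnp.sum(b * phi * psi)

# Forward-mode derivative with respect to t: only the 1-D t-trunk input is
# perturbed, which is what keeps physics-informed residual terms cheap.
u0 = jnp.sin(jnp.linspace(0.0, jnp.pi, 32))

def d_dt(x, t):
    t = jnp.asarray(t, dtype=jnp.float32)
    return jax.jvp(lambda tt: seponet(u0, x, tt), (t,), (jnp.ones_like(t),))[1]

print(seponet(u0, 0.3, 0.5), d_dt(0.3, 0.5))
```

In a physics-informed training loop, residuals of the governing PDE would be built from such forward-mode derivatives along each axis separately, which is the efficiency gain the abstract attributes to the separable construction.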