We prove that the border rank of the Kronecker square of the little Coppersmith–Winograd tensor T_{cw,q} is the square of its border rank for q > 2 and that the border rank of its Kronecker cube is the cube of its border rank for q > 4. This answers questions raised implicitly by Coppersmith & Winograd (1990, §11) and explicitly by Bläser (2013, Problem 9.8) and rules out the possibility of proving new upper bounds on the exponent of matrix multiplication using the square or cube of a little Coppersmith–Winograd tensor in this range. In the positive direction, we enlarge the list of explicit tensors potentially useful for Strassen's laser method, introducing a skew-symmetric version of the Coppersmith–Winograd tensor, T_{skewcw,q}. For q = 2, the Kronecker square of this tensor coincides with the 3 × 3 determinant polynomial, det_3 ∈ C^9 ⊗ C^9 ⊗ C^9, regarded as a tensor. We show that this tensor could potentially be used to show that the exponent of matrix multiplication is two. We determine new upper bounds for the (Waring) rank and the (Waring) border rank of det_3, exhibiting a strict submultiplicative behaviour for T_{skewcw,2}, which is promising for the laser method. We establish general results regarding border ranks of Kronecker powers of tensors, and make a detailed study of Kronecker squares of tensors in C^3 ⊗ C^3 ⊗ C^3.
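In the standard notation, which the abstract does not spell out, the results read as follows; the definition of T_{cw,q} and the value q + 2 for its border rank are the classical ones from Coppersmith–Winograd, restated here as a gloss:

    % little Coppersmith--Winograd tensor and its (classical) border rank:
    \[
      T_{cw,q} = \sum_{i=1}^{q} \left( a_0 \otimes b_i \otimes c_i
        + a_i \otimes b_0 \otimes c_i + a_i \otimes b_i \otimes c_0 \right),
      \qquad \underline{R}(T_{cw,q}) = q + 2 .
    \]
    % the negative results above, with \boxtimes the Kronecker product of tensors:
    \[
      \underline{R}\!\left(T_{cw,q}^{\boxtimes 2}\right) = (q+2)^2 \ \text{for } q > 2,
      \qquad
      \underline{R}\!\left(T_{cw,q}^{\boxtimes 3}\right) = (q+2)^3 \ \text{for } q > 4 .
    \]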
Kronecker Product of Tensors and Hypergraphs: Structure and Dynamics
Hypergraphs and tensors extend classic graph and matrix theories to account for multiway relationships, which are ubiquitous in engineering, biological, and social systems. While the Kronecker product is a potent tool for analyzing the coupling of systems in a graph or matrix context, its utility in studying multiway interactions, such as those represented by tensors and hypergraphs, remains elusive. In this article, we present a comprehensive exploration of algebraic, structural, and spectral properties of the tensor Kronecker product. We express Tucker and tensor train decompositions and various tensor eigenvalues in terms of the tensor Kronecker product. Additionally, we utilize the tensor Kronecker product to form Kronecker hypergraphs, which are tensor-based hypergraph products, and investigate the structure and stability of polynomial dynamics on Kronecker hypergraphs. Finally, we provide numerical examples to demonstrate the utility of the tensor Kronecker product in computing Z-eigenvalues, performing various tensor decompositions, and determining the stability of polynomial systems.
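As a concrete illustration of the product (a minimal numpy sketch, not code from the article), note that numpy's kron already implements the tensor Kronecker product mode by mode, and the basic identity behind the spectral results can be checked numerically:

    import numpy as np

    def apply_tensor(T, x):
        # Contract an order-3 tensor against a vector in modes 2 and 3:
        # (T x^2)_i = sum_{j,k} T[i, j, k] * x[j] * x[k]
        return np.einsum('ijk,j,k->i', T, x, x)

    rng = np.random.default_rng(0)
    A = rng.standard_normal((3, 3, 3))
    B = rng.standard_normal((2, 2, 2))
    x = rng.standard_normal(3)
    y = rng.standard_normal(2)

    # np.kron on 3-way arrays is exactly the tensor Kronecker product:
    AB = np.kron(A, B)                      # shape (6, 6, 6)

    lhs = apply_tensor(AB, np.kron(x, y))
    rhs = np.kron(apply_tensor(A, x), apply_tensor(B, y))
    assert np.allclose(lhs, rhs)            # (A kron B)(x kron y)^2 = (A x^2) kron (B y^2)

In particular, if (λ, x) and (μ, y) are Z-eigenpairs of A and B, this identity together with ‖x ⊗ y‖ = ‖x‖‖y‖ makes (λμ, x ⊗ y) a Z-eigenpair of A ⊗ B, the prototype of the spectral statements developed in the article.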
- Award ID(s):
- 2103026
- PAR ID:
- 10591717
- Publisher / Repository:
- SIAM
- Date Published:
- Journal Name:
- SIAM Journal on Matrix Analysis and Applications
- Volume:
- 45
- Issue:
- 3
- ISSN:
- 0895-4798
- Page Range / eLocation ID:
- 1621-1642
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
- We answer a question, posed implicitly in [P. Bürgisser et al., 1997] and explicitly in [M. Bläser, 2013], showing the border rank of the Kronecker square of the little Coppersmith-Winograd tensor T_{cw,q} is the square of the border rank of the tensor for all q > 2, a negative result for complexity theory. We further show that when q > 4, the analogous result holds for the Kronecker cube. In the positive direction, we enlarge the list of explicit tensors potentially useful for the laser method. We observe that a well-known tensor, the 3 × 3 determinant polynomial regarded as a tensor, det_3 ∈ C^9 ⊗ C^9 ⊗ C^9, could potentially be used in the laser method to prove the exponent of matrix multiplication is two. Because of this, we prove new upper bounds on its Waring rank and rank (both 18), border rank and Waring border rank (both 17), which, in addition to being promising for the laser method, are of interest in their own right. We discuss "skew" cousins of the little Coppersmith-Winograd tensor and indicate why they may be useful for the laser method. We establish general results regarding border ranks of Kronecker powers of tensors, and make a detailed study of Kronecker squares of tensors in C^3 ⊗ C^3 ⊗ C^3.
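The identification of det_3 with a tensor is concrete enough to check by machine. Writing det_3 = Σ_σ sgn(σ) e_{1,σ(1)} ⊗ e_{2,σ(2)} ⊗ e_{3,σ(3)} ∈ C^9 ⊗ C^9 ⊗ C^9, contracting all three slots with vec(X) recovers det(X); a minimal numpy sketch (illustrative only, not the authors' code):

    import numpy as np
    from itertools import permutations

    def sign(p):
        # Sign of a permutation tuple, via its inversion count.
        inv = sum(p[i] > p[j] for i in range(len(p)) for j in range(i + 1, len(p)))
        return -1 if inv % 2 else 1

    # det_3 as a tensor in C^9 (x) C^9 (x) C^9: index the basis e_{ij} of C^9 by 3*i + j.
    T = np.zeros((9, 9, 9))
    for p in permutations(range(3)):
        T[3 * 0 + p[0], 3 * 1 + p[1], 3 * 2 + p[2]] = sign(p)

    X = np.random.default_rng(1).standard_normal((3, 3))
    x = X.reshape(9)   # vec(X) in row-major order, so x[3*i + j] = X[i, j]
    # Contracting all three slots with vec(X) evaluates the determinant polynomial:
    assert np.isclose(np.einsum('abc,a,b,c->', T, x, x, x), np.linalg.det(X))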
- This article provides an overview of tensors, their properties, and their applications in statistics. Tensors, also known as multidimensional arrays, are generalizations of matrices to higher orders and are useful data representation architectures. We first review basic tensor concepts and decompositions, and then we elaborate traditional and recent applications of tensors in the fields of recommender systems and imaging analysis. We also illustrate tensors for network data and explore the relations among interacting units in a complex network system. Some canonical tensor computational algorithms and available software libraries are provided for various tensor decompositions. Future research directions, including tensors in deep learning, are also discussed.
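As a pointer to what such decompositions look like in code, here is a minimal numpy sketch of the higher-order SVD, one standard route to the Tucker decomposition (illustrative only, not tied to any library discussed in the article):

    import numpy as np

    def hosvd(T):
        # Mode-k factor: left singular vectors of the mode-k unfolding of T.
        U = [np.linalg.svd(np.moveaxis(T, k, 0).reshape(T.shape[k], -1),
                           full_matrices=False)[0] for k in range(3)]
        # Core tensor: T multiplied by the transposed factors in every mode.
        G = np.einsum('ijk,ia,jb,kc->abc', T, U[0], U[1], U[2])
        return G, U

    T = np.random.default_rng(2).standard_normal((4, 5, 6))
    G, U = hosvd(T)
    # Reconstruction check: multiplying the core back by the factors recovers T.
    R = np.einsum('abc,ia,jb,kc->ijk', G, U[0], U[1], U[2])
    assert np.allclose(R, T)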
- With the advent of machine learning and its overarching pervasiveness it is imperative to devise ways to represent large datasets efficiently while distilling intrinsic features necessary for subsequent analysis. The primary workhorse used in data dimensionality reduction and feature extraction has been the matrix singular value decomposition (SVD), which presupposes that data have been arranged in matrix format. A primary goal in this study is to show that high-dimensional datasets are more compressible when treated as tensors (i.e., multiway arrays) and compressed via tensor-SVDs under the tensor-tensor product constructs and its generalizations. We begin by proving Eckart–Young optimality results for families of tensor-SVDs under two different truncation strategies. Since such optimality properties can be proven in both matrix and tensor-based algebras, a fundamental question arises: Does the tensor construct subsume the matrix construct in terms of representation efficiency? The answer is positive, as proven by showing that a tensor-tensor representation of an equal dimensional spanning space can be superior to its matrix counterpart. We then use these optimality results to investigate how the compressed representation provided by the truncated tensor SVD is related both theoretically and empirically to its two closest tensor-based analogs, the truncated high-order SVD and the truncated tensor-train SVD.
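For orientation, the tensor-tensor product underlying the tensor-SVD admits a very short implementation; a minimal numpy sketch of the FFT-based t-product of Kilmer and Martin (illustrative, not the authors' code):

    import numpy as np

    def t_product(A, B):
        # t-product of A (n1 x n2 x n3) and B (n2 x l x n3): FFT along the
        # tube dimension, facewise matrix multiply, then inverse FFT.
        Ah = np.fft.fft(A, axis=2)
        Bh = np.fft.fft(B, axis=2)
        Ch = np.einsum('ijt,jkt->ikt', Ah, Bh)   # multiply each frontal face
        return np.real(np.fft.ifft(Ch, axis=2))

    rng = np.random.default_rng(3)
    A = rng.standard_normal((4, 3, 5))
    B = rng.standard_normal((3, 2, 5))
    C = t_product(A, B)                           # shape (4, 2, 5)

The t-SVD is computed the same way: transform along the tube dimension, take a matrix SVD of each frontal face, and transform back; truncating those facewise SVDs is the step addressed by the Eckart–Young results above.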
- The matricized-tensor times Khatri-Rao product (MTTKRP) is the computational bottleneck for algorithms computing CP decompositions of tensors. In this work, we develop shared-memory parallel algorithms for MTTKRP involving dense tensors. The algorithms cast nearly all of the computation as matrix operations in order to use optimized BLAS subroutines, and they avoid reordering tensor entries in memory. We use our parallel implementation to compute a CP decomposition of a neuroimaging data set and achieve a speedup of up to 7.4X over existing parallel software.
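To fix ideas, the mode-1 MTTKRP becomes a single large matrix multiply once the tensor is unfolded, which is the sense in which the computation can be cast onto BLAS; a minimal numpy sketch (illustrative, not the authors' parallel implementation):

    import numpy as np

    def khatri_rao(A, B):
        # Column-wise Kronecker product: column r is kron(A[:, r], B[:, r]).
        return (A[:, None, :] * B[None, :, :]).reshape(-1, A.shape[1])

    rng = np.random.default_rng(4)
    I, J, K, R = 4, 5, 6, 3
    X = rng.standard_normal((I, J, K))
    B = rng.standard_normal((J, R))
    C = rng.standard_normal((K, R))

    # Definition of the mode-1 MTTKRP: M[i, r] = sum_{j,k} X[i, j, k] * B[j, r] * C[k, r]
    M_ref = np.einsum('ijk,jr,kr->ir', X, B, C)

    # Matricized form: unfold X to I x (J*K) and do one BLAS-friendly matrix multiply.
    M = X.reshape(I, J * K) @ khatri_rao(B, C)
    assert np.allclose(M, M_ref)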