The application of graph Laplacian eigenvectors has been quite popular in the graph signal processing field: one can use them as ingredients to design smooth multiscale bases. Our long-term goal is to study and understand the dual geometry of graph Laplacian eigenvectors. In order to do that, it is necessary to define a certain metric to measure the behavioral differences between each pair of the eigenvectors. Saito (2018) considered the ramified optimal transportation (ROT) cost between the squares of the eigenvectors as such a metric. Cloninger and Steinerberger (2018) proposed a way to measure the affinity (or 'similarity') between the eigenvectors based on their Hadamard (HAD) product. In this article, we propose a simplified ROT metric that is more computationally efficient and introduce two more ways to define the distance between the eigenvectors, i.e., the time-stepping diffusion (TSD) metric and the difference of absolute gradient (DAG) pseudometric. The TSD metric measures the cost of "flattening" the initial graph signal via a diffusion process up to a certain time, hence it can be viewed as a time-dependent version of the ROT metric. The DAG pseudometric is the ℓ2-distance between the feature vectors derived from the eigenvectors, in particular, the absolute gradients of the eigenvectors. We then compare the performance of ROT, HAD, and the two new metrics on different kinds of graphs. Finally, we investigate their relationship as well as their pros and cons.
Keywords: Graph Laplacian eigenvectors, metrics between orthonormal vectors, dual geometry of graph Laplacian eigenvectors, multiscale basis dictionaries on graphs, heat diffusion on graphs, Wasserstein distance, optimal transport
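As a rough illustration of one of these constructions, the sketch below (Python with NumPy and NetworkX; the reading of the abstract is ours and every function name is hypothetical, not the authors' code) computes Laplacian eigenvectors of a small graph and a DAG-style pseudometric: each eigenvector's feature vector is taken to be its absolute gradient over the edges, and the distance between two eigenvectors is the ℓ2-distance between those feature vectors.

```python
# Minimal sketch of a DAG-style pseudometric, under our reading of the abstract.
import numpy as np
import networkx as nx

def laplacian_eigenvectors(G):
    """Eigenvectors of the combinatorial graph Laplacian, sorted by eigenvalue."""
    L = nx.laplacian_matrix(G).toarray().astype(float)
    evals, evecs = np.linalg.eigh(L)
    return evals, evecs  # columns of evecs are the eigenvectors

def abs_gradient(G, phi):
    """Absolute gradient |phi(i) - phi(j)| of a graph signal phi over the edges of G.
    Taking absolute values makes the feature insensitive to the eigenvector's sign."""
    idx = {v: k for k, v in enumerate(G.nodes())}
    return np.array([abs(phi[idx[u]] - phi[idx[v]]) for u, v in G.edges()])

def dag_distance(G, phi_k, phi_l):
    """l2-distance between the absolute-gradient features of two eigenvectors."""
    return np.linalg.norm(abs_gradient(G, phi_k) - abs_gradient(G, phi_l))

# Example: pairwise distances among the first few eigenvectors of a path graph.
G = nx.path_graph(8)
_, evecs = laplacian_eigenvectors(G)
D = np.array([[dag_distance(G, evecs[:, k], evecs[:, l]) for l in range(4)] for k in range(4)])
print(np.round(D, 3))
```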
Hadamard Diagonalizable Graphs of Order at Most 36
If the Laplacian matrix of a graph has a full set of orthogonal eigenvectors with entries $\pm 1$, then the matrix formed by taking the columns as the eigenvectors is a Hadamard matrix and the graph is said to be Hadamard diagonalizable. In this article, we prove that if $n = 8k+4$, then the only possible Hadamard diagonalizable graphs are $K_n$, $K_{n/2,n/2}$, $2K_{n/2}$, and $nK_1$, and we develop a computational method for determining all graphs diagonalized by a given Hadamard matrix of any order. Using these two tools, we determine and present all Hadamard diagonalizable graphs up to order 36. Note that it is not even known how many Hadamard matrices there are of order 36.
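As a rough illustration of the diagonalizability condition itself (not the computational method developed in the article), the hypothetical sketch below checks whether a given Hadamard matrix $H$ diagonalizes a graph's Laplacian $L$ by testing whether $H^T L H$ is diagonal, which is equivalent since $H^T H = nI$.

```python
# Sketch only: test whether a Hadamard matrix diagonalizes a graph Laplacian.
import numpy as np
import networkx as nx

def is_diagonalized_by(G, H, tol=1e-9):
    """True if the columns of the Hadamard matrix H are Laplacian eigenvectors of G."""
    n = H.shape[0]
    assert np.array_equal(np.abs(H), np.ones((n, n)))   # +/-1 entries
    assert np.allclose(H.T @ H, n * np.eye(n))          # Hadamard condition
    L = nx.laplacian_matrix(G).toarray().astype(float)
    M = H.T @ L @ H
    return np.allclose(M - np.diag(np.diag(M)), 0.0, atol=tol)

def sylvester_hadamard(k):
    """Sylvester construction of a Hadamard matrix of order 2^k."""
    H = np.array([[1.0]])
    for _ in range(k):
        H = np.block([[H, H], [H, -H]])
    return H

H4 = sylvester_hadamard(2)
print(is_diagonalized_by(nx.complete_graph(4), H4))  # True: K_n is Hadamard diagonalizable
print(is_diagonalized_by(nx.path_graph(4), H4))      # False for the path graph
```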
- PAR ID: 10382202
- Date Published:
- Journal Name: The Electronic Journal of Combinatorics
- Volume: 29
- Issue: 2
- ISSN: 1077-8926
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Given any graph $G$, the spread of $G$ is the maximum difference between any two eigenvalues of the adjacency matrix of $G$. In this paper, we resolve a pair of 20-year-old conjectures of Gregory, Hershkowitz, and Kirkland regarding the spread of graphs. The first states that for all positive integers $n$, the $n$-vertex graph $G$ that maximizes spread is the join of a clique and an independent set, with $\lfloor 2n/3 \rfloor$ and $\lceil n/3 \rceil$ vertices, respectively. Using techniques from the theory of graph limits and numerical analysis, we prove this claim for all $n$ sufficiently large. As an intermediate step, we prove an analogous result for a family of operators in the Hilbert space $\mathscr{L}^2[0,1]$. The second conjecture claims that for any fixed $m \leq n^2/4$, if $G$ maximizes spread over all $n$-vertex graphs with $m$ edges, then $G$ is bipartite. We prove an asymptotic version of this conjecture. Furthermore, we construct an infinite family of counterexamples, which shows that our asymptotic solution is tight up to lower-order error terms.
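A minimal sketch of the quantities involved (the function names and the small example are ours, not the authors' code): it computes the spread from the adjacency spectrum and builds the conjectured extremal graph, the join of a clique on $\lfloor 2n/3 \rfloor$ vertices with an independent set on the remaining $\lceil n/3 \rceil$ vertices.

```python
# Illustration only: spread of a graph and the conjectured spread-maximizing graph.
import numpy as np
import networkx as nx

def spread(G):
    """Largest minus smallest eigenvalue of the adjacency matrix of G."""
    A = nx.adjacency_matrix(G).toarray().astype(float)
    evals = np.linalg.eigvalsh(A)
    return evals[-1] - evals[0]

def clique_join_independent(n):
    """Join of a clique on floor(2n/3) vertices and an independent set on the rest."""
    k = (2 * n) // 3
    G = nx.complete_graph(k)
    G.add_nodes_from(range(k, n))
    G.add_edges_from((u, v) for u in range(k) for v in range(k, n))  # join edges
    return G

# Compare the conjectured extremal graph with the complete graph for a few sizes.
for n in (6, 9, 12):
    print(n, round(spread(clique_join_independent(n)), 4), round(spread(nx.complete_graph(n)), 4))
```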
We introduce SignNet and BasisNet---new neural architectures that are invariant to two key symmetries displayed by eigenvectors: (i) sign flips, since if v is an eigenvector then so is -v; and (ii) more general basis symmetries, which occur in higher dimensional eigenspaces with infinitely many choices of basis eigenvectors. We prove that under certain conditions our networks are universal, i.e., they can approximate any continuous function of eigenvectors with the desired invariances. When used with Laplacian eigenvectors, our networks are provably more expressive than existing spectral methods on graphs; for instance, they subsume all spectral graph convolutions, certain spectral graph invariants, and previously proposed graph positional encodings as special cases. Experiments show that our networks significantly outperform existing baselines on molecular graph regression, learning expressive graph representations, and learning neural fields on triangle meshes. Our code is available at https://github.com/cptq/SignNet-BasisNet.
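The sign-invariance mechanism can be illustrated with a toy numerical sketch (ours, not the released SignNet/BasisNet code linked above): feeding both v and -v through the same network and summing the outputs yields a function that is unchanged under sign flips of the eigenvector.

```python
# Toy sketch of sign invariance: phi(v) + phi(-v) does not change when v -> -v.
import numpy as np

rng = np.random.default_rng(0)
W1, W2 = rng.standard_normal((16, 8)), rng.standard_normal((4, 16))

def phi(v):
    """A small fixed MLP applied to an eigenvector of length 8 (placeholder network)."""
    return W2 @ np.tanh(W1 @ v)

def sign_invariant(v):
    """Symmetrizing over the sign flip makes the output invariant to it."""
    return phi(v) + phi(-v)

v = rng.standard_normal(8)
print(np.allclose(sign_invariant(v), sign_invariant(-v)))  # True
```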
The spectral decomposition of graph adjacency matrices is an essential ingredient in the design of graph signal processing (GSP) techniques. When the adjacency matrix has multi-dimensional eigenspaces, it is desirable to base GSP constructions on a particular eigenbasis that better reflects the graph's symmetries. In this paper, we provide an explicit and detailed representation-theoretic account for the spectral decomposition of the adjacency matrix of a weighted Cayley graph. Our method applies to all weighted Cayley graphs, regardless of whether they are quasi-Abelian, and offers detailed descriptions of eigenvalues and eigenvectors derived from the coefficient functions of the representations of the underlying group. Next, we turn our attention to constructing frames on Cayley graphs. Frames are overcomplete spanning sets that ensure stable and potentially redundant systems for signal reconstruction. We use our proposed eigenbases to build frames that are suitable for developing signal processing on Cayley graphs. These are the Frobenius–Schur frames and Cayley frames, for which we provide a characterization and a practical recipe for their construction.
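For the simplest (cyclic, hence Abelian) special case, the following sketch (our illustration, not the paper's general representation-theoretic construction) verifies that the DFT characters diagonalize the adjacency matrix of a weighted Cayley graph on Z_n, whatever the chosen symmetric connection set and weights.

```python
# Abelian special case: characters of Z_n diagonalize any circulant Cayley adjacency matrix.
import numpy as np

n = 8
connection = {1: 1.0, n - 1: 1.0, 3: 0.5, n - 3: 0.5}   # symmetric weighted connection set
A = np.zeros((n, n))
for i in range(n):
    for s, w in connection.items():
        A[i, (i + s) % n] = w                            # circulant structure

# Normalized DFT matrix: column k is the character j -> exp(2*pi*i*j*k/n)/sqrt(n).
F = np.array([[np.exp(2j * np.pi * j * k / n) for k in range(n)] for j in range(n)]) / np.sqrt(n)
D = F.conj().T @ A @ F
print(np.allclose(D, np.diag(np.diag(D))))               # True: the DFT basis diagonalizes A
```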
Given a directed acyclic graph (DAG) G=(V,E), we say that G is (e,d)-depth-robust (resp. (e,d)-edge-depth-robust) if for any set S⊆V (resp. S⊆E) of at most |S|≤e nodes (resp. edges) the graph G−S contains a directed path of length d. While edge-depth-robust graphs are potentially easier to construct, many applications in cryptography require node depth-robust graphs with small indegree. We create a graph reduction that transforms an (e,d)-edge-depth-robust graph with m edges into a (e/2,d)-depth-robust graph with O(m) nodes and constant indegree. One immediate consequence of this result is the first construction of a provably $\left(\frac{n\log\log n}{\log n}, \frac{n}{\log n(\log n)^{\log\log n}}\right)$-depth-robust graph with constant indegree. Our reduction crucially relies on ST-robust graphs, a new graph property we introduce which may be of independent interest. We say that a directed, acyclic graph with n inputs and n outputs is (k1,k2)-ST-robust if we can remove any k1 nodes and there exists a subgraph containing at least k2 inputs and k2 outputs such that each of the k2 inputs is connected to all of the k2 outputs. If the graph is (k1,n−k1)-ST-robust for all k1≤n, we say that the graph is maximally ST-robust. We show how to construct maximally ST-robust graphs with constant indegree and O(n) nodes. Given a family M of ST-robust graphs and an arbitrary (e,d)-edge-depth-robust graph G we construct a new constant-indegree graph Reduce(G,M) by replacing each node in G with an ST-robust graph from M. We also show that ST-robust graphs can be used to construct (tight) proofs-of-space and (asymptotically) improved wide-block labeling functions.
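The depth-robustness definition can be checked directly, although only for tiny DAGs; the brute-force sketch below (ours, with "length" counted in edges as an assumed convention) enumerates all node deletions of size at most e and tests whether a directed path of length d survives.

```python
# Brute-force check of (e, d)-node-depth-robustness, feasible only for very small DAGs.
from itertools import combinations
import networkx as nx

def longest_path_length(G):
    """Number of edges on a longest directed path in a DAG (0 for an empty graph)."""
    return nx.dag_longest_path_length(G) if G.number_of_nodes() > 0 else 0

def is_depth_robust(G, e, d):
    """True if every deletion of at most e nodes leaves a directed path of length d."""
    nodes = list(G.nodes())
    for r in range(e + 1):
        for S in combinations(nodes, r):
            H = G.copy()
            H.remove_nodes_from(S)
            if longest_path_length(H) < d:
                return False
    return True

# Example: the directed path on 6 nodes loses all long paths once a middle node is removed.
P = nx.path_graph(6, create_using=nx.DiGraph)
print(is_depth_robust(P, 0, 5), is_depth_robust(P, 1, 3))   # True False
```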