A sign pattern is an array with entries in $\{+,-,0\}$. A real matrix $Q$ is row orthogonal if $QQ^T = I$. The Strong Inner Product Property (SIPP), introduced in [B.A. Curtis and B.L. Shader, Sign patterns of orthogonal matrices and the strong inner product property, Linear Algebra Appl. 592:228-259, 2020], is an important tool for determining whether a sign pattern allows row orthogonality: it guarantees the existence of a nearby row orthogonal matrix in which zero entries can be perturbed to nonzero entries while the sign of every nonzero entry is preserved. This paper uses the SIPP to initiate the study of conditions under which random sign patterns allow row orthogonality with high probability. Building on prior work, the $5\times n$ nowhere zero sign patterns that minimally allow orthogonality are determined. Conditions on the zero entries of a sign pattern are established that guarantee every row orthogonal matrix with that sign pattern has the SIPP.
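A small illustrative example (not taken from the paper): the nowhere zero sign pattern
$$\begin{pmatrix} + & + \\ + & - \end{pmatrix}$$
allows row orthogonality, since it is the sign pattern of the row orthogonal matrix
$$Q = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}, \qquad QQ^T = I_2.$$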
The non-symmetric strong multiplicity property for sign patterns
We develop a non-symmetric strong multiplicity property for matrices that may or may not be symmetric. We say a sign pattern allows the non-symmetric strong multiplicity property if there is a matrix with the non-symmetric strong multiplicity property that has the given sign pattern. We show that this property of a matrix pattern preserves multiplicities of eigenvalues for superpatterns of the pattern. We also provide a bifurcation lemma, showing that a matrix pattern with the property also allows refinements of the multiplicity list of eigenvalues. We conclude by demonstrating how this property can help with the inverse eigenvalue problem of determining the number of distinct eigenvalues allowed by a sign pattern.
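As a generic illustration of two notions used above (not examples from the paper): a superpattern of a sign pattern is obtained by replacing some of its zero entries with $+$ or $-$, e.g.
$$\begin{pmatrix} + & + \\ - & + \end{pmatrix} \quad\text{is a superpattern of}\quad \begin{pmatrix} + & 0 \\ - & + \end{pmatrix},$$
and a refinement of a multiplicity list splits a repeated eigenvalue into eigenvalues of smaller multiplicity, e.g. passing from the list $(2,1)$ to $(1,1,1)$.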
- Award ID(s): 1839918
- PAR ID: 10582824
- Publisher / Repository: International Linear Algebra Society
- Date Published:
- Journal Name: The Electronic Journal of Linear Algebra
- Volume: 41
- ISSN: 1081-3810
- Page Range / eLocation ID: 153 to 165
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Williamson's theorem states that for any 2n×2n real positive definite matrix A, there exists a 2n×2n real symplectic matrix S such that SᵀAS = D ⊕ D, where D is an n×n diagonal matrix whose positive diagonal entries are known as the symplectic eigenvalues of A. Let H be any 2n×2n real symmetric matrix such that the perturbed matrix A + H is also positive definite. In this paper, we show that any symplectic matrix S̃ diagonalizing A + H in Williamson's theorem is of the form S̃ = SQ + O(‖H‖), where Q is a 2n×2n real symplectic as well as orthogonal matrix. Moreover, Q is in symplectic block diagonal form with the block sizes given by twice the multiplicities of the symplectic eigenvalues of A. Consequently, we show that S̃ and S can be chosen so that ‖S̃ − S‖ = O(‖H‖). Our results hold even if A has repeated symplectic eigenvalues. This generalizes the stability result of symplectic matrices for non-repeated symplectic eigenvalues given by Idel, Gaona, and Wolf [Linear Algebra Appl., 525:45-58, 2017]. A small numerical sketch of computing symplectic eigenvalues appears after this list.
- A linear principal minor polynomial or lpm polynomial is a linear combination of principal minors of a symmetric matrix. By restricting to the diagonal, lpm polynomials are in bijection with multiaffine polynomials. We show that this establishes a one-to-one correspondence between homogeneous multiaffine stable polynomials and PSD-stable lpm polynomials. This yields new construction techniques for hyperbolic polynomials and allows us to find an explicit degree 3 hyperbolic polynomial in six variables some of whose Rayleigh differences are not sums of squares. We further generalize the well-known Fisher–Hadamard and Koteljanskii inequalities from determinants to PSD-stable lpm polynomials. We investigate the relationship between the associated hyperbolicity cones and conjecture a relationship between the eigenvalues of a symmetric matrix and the values of certain lpm polynomials evaluated at that matrix. We refer to this relationship as spectral containment. A tiny worked example of an lpm polynomial appears after this list.
- This work focuses on topology optimization formulations with linear buckling constraints wherein eigenvalues of arbitrary multiplicities can be canonically considered. The non-differentiability of multiple eigenvalues is addressed by a mean value function, which is a symmetric polynomial of the repeated eigenvalues in each cluster. This construction offers accurate control over each cluster of eigenvalues, in contrast to aggregation functions such as the p-norm and Kreisselmeier–Steinhauser (K–S) function, where only an approximate maximum/minimum value is available. It also avoids the two-loop optimization procedure required by the use of directional derivatives (Seyranian et al., Struct. Optim. 1994;8(4):207-227). The spurious buckling modes issue is handled by two approaches: one with different interpolations on the initial stiffness and geometric stiffness, and another with a pseudo-mass matrix. Using the pseudo-mass matrix, two new optimization formulations are proposed for incorporating buckling constraints, together with the standard approach employing initial stiffness and geometric stiffness as two ingredients within generalized eigenvalue frameworks. Numerical results show that all three formulations can help to improve the stability of the optimized design. In addition, post-nonlinear stability analysis of the optimized designs reveals that a higher linear buckling threshold might not lead to a higher nonlinear critical load, especially in cases where the pre-critical response is nonlinear.
- We introduce Non-Euclidean-MDS (Neuc-MDS), an extension of classical Multidimensional Scaling (MDS) that accommodates non-Euclidean and non-metric inputs. The main idea is to generalize the standard inner product to symmetric bilinear forms in order to utilize the negative eigenvalues of dissimilarity Gram matrices. Neuc-MDS efficiently optimizes the choice of (both positive and negative) eigenvalues of the dissimilarity Gram matrix to reduce STRESS, the sum of squared pairwise errors. We provide an in-depth error analysis and proofs of optimality in minimizing lower bounds of STRESS. We demonstrate Neuc-MDS's ability to address limitations of classical MDS raised by prior research, and test it on various synthetic and real-world datasets in comparison with both linear and non-linear dimension reduction methods. A generic code sketch of this eigenvalue-selection idea appears after this list.
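For the Williamson's theorem abstract above, the following is a small numerical sketch (assuming NumPy; not code from the cited paper). It relies on the standard characterization that the symplectic eigenvalues of a positive definite matrix A are the positive eigenvalues of iJA, where J is the standard symplectic form, and it checks that a small symmetric perturbation of A moves them only slightly.

```python
# Sketch (not from the paper): symplectic eigenvalues of a positive definite
# 2n x 2n matrix A, computed as the n positive eigenvalues of i*J*A, where
# J = [[0, I], [-I, 0]] is the standard symplectic form.
import numpy as np

def symplectic_eigenvalues(A: np.ndarray) -> np.ndarray:
    n = A.shape[0] // 2
    J = np.block([[np.zeros((n, n)), np.eye(n)],
                  [-np.eye(n), np.zeros((n, n))]])
    eigs = np.linalg.eigvals(1j * J @ A)  # real spectrum, in +/- pairs
    return np.sort(eigs.real)[n:]         # keep the n positive values

rng = np.random.default_rng(0)
n = 3
M = rng.standard_normal((2 * n, 2 * n))
A = M @ M.T + 2 * n * np.eye(2 * n)       # positive definite test matrix
H = 1e-6 * rng.standard_normal((2 * n, 2 * n))
H = (H + H.T) / 2                         # small symmetric perturbation

print(symplectic_eigenvalues(A))
print(symplectic_eigenvalues(A + H))      # stays close to the line above
```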
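For the lpm polynomial abstract above, a tiny worked example (illustrative only, not from the paper): for a symmetric $2\times 2$ matrix, take the lpm polynomial formed by the determinant plus the two $1\times 1$ principal minors,
$$f(A) = \det A + a_{11} + a_{22} = a_{11}a_{22} - a_{12}^2 + a_{11} + a_{22}.$$
Restricting to the diagonal, $A = \operatorname{diag}(x_1, x_2)$, gives the multiaffine polynomial $x_1 x_2 + x_1 + x_2$, illustrating the correspondence between lpm polynomials and multiaffine polynomials described above.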
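For the Neuc-MDS abstract above, here is a generic sketch of the underlying idea of using both positive and negative eigenvalues of a dissimilarity Gram matrix (assuming NumPy; the selection rule below, largest absolute eigenvalue, is a placeholder and is not the paper's optimized choice).

```python
# Generic sketch (not the paper's algorithm): double-center squared
# dissimilarities, eigendecompose the resulting (possibly indefinite) Gram
# matrix, and embed using k chosen eigenvalues, which may be negative.
import numpy as np

def indefinite_mds_sketch(D: np.ndarray, k: int):
    m = D.shape[0]
    C = np.eye(m) - np.ones((m, m)) / m        # centering matrix
    G = -0.5 * C @ (D ** 2) @ C                # Gram matrix, may be indefinite
    w, V = np.linalg.eigh(G)
    idx = np.argsort(-np.abs(w))[:k]           # placeholder selection rule
    X = V[:, idx] * np.sqrt(np.abs(w[idx]))    # embedding coordinates
    return X, np.sign(w[idx])                  # signs give the bilinear form

rng = np.random.default_rng(1)
P = rng.standard_normal((8, 3))
D = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=-1)
D += 0.2 * np.abs(rng.standard_normal(D.shape))  # break Euclidean structure
D = (D + D.T) / 2
np.fill_diagonal(D, 0.0)

X, signs = indefinite_mds_sketch(D, 2)
print(X.shape, signs)
```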