QuickSylver

Chris Liu

QE2 Talk 

Spring 2024

Fast solutions to Simultaneous Sylvester Systems

Outline

  • Simultaneous Sylvester System
  • Tensors and their contractions
  • QuickSylver
  • Future

Simultaneous Sylvester System

A \in \mathbb{R}^{s \times b \times c}\\ B \in \mathbb{R}^{a \times t \times c}\\ C \in \mathbb{R}^{a \times b \times c}
X \in \mathbb{R}^{a \times s}\\ Y \in \mathbb{R}^{t \times b}

Given

Find

Such that

(\forall i) \; XA_i + B_iY = C_i

Applications

Module endomorphisms

$$\operatorname{End}({}_AM) = \{X \mid (\forall i)XA_i = A_iX\}$$

 

Adjoint Algebra of a bilinear map

$$\operatorname{Adj}(*) = \{(X,Y) \mid (\forall u,v)Xu*v = u*Yv\}$$

Simultaneous Roth’s removal rule

(\exists X)(\forall i) \qquad \begin{bmatrix} I & X \\ 0 & I \end{bmatrix}^{-1} \begin{bmatrix} A_i & C_i \\ 0 & B_i \end{bmatrix} \begin{bmatrix} I & X \\ 0 & I \end{bmatrix} = \begin{bmatrix} A_i & 0\\ 0 & B_i \end{bmatrix}
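A numerical sanity check of the removal rule in its classical one-equation form (a sketch: the simultaneous version conjugates every \( \begin{bmatrix} A_i & C_i \\ 0 & B_i \end{bmatrix} \) by the same matrix; the sign convention \( C = XB - AX \) is an assumption):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
X = rng.standard_normal((n, n))
C = X @ B - A @ X    # assumed sign convention for Roth's condition

U = np.block([[np.eye(n), X],
              [np.zeros((n, n)), np.eye(n)]])   # inverse is [[I, -X], [0, I]]
T = np.block([[A, C],
              [np.zeros((n, n)), B]])
D = np.linalg.inv(U) @ T @ U

offdiag = np.linalg.norm(D[:n, n:])   # the C block is removed
```

Conjugating by the unipotent matrix leaves the diagonal blocks \( A, B \) intact and zeroes the top-right block.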
O(n^6) \text{ to solve as a flattened linear system}
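A minimal numpy sketch of that flattened baseline solve (sizes and names are illustrative; this is the naive approach, not QuickSylver). Column-stacking vectorization gives \( \operatorname{vec}(XA_i) = (A_i^\top \otimes I_a)\operatorname{vec}(X) \) and \( \operatorname{vec}(B_iY) = (I_b \otimes B_i)\operatorname{vec}(Y) \):

```python
import numpy as np

rng = np.random.default_rng(0)
a, s, b, t, c = 3, 2, 4, 2, 2   # small illustrative sizes

# Ground-truth X, Y so the system is guaranteed consistent.
X_true = rng.standard_normal((a, s))
Y_true = rng.standard_normal((t, b))
A = [rng.standard_normal((s, b)) for _ in range(c)]
B = [rng.standard_normal((a, t)) for _ in range(c)]
C = [X_true @ A[i] + B[i] @ Y_true for i in range(c)]

def vec(M):
    # Column-stacking vectorization, so vec(X A) = (A^T kron I) vec(X).
    return M.flatten(order='F')

# One block row per equation i: [A_i^T kron I_a | I_b kron B_i].
rows = [np.hstack([np.kron(A[i].T, np.eye(a)), np.kron(np.eye(b), B[i])])
        for i in range(c)]
Mflat = np.vstack(rows)
rhs = np.concatenate([vec(C[i]) for i in range(c)])

sol, *_ = np.linalg.lstsq(Mflat, rhs, rcond=None)
X = sol[:a * s].reshape((a, s), order='F')
Y = sol[a * s:].reshape((t, b), order='F')

residual = max(np.linalg.norm(X @ A[i] + B[i] @ Y - C[i]) for i in range(c))
```

Gaussian elimination on this roughly \( n^2 \times n^2 \) matrix is where the \( O(n^6) \) cost comes from.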

Interwoven striding in augmented matrix

\begin{bmatrix} A_1 \\ \vdots \\ A_b \end{bmatrix} \otimes I_a
\begin{bmatrix} B_1 \otimes I_b\\ \vdots\\ B_a \otimes I_b \end{bmatrix}

Tensors

\text{A tensor is an element of a tensor space}

A tensor space has the data of

  • Axes \(A\): Set
  • Frame \(V: A \rightarrow \text{Vect}(K) \)
  • Multi-linear interpretation $$ \langle - \mid - \rangle: T \rightarrow \prod_{a \in A} V_a \rightarrowtail \text{Vect}(K)$$

A cube of numbers as a tensor

\Gamma \in \mathbb{R}^{a \times b \times c}
\langle \Gamma \mid - \rangle: \mathbb{R}^a \times \mathbb{R}^b \times \mathbb{R}^c \rightarrowtail \mathbb{R}
\langle \Gamma \mid e_i,e_j,e_k\rangle = \Gamma_{ijk}
\langle \Gamma \mid u,v,w\rangle = \sum_{i,j,k} \Gamma_{ijk}u_iv_jw_k
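The interpretation of a cube of numbers can be sketched with `einsum` (illustrative sizes):

```python
import numpy as np

rng = np.random.default_rng(1)
a, b, c = 2, 3, 4
G = rng.standard_normal((a, b, c))          # Gamma as a cube of numbers
u = rng.standard_normal(a)
v = rng.standard_normal(b)
w = rng.standard_normal(c)

# Full contraction: <Gamma | u, v, w> = sum_{ijk} Gamma_ijk u_i v_j w_k
val = np.einsum('ijk,i,j,k->', G, u, v, w)

# Feeding in basis vectors recovers entries: <Gamma | e_i, e_j, e_k> = Gamma_ijk
e = np.eye
entry = np.einsum('ijk,i,j,k->', G, e(a)[1], e(b)[2], e(c)[3])

# Brute-force triple sum for comparison
brute = sum(G[i, j, k] * u[i] * v[j] * w[k]
            for i in range(a) for j in range(b) for k in range(c))
```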

Combining tensors via contractions

M \in \mathbb{R}^{a \times \textcolor{red}{b}} \quad N \in \mathbb{R}^{\textcolor{red}{b} \times c}
\langle M \mid_b N \rangle \text{ denotes the resulting tensor; here it is the matrix product:}
(M\cdot N)_{ij} = \sum_{\textcolor{red}{k}=1}^{b} M_{i\textcolor{red}{k}}N_{\textcolor{red}{k}j}

Contract on multiple axes

\langle M \mid_{\textcolor{red}{a}\textcolor{blue}{b}} N \rangle: \mathbb{R}^{c} \times \mathbb{R}^{d} \rightarrowtail \mathbb{R}
M \in \mathbb{R}^{\textcolor{red}{a} \times \textcolor{blue}{b} \times c}, N \in \mathbb{R}^{\textcolor{red}{a} \times \textcolor{blue}{b} \times d}
\langle M \mid_{\textcolor{red}{a}\textcolor{blue}{b}} N \rangle(e_i, e_j) = \sum_{\textcolor{red}{k},\textcolor{blue}{l}} M_{\textcolor{red}{k}\textcolor{blue}{l}i}N_{\textcolor{red}{k}\textcolor{blue}{l}j}
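A sketch of the two-axis contraction in numpy (illustrative sizes; `einsum` and `tensordot` compute the same thing):

```python
import numpy as np

rng = np.random.default_rng(2)
a, b, c, d = 2, 3, 4, 5
M = rng.standard_normal((a, b, c))
N = rng.standard_normal((a, b, d))

# <M |_{ab} N>(e_i, e_j) = sum_{kl} M_{kli} N_{klj}
R1 = np.einsum('kli,klj->ij', M, N)
R2 = np.tensordot(M, N, axes=([0, 1], [0, 1]))   # same contraction
```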

Tensors and their contractions as diagrams

Simultaneous Sylvester System

A \in \mathbb{R}^{s \times b \times c}\\ B \in \mathbb{R}^{a \times t \times c}\\ C \in \mathbb{R}^{a \times b \times c}
X \in \mathbb{R}^{a \times s}\\ Y \in \mathbb{R}^{t \times b}

Given

Find

Such that

(\forall i) \; XA_i + B_iY = C_i
(\exists X,Y) \qquad \langle X \mid_s \Lambda \rangle + \langle \Psi \mid_t Y \rangle = \Upsilon

Row Reduction

Mx = b
R \coloneqq (R_n \cdots R_1) \text{ where } RM = \operatorname{RREF}(M)

Given

Find

\text{Then } RMx = Rb \text{ is easy to solve for $x$.}
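A minimal sketch of row reduction that also returns the accumulated transform \( R = R_n \cdots R_1 \) (helper name `rref_with_transform` is mine, not from the talk):

```python
import numpy as np

def rref_with_transform(M, tol=1e-10):
    """Return (R, E) with E = RREF(M) and R @ M == E.

    R accumulates the elementary row operations R_n ... R_1.
    """
    E = M.astype(float).copy()
    m, n = E.shape
    R = np.eye(m)
    pivot = 0
    for col in range(n):
        if pivot >= m:
            break
        # Partial pivoting: largest entry in this column at or below `pivot`.
        p = pivot + int(np.argmax(np.abs(E[pivot:, col])))
        if abs(E[p, col]) < tol:
            continue                      # no pivot in this column
        E[[pivot, p]] = E[[p, pivot]]     # swap rows
        R[[pivot, p]] = R[[p, pivot]]
        R[pivot] /= E[pivot, col]         # scale pivot row to 1
        E[pivot] /= E[pivot, col]
        for r in range(m):                # clear the column elsewhere
            if r != pivot:
                R[r] -= E[r, col] * R[pivot]
                E[r] -= E[r, col] * E[pivot]
        pivot += 1
    return R, E

M0 = np.array([[1., 2., 3.],
               [2., 4., 7.],
               [0., 1., 1.]])
R, E = rref_with_transform(M0)   # R @ M0 == RREF(M0)
```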

Idea: act on the side opposite the variables

Face Reduction

\langle \Gamma \mid_a x \rangle = \Upsilon \qquad \Gamma \in \mathbb{R}^{a \times b \times c}
\Delta \text{ where } \langle \Delta \mid_{bc} \Gamma \rangle \text{ face reduced }

Given

Find

\text{Meaning } \langle \Delta \mid_{bc} \Gamma \mid_a x \rangle = \langle \Delta \mid_{bc} \Upsilon \rangle \\ \text{ is easy to solve for $x$.}

Idea: avoid the \( a \) axis

Flattened Matrix Perspective

\Gamma \in \mathbb{R}^{a\times b \times c}, \; y \in \mathbb{R}^{a \times b \times c} \\ (\Gamma^{(i)})_{jk} \coloneqq \Gamma_{kji}
\Gamma^{\flat} \coloneqq \begin{bmatrix} \Gamma^{(1)} \\ \vdots \\ \Gamma^{(c)} \end{bmatrix}
ab\text{-face reduce } \Gamma \coloneqq \text{Row reduce } \Gamma^{\flat}
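The flattening \( \Gamma^{\flat} \) can be sketched directly from the definition \( (\Gamma^{(i)})_{jk} = \Gamma_{kji} \) (illustrative sizes):

```python
import numpy as np

rng = np.random.default_rng(3)
a, b, c = 3, 4, 2
G = rng.standard_normal((a, b, c))   # Gamma, axes (a, b, c)

# Gamma^(i) has entries Gamma^(i)_{jk} = Gamma_{kji}: the i-th frontal
# slice, transposed to shape b x a.
slices = [G[:, :, i].T for i in range(c)]

# Gamma^flat stacks the c slices into a (b*c) x a matrix; ab-face
# reducing Gamma is row reducing this matrix.
G_flat = np.vstack(slices)
```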

Module Perspective

M \text{ a left } \mathbb{M}_{b}(\mathbb{R})\text{-mod generated by } \Gamma^{(1)}, \ldots, \Gamma^{(c)}
\text{Row reduce $\Gamma^{\flat}$} = \text{Find aligned generators for } M

Lemma: the size of a minimum generating set is well-defined

ab\text{-face reduce } \Gamma \text{ is to }\\ \text{find aligned generators for } M

Analogy: row rank

Shaded means face reduced

Example with data in matrix perspective

\Gamma \in \mathbb{R}^{3 \times 5 \times 2}, \quad \Gamma_{**1} = \begin{bmatrix} 1 & 0 & 2 & 0 & 6\\ 0 & 1 & 3 & 7 & 0\\ 0 & 0 & 0 & 0 & 0 \end{bmatrix}, \quad \Gamma_{**2} = \begin{bmatrix} 0 & 0 & 1 & 0 & 3\\ 0 & 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 0 \end{bmatrix}
\Delta = \begin{bmatrix} 1 & 0 & 0 & -2 & 0 & 0\\ 0 & 1 & 0 & -3 & 0 & 0\\ 0 & 0 & 0 & 1 & 0 & 0\\ 0 & 0 & 1 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 0 & 0 & 1 \end{bmatrix} \in \mathbb{M}_2(\mathbb{M}_3(\mathbb{R})), \text{ viewed as an element of } \mathbb{R}^{2 \times 2 \times 3 \times 3}
\langle \Gamma \mid \Delta \rangle \text{ is matrix multiplication, resulting in a face reduced tensor}

Challenge: face reducing tensors overlap

E in the way of F

Orthogonal idempotents of matrices

M = \begin{bmatrix}M_1 \\ M_2 \end{bmatrix}, \; \text{rowspan}(M) = \text{rowspan}(M_1)
(\exists e, 1-e), eM = \begin{bmatrix}M_1 \\ 0 \end{bmatrix},\; (1-e)M = \begin{bmatrix} 0 \\ M_2 \end{bmatrix}
(\exists T_1, T_2) \begin{bmatrix} T_1 & 0\\ T_2 & I \end{bmatrix} \begin{bmatrix}M_1\\ M_2 \end{bmatrix} = \operatorname{RREF}(M) \quad (\ast)
eTeM = \operatorname{RREF}(M)\\ (1-e)TeM + (1-e)M = 0 \\ T(1-e) = (1-e)
T \coloneqq \begin{bmatrix} T_1 & 0\\ T_2 & I \end{bmatrix}
eT = \begin{bmatrix} T_1 & 0 \\ 0 & 0 \end{bmatrix}, \quad (1-e)T = \begin{bmatrix} 0 & 0\\ T_2 & I \end{bmatrix}
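The three identities can be sketched numerically (a minimal example assuming \( M_1 \) square and invertible, so \( \operatorname{RREF}(M) = \begin{bmatrix} I \\ 0 \end{bmatrix} \)):

```python
import numpy as np

rng = np.random.default_rng(4)
n, m2 = 3, 2

# M1 invertible, M2 inside M1's rowspan: rowspan(M) = rowspan(M1).
M1 = rng.standard_normal((n, n)) + 3.0 * np.eye(n)
S = rng.standard_normal((m2, n))
M2 = S @ M1
M = np.vstack([M1, M2])

# T = [[T1, 0], [T2, I]] with T1 M1 = RREF(M1) = I and T2 M1 + M2 = 0.
T1 = np.linalg.inv(M1)
T2 = -S
T = np.block([[T1, np.zeros((n, m2))],
              [T2, np.eye(m2)]])

# Idempotent e projecting onto M1's rows; 1 - e onto M2's rows.
e = np.zeros((n + m2, n + m2))
e[:n, :n] = np.eye(n)
one = np.eye(n + m2)

lhs1 = e @ T @ e @ M                            # = RREF(M)
lhs2 = (one - e) @ T @ e @ M + (one - e) @ M    # = 0
lhs3 = T @ (one - e)                            # = 1 - e
```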

Matrices

Tensors

eTeM = \operatorname{RREF}(M)

size of \(e_+\) is bounded by the minimum generating set size

size of \(e\) is the rank of \(M\)

Flattened perspective on \( \Delta \)

E in the way

Problem

Craft face reducing tensors so they commute

Key Insight

Controlled tensors

Notation

Orthogonal Idempotents

M \otimes \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} + I_2 \otimes \begin{bmatrix} 0 & 0\\ 0 & 1 \end{bmatrix}
M = M \begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & 0\\ 0 & 0 & 0 \end{bmatrix} + M \begin{bmatrix} 0 & 0 & 0\\ 0 & 0 & 0\\ 0 & 0 & 1 \end{bmatrix}
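Both facts can be checked in a few lines (sizes illustrative): the diagonal projections are orthogonal idempotents splitting \( M \), and the Kronecker expression acts as \( M \) on one component and as the identity on the other.

```python
import numpy as np

rng = np.random.default_rng(5)

# Orthogonal idempotents resolving the identity: P1 + P2 = I, P1 P2 = 0.
P1 = np.diag([1.0, 1.0, 0.0])
P2 = np.diag([0.0, 0.0, 1.0])
M3 = rng.standard_normal((3, 3))   # satisfies M3 = M3 P1 + M3 P2

# Controlled operator M (x) [[1,0],[0,0]] + I (x) [[0,0],[0,1]]:
# acts as M on the e1-component, as the identity on the e2-component.
M = rng.standard_normal((2, 2))
Q1 = np.diag([1.0, 0.0])
Q2 = np.diag([0.0, 1.0])
ctrl = np.kron(M, Q1) + np.kron(np.eye(2), Q2)

v = rng.standard_normal(2)
e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])
out1 = ctrl @ np.kron(v, e1)   # = kron(M v, e1)
out2 = ctrl @ np.kron(v, e2)   # = kron(v, e2), left untouched
```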

Because both tensors are equal to

Proof idea

Proof idea: expand out both definitions

Proof idea continued

QuickSylver

Simultaneous face reduction

Intuition from flattened matrix

Dense system after clearing

X backsub

Y backsub

E clearing A

F clearing B

Analyze result of contraction by region

Region with small number of variables

Key Lemma

\text{Matrix analogy: } (\forall x), 0 = (1-e)TeMx + (1-e)Mx
\text{Backsubstitution for rest of } X \text{ and } Y
\text{Theorem: $O(n^3)$ time to find a solution} \quad {\scriptstyle n = a+b+c+s+t}
sr_{\Psi} + tr_{\Lambda} \text{ number of variables}\\ {\scriptstyle \text{cubic factor} }

Solve linear system

Backsubstitution

s(a-r_{\Psi}) + t(b-r_{\Lambda}) \text{ contractions}\\ {\scriptstyle \text{linear factor}}

Implementing QuickSylver

Future: Derivations

A \in \mathbb{R}^{s \times b \times c}\\ B \in \mathbb{R}^{a \times t \times c}\\ C \in \mathbb{R}^{a \times b \times r}
X \in \mathbb{R}^{a \times s}\\ Y \in \mathbb{R}^{t \times b}\\ Z \in \mathbb{R}^{r \times c}

Given

Find

Such that

\langle X \mid_s A \rangle + \langle Y \mid_t B \rangle + \langle Z \mid_r C \rangle = 0

My idea: craft face reducing tensors with properties that force all three face reducing tensors to commute

Challenges

  • The face reducing tensor \( \langle E \mid F \mid G \rangle \) splits into 8 pieces
  • Not clear what the module perspective should be
  • Flattened matrix intuition breaks down
