The equations \( p+q = r \) and \( p+q+r = s \) in \(U \otimes V \)

These slides are a record-keeping tool and will not be used for any actual presentations, hence the tiny font size and occasional color choices

Results from analyzing the \(p+q=r\) equation are found here

Given \(p+q=r \) equation in \(U \otimes V \)

Let \(A,B,C \leq U \) and \(X,Y,Z \leq V \) as subspaces

  • \(p \in A \otimes X \)
  • \( q \in B \otimes Y \)
  • \( r \in C \otimes Z \)

Each space is of minimal dimension, subject to \(p+q=r\)

One conclusion we may draw from the previous analysis is that there exists a compatible local decomposition for the equation \(p+q=r\)

This is the justification for being able to analyze \( p+q = r \) equations in a global-to-local fashion, such as the one arising from \(\text{Der}(s \otimes t) \) for \( s,t \) bimaps.

Summary of last time

Definition: [Compatible Local Decomposition] \((p,q,r)\) has a compatible local decomposition if there exist \( n \in \mathbb{N} \) and subsets \( (a_i)_{i=1}^{n} \subset A\), \( (b_i) \subset B \), \( (c_i) \subset C \), \( (x_i) \subset X \), \( (y_i) \subset Y \), \( (z_i) \subset Z \) such that

  • (Local Equality) \( \forall i \in \{1,\ldots, n\} \), \( a_i \otimes x_i + b_i \otimes y_i = c_i \otimes z_i \)
  • (Decomposition Equality) \( \sum_i a_i \otimes x_i = p, \sum_i b_i \otimes y_i = q, \sum_i c_i \otimes z_i = r \)
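Both conditions can be checked mechanically. Below is a minimal sketch in Python with numpy (not the author's Magma code), assuming elements are given by integer coordinate vectors and \(\otimes\) is realized as the outer product; the helper names are hypothetical:

```python
import numpy as np

def outer(u, v):
    """u tensor v, realized as the outer product of coordinate vectors."""
    return np.outer(u, v)

def is_compatible_local_decomposition(p, q, r, triples):
    """triples is a list of tuples (a_i, x_i, b_i, y_i, c_i, z_i).

    Checks the local equalities a_i⊗x_i + b_i⊗y_i = c_i⊗z_i and the
    decomposition equalities sum_i a_i⊗x_i = p, etc."""
    local = all(
        np.array_equal(outer(a, x) + outer(b, y), outer(c, z))
        for a, x, b, y, c, z in triples
    )
    decomp = (
        np.array_equal(sum(outer(a, x) for a, x, *_ in triples), p)
        and np.array_equal(sum(outer(b, y) for _, _, b, y, _, _ in triples), q)
        and np.array_equal(sum(outer(c, z) for *_, c, z in triples), r)
    )
    return local and decomp
```

Note this only verifies a proposed decomposition; finding one is the substance of the analysis above.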

I've implemented this decomposition in Magma when elements of \(U \otimes V \) are identified as matrices. Specifically,

 

Given a triple of matrices \( p,q,r \)

  • Compute spaces \( A,B,C,X,Y,Z \), and spaces \( U_1,\ldots, U_6, V_1, \ldots, V_6\)
  • Write change of basis matrices from original basis to one aligned with the decomposition \(A+B = U_1 \oplus \cdots \oplus U_6 \) and \(X+Y = V_1 \oplus \cdots \oplus V_6 \)
    • Concretely, \( M \) has columns corresponding to bases of \(U_1,\ldots, U_6\), \( N \) has rows corresponding to bases of \(V_1, \ldots, V_6\)
    • Let \( p' = M^{-1} pN^{-1} \) be the matrix with coordinates adapted to the above basis. \(p'\) has nonzero entries only in certain positions, as outlined here
    • This decomposes \(p\) as the sum of outer products of bases of \(U\) and \(V\), scaled by coefficients given in \(p'\): 
      • \( p = M p' N\)
      • \( = M (\sum_{i,j} p'_{ij} E_{ij}) N\)
      • \( = \sum_{i,j} p'_{ij} (M E_{ij} N) \)
      • \( = \sum_{i,j} p'_{ij} (M e_i e_j^T N) \)
      • \( = \sum_{i,j} p'_{ij} (M e_i) (e_j^T N) \)
    • This final equality says \(p'_{ij}\) is the coefficient of the \(i\)th column of \(M\) paired with the \(j\)th row of \(N\) in a basis expansion of \(p\). We now refer to the picture of where the nonzero entries of \(p'\) are... (see next slide)
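The expansion \( p = \sum_{i,j} p'_{ij} (M e_i)(e_j^T N) \) can be sanity-checked numerically. A minimal sketch in Python with numpy, using small arbitrary invertible matrices in place of the computed change-of-basis matrices (this is an illustration, not the Magma implementation):

```python
import numpy as np

# Arbitrary invertible stand-ins: columns of M are the adapted U-basis,
# rows of N are the adapted V-basis, p is an arbitrary test matrix.
M = np.array([[1., 1.], [0., 1.]])
N = np.array([[1., 0.], [2., 1.]])
p = np.array([[3., 1.], [4., 2.]])

# Coordinates of p in the adapted basis.
p_prime = np.linalg.inv(M) @ p @ np.linalg.inv(N)

# p = sum_{i,j} p'_{ij} (i-th column of M)(j-th row of N)
expansion = sum(
    p_prime[i, j] * np.outer(M[:, i], N[j, :])
    for i in range(2) for j in range(2)
)
assert np.allclose(expansion, p)
```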
[Figure: three block grids with rows labeled \(U_1,\ldots,U_6\) and columns labeled \(V_1,\ldots,V_6\), depicting \(p + q = r\); the nonzero regions of \(p\), \(q\), and \(r\) are shown in color.]

Each color is a different region and each region admits rank 1 expansions \(a \otimes x + b \otimes y = c \otimes z\) where each term is in the correct spaces. 

Example: To get rank 1 equalities for the orange region, we expand, for each basis element \(u \in U_1\), the corresponding nonzero columns of \(p,q,r\) in the \(V\) basis. Since \(u \in U_1\), we have \(u \in A \cap B \cap C\).

We get the local equality \(u \otimes x + u \otimes y = u \otimes z\), where \(x = p_{12}v_2 + p_{13}v_3 + p_{15}v_5 \in X \), \(y = q_{12}v_2 + q_{14}v_4 + q_{16}v_6 \in Y \), and \(z = \sum_{i=2}^{6} r_{1i}v_i \in Z \)

 

Hence \(u \otimes x \in A \otimes X\), \(u \otimes y \in B \otimes Y\), and \(u \otimes z \in C \otimes Z\)

I'm in the process of linking this to derivations of tensors in Magma; it's pretty close to complete

The upshot is that the existence of a compatible local decomposition implies that for tensors \(s: U_2 \times U_1 \rightarrowtail U_0\) and \(t: V_2 \times V_1 \rightarrowtail V_0\), the derivation algebra of the tensor \(s \otimes t: (U_2 \otimes V_2) \times (U_1 \otimes V_1) \rightarrowtail (U_0 \otimes V_0)\) is completely characterized by algebraic invariants of the tensors \(s\) and \(t\).

 

In particular, knowledge of \(\text{Der}(s), \text{Der}(t), \text{Nuc}_{ij}(s), \text{Nuc}_{ij}(t)\) gives knowledge of \( \text{Der}(s \otimes t) \).

 

The proof is constructive, giving a decomposition of an arbitrary element in \( \text{Der}(s \otimes t)\) in terms of the individual pieces

Given \(p+q+r =s \) equation in \(U \otimes V \)

Let \(A,B,C,D \leq U \) and \(X,Y,Z,W \leq V \) as subspaces

Given elements

  • \(p \in A \otimes X \)
  • \( q \in B \otimes Y \)
  • \( r \in C \otimes Z \)
  • \(s \in D \otimes W \)

Each space is of minimal dimension, subject to \(p+q+r = s\)

Claim:

There exists a quadruple \( (p,q,r,s) \) satisfying \(p+q+r=s\) but not having a compatible local decomposition.

Definition: [Compatible Local Decomposition] \((p,q,r,s)\) has a compatible local decomposition if there exist \(n \in \mathbb{N}\) and subsets \( (a_i)_{i=1}^{n} \subset A, (b_i) \subset B, \ldots \) such that both of the following hold

  • (Local equality) \( \forall i, a_i \otimes x_i + b_i \otimes y_i + c_i \otimes z_i = d_i \otimes w_i \)
  • (Decomposition equality) \( p = \sum_i a_i \otimes x_i \), \( q = \sum_i b_i \otimes y_i \), \( r = \sum_i c_i \otimes z_i \), \( s = \sum_i d_i \otimes w_i \)

Observation:

To have a compatible local decomposition we require the vector space \( (A \otimes X + B \otimes Y + C \otimes Z) \cap (D \otimes W) \eqqcolon H \) to contain nonzero rank one elements since the right hand side of a local equality \(a_i \otimes x_i + b_i \otimes y_i + c_i \otimes z_i = d_i \otimes w_i \) is rank 1.

Thus, to show no valid decomposition exists, we will construct an example where every nonzero element of the vector space \(H\) has rank greater than 1.

 

Below, we set up the spaces \( A,B,C,D,X,Y,Z,W \), prove that nonzero elements of \(H \) have rank greater than 1, and find a quadruple \( (p,q,r,s) \) in the appropriate spaces (\(A \otimes X\), \(B \otimes Y\), etc.) satisfying \( p+q+r=s \)

Let \( k := \mathbb{F}_5 \), and \(U = V = k^4 \)

Let \( (e_1,\ldots,e_4)\) be the standard basis of \( U \) and \( (f_1,\ldots, f_4) \) be the standard basis of \( V \).

Below we'll produce some "magic constants" that have the desired properties, and then explain how we got them

\( D = \text{span}\{e_1,e_2\}, W = \text{span}\{f_1,f_2\} \)

\( A = \text{span}\{e_1 + e_3, e_2 + e_4 \}, B =\text{span}\{e_1 - e_3, e_2 - e_4 \}, C = \text{span}\{e_1 + e_4, e_2 + 4e_3+ 4e_4 \} \)

\( X = \text{span}\{f_1 + f_3, f_2 + f_4 \}, Y =\text{span}\{f_1 - f_3, f_2 - f_4 \}, Z = \text{span}\{f_1 + 4f_4, f_2 + f_3+ 4f_4 \} \)

As notation, we shall denote \(A = \text{span}\{e_1+e_3 \eqqcolon a_1, e_2+e_4 \eqqcolon a_2\}, B = \text{span}\{b_1,b_2\}\), and so on

View \(U\) as column vectors, \( V \) as row vectors, and \(U \otimes V \) as \(4 \times 4\) matrices where \(u \otimes v \) is the outer product of \(u\) and \(v\).

If \(\mathcal{U}\) is a basis of \(U\) and \(\mathcal{V}\) a basis of \(V\), then \( \{u \otimes v \mid u \in \mathcal{U}, v \in \mathcal{V} \} \) is a basis of \(U \otimes V\)

As \(D =\text{span}\{e_1,e_2\}, W = \text{span}\{f_1,f_2\}\), we get \(D \otimes W = \text{span}\{e_1 \otimes f_1, e_1 \otimes f_2, e_2 \otimes f_1, e_2 \otimes f_2\}\), which is spanned by the below matrices

\[ \left\langle \begin{bmatrix} 1&0&0&0\\ 0&0&0&0\\ 0&0&0&0\\ 0&0&0&0 \end{bmatrix}, \begin{bmatrix} 0&1&0&0\\ 0&0&0&0\\ 0&0&0&0\\ 0&0&0&0 \end{bmatrix}, \begin{bmatrix} 0&0&0&0\\ 1&0&0&0\\ 0&0&0&0\\ 0&0&0&0 \end{bmatrix}, \begin{bmatrix} 0&0&0&0\\ 0&1&0&0\\ 0&0&0&0\\ 0&0&0&0 \end{bmatrix} \right\rangle \]

Recall \( A \otimes X = \text{span} \{ a_1 \otimes x_1, a_1 \otimes x_2, a_2 \otimes x_1, a_2 \otimes x_2 \} \), where  \(a_1 = e_1 + e_3, a_2 = e_2 + e_4, x_1 = f_1 + f_3, x_2 = f_2 + f_4 \)

\[ \left\langle \begin{bmatrix} 1&0&1&0\\ 0&0&0&0\\ 1&0&1&0\\ 0&0&0&0 \end{bmatrix}, \begin{bmatrix} 0&1&0&1\\ 0&0&0&0\\ 0&1&0&1\\ 0&0&0&0 \end{bmatrix}, \begin{bmatrix} 0&0&0&0\\ 1&0&1&0\\ 0&0&0&0\\ 1&0&1&0 \end{bmatrix}, \begin{bmatrix} 0&0&0&0\\ 0&1&0&1\\ 0&0&0&0\\ 0&1&0&1 \end{bmatrix} \right\rangle \]

More concisely, \( p \in A \otimes X \) is of the form \( \begin{bmatrix} P & P \\ P & P \end{bmatrix} \) for \(P\) an arbitrary \(2 \times 2\) matrix

Similarly, \( q \in B \otimes Y \) is of the form \( \begin{bmatrix} Q & -Q \\ -Q & Q \end{bmatrix} \) for \(Q\) an arbitrary \(2 \times 2\) matrix

The calculation for \(r \in C \otimes Z \) is a bit more complicated. See next slide.

More concisely, \(s \in D \otimes W\) is of the form \( \begin{bmatrix}S & 0 \\ 0 & 0 \end{bmatrix}\) for \(S\) an arbitrary \(2 \times 2 \) matrix
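These block forms can be checked numerically. A sketch in Python with numpy, working mod 5 and using the bases of \(A\) and \(X\) defined above, for the \(A \otimes X\) case (an illustration, not the Magma code):

```python
import numpy as np

F = 5
e = np.eye(4, dtype=int)  # e[0] = e_1, ...; the f_i have the same coordinates

a = [e[0] + e[2], e[1] + e[3]]   # basis a_1, a_2 of A
x = [e[0] + e[2], e[1] + e[3]]   # basis x_1, x_2 of X
P = np.array([[1, 2], [3, 4]])   # arbitrary 2x2 coefficient matrix

# Generic element of A tensor X with coefficient matrix P.
p = sum(P[i, j] * np.outer(a[i], x[j])
        for i in range(2) for j in range(2)) % F

# It has the block form [[P, P], [P, P]].
assert np.array_equal(p, np.block([[P, P], [P, P]]) % F)
```

The \(B \otimes Y\) and \(D \otimes W\) forms follow the same way from their bases.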


\( C \otimes Z = \text{span} \{ c_1 \otimes z_1, c_1 \otimes z_2, c_2 \otimes z_1, c_2 \otimes z_2 \} \), where \(c_1 = e_1 + e_4, c_2 = e_2 + 4e_3 + 4e_4 \)

and \(z_1 = f_1 + 4f_4, z_2 = f_2 + f_3 + 4f_4 \)

Let \( J = \begin{bmatrix}0 & 4 \\ 1 & 4\end{bmatrix} \) describe a linear map \(D \rightarrow D'\). Then \(c_1 = e_1 + e_4 = e_1 + J(e_1), c_2 = e_2 + 4e_3 + 4e_4 = e_2 + J(e_2)\).

Let \(K = J^T = \begin{bmatrix} 0 & 1 \\ 4 & 4\end{bmatrix}\) describe a linear map \(W \rightarrow W' \). Then \(z_1 = f_1 + K(f_1), z_2 = f_2 + K(f_2) \)

Recall, \(D = \text{span}\{e_1,e_2\} \), with complement \(D' = \text{span}\{e_3,e_4\}\), where \(U = D \oplus D' \)

Similarly, \(W = \text{span}\{f_1,f_2\}\), with complement \(W' = \text{span}\{f_3,f_4\} \).

Expanding \(c_1 \otimes z_1 = (e_1 + J(e_1)) \otimes (f_1 + K(f_1)) = e_1 \otimes f_1 + e_1 \otimes K(f_1) + J(e_1) \otimes f_1 + J(e_1) \otimes K(f_1) \).

The 4 terms are in 4 distinct regions in the block \(2 \times 2\) matrix with coordinates in \( \begin{bmatrix} D \otimes W & D \otimes W' \\ D' \otimes W & D' \otimes W' \end{bmatrix} \) because \(e_1 \otimes f_1 \in D \otimes W\), \(e_1 \otimes K(f_1) \in D \otimes W'\), \(J(e_1) \otimes f_1 \in D' \otimes W\), and \(J(e_1) \otimes K(f_1) \in D' \otimes W'\)

Convert to coordinates where the \(e_i\) are column vectors and the \(f_i\) are row vectors. Then \(J\) acts on the left, and \(K^T = J\) acts on the right (as \(K\) was originally defined with column-vector conventions)

By linearity, the block configuration of an arbitrary element \(r \in C \otimes Z \) is \( \begin{bmatrix} R & RJ \\ JR & JRJ \end{bmatrix}\)
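This block structure can be verified over \(\mathbb{F}_5\) directly from the bases of \(C\) and \(Z\) above. A sketch in Python with numpy (an illustration, not the Magma code):

```python
import numpy as np

F = 5
e = np.eye(4, dtype=int)  # shared coordinates for the e_i and f_i
J = np.array([[0, 4], [1, 4]])  # the map D -> D' chosen above

c = [e[0] + e[3], e[1] + 4*e[2] + 4*e[3]]  # c_1, c_2
z = [e[0] + 4*e[3], e[1] + e[2] + 4*e[3]]  # z_1, z_2
R = np.array([[2, 2], [3, 4]])             # arbitrary coefficient matrix

# Generic element of C tensor Z with coefficient matrix R.
r = sum(R[i, j] * np.outer(c[i], z[j])
        for i in range(2) for j in range(2)) % F

# It has the block form [[R, RJ], [JR, JRJ]].
expected = np.block([[R, R @ J], [J @ R, J @ R @ J]]) % F
assert np.array_equal(r, expected)
```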

Explicitly, \(c_1 \otimes z_1\) is \( \begin{bmatrix} e_1 \otimes f_1 & e_1 \otimes K(f_1)\\ J(e_1) \otimes f_1 & J(e_1) \otimes K(f_1) \end{bmatrix} \)

Now an element in \(H := (A \otimes X + B \otimes Y + C \otimes Z) \cap (D \otimes W) \) must satisfy

\[ \begin{bmatrix} P & P \\ P & P \end{bmatrix} + \begin{bmatrix} Q & -Q \\ -Q & Q \end{bmatrix} + \begin{bmatrix} R & RJ \\ JR & JRJ \end{bmatrix} = \begin{bmatrix} S & 0 \\ 0 & 0 \end{bmatrix} \]

with each of \(P,Q,R,S\) an arbitrary \(2 \times 2\) matrix. This gives 4 equations in the unknowns \(P,Q,R,S\):

\[ \begin{bmatrix} P+Q+R = S & P-Q+RJ = 0 \\ P-Q+JR = 0 & P+Q+JRJ = 0 \end{bmatrix} \]
  • Subtracting the bottom-left equation from the top-right equation, we get \(RJ - JR = 0\) meaning \(R\) commutes with \(J \)
    • We chose \(J\) with irreducible minimal polynomial (here \(x^2 + x + 1\) over \(\mathbb{F}_5\)). This implies all elements commuting with \(J \) are in \(k[J]\), which is a field. Hence all nonzero elements are invertible.
  • If \(RJ = JR\), then \( JRJ = RJ^2 \). 
    • Using the top-right and bottom-right equations, \(P-Q = -RJ\) and \(P+Q = -JRJ = -RJ^2 \), so \(2P = -(RJ + RJ^2)\) and \(2Q = RJ - RJ^2\)
    • In characteristic \(\neq 2\) this means \( R \) also determines \(P\) and \(Q\)
  • The top-left equality \(P+Q+R = S\) with the above formula simplifies to \(R - RJ^2 = S\). This factors as \( S = R(I_2-J^2)\). When \( I_2 - J^2\) is invertible (here \(\det(I_2 - J^2) = 3 \neq 0\)), \(R \) and \(S \) differ by an invertible factor
  • We choose any nonzero \(R \in k[J]\), so \(R\) is full rank. The above concludes that \(S\) is full rank too. In particular, \(S \neq d \otimes w \) for any choice of \(d \in D, w \in W\), so the intersection has no nonzero rank 1 elements
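Since \(k = \mathbb{F}_5\), this argument can also be checked exhaustively: enumerate the 24 nonzero elements of \(k[J]\) and verify each resulting \(S\) is invertible. A sketch in Python (not the Magma code):

```python
import numpy as np
from itertools import product

F = 5
I2 = np.eye(2, dtype=int)
J = np.array([[0, 4], [1, 4]])

# Every matrix commuting with J lies in k[J] = {a*I + b*J}.
# For each nonzero R in k[J], form S = R(I - J^2) and check it is
# invertible mod 5, i.e. rank 2, so H contains no rank-1 elements.
for a, b in product(range(F), repeat=2):
    if a == 0 and b == 0:
        continue
    R = (a * I2 + b * J) % F
    S = (R @ (I2 - J @ J)) % F
    det = (S[0, 0] * S[1, 1] - S[0, 1] * S[1, 0]) % F
    assert det != 0
```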

Next we solve for a \( (p,q,r,s) \) quadruple satisfying \(p+q+r=s\) 

Translating back to elements of \(U \otimes V\), a valid \( p,q,r,s \) is the following:

  • \( p = a_1 \otimes x_1 + a_1 \otimes x_2 + 4 a_2 \otimes x_1 + 2 a_2 \otimes x_2 \)
  • \( q = 3 b_1 \otimes y_1 + 2 b_1 \otimes y_2 + 3 b_2 \otimes y_1\)
  • \( r = 2 c_1 \otimes z_1  + 2 c_1 \otimes z_2 + 3c_2 \otimes z_1 + 4 c_2 \otimes z_2 \)
  • \( s = e_1 \otimes f_1 + e_2 \otimes f_2 \)

This quadruple satisfies \(p+q+r = s\), with \(p \in A \otimes X, q \in B \otimes Y, r \in C \otimes Z, s \in D \otimes W \), and no local equality exists because any nonzero element of \((A \otimes X + B \otimes Y + C \otimes Z) \cap (D \otimes W)\) is rank 2.
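The equality \(p+q+r=s\) for this quadruple can be verified directly. A sketch in Python with numpy, representing \(U \otimes V\) as \(4\times4\) matrices mod 5 (an illustration, not the Magma code):

```python
import numpy as np

F = 5
e = np.eye(4, dtype=int)  # shared coordinates for the e_i and f_i

a1, a2 = e[0] + e[2], e[1] + e[3]
b1, b2 = e[0] - e[2], e[1] - e[3]
c1, c2 = e[0] + e[3], e[1] + 4*e[2] + 4*e[3]
x1, x2 = e[0] + e[2], e[1] + e[3]
y1, y2 = e[0] - e[2], e[1] - e[3]
z1, z2 = e[0] + 4*e[3], e[1] + e[2] + 4*e[3]

o = np.outer  # u tensor v as an outer product
p = (o(a1, x1) + o(a1, x2) + 4*o(a2, x1) + 2*o(a2, x2)) % F
q = (3*o(b1, y1) + 2*o(b1, y2) + 3*o(b2, y1)) % F
r = (2*o(c1, z1) + 2*o(c1, z2) + 3*o(c2, z1) + 4*o(c2, z2)) % F
s = (o(e[0], e[0]) + o(e[1], e[1])) % F

assert np.array_equal((p + q + r) % F, s)
```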

Recall \(D \otimes W  = \text{span}\{e_1 \otimes f_1, e_1 \otimes f_2, e_2 \otimes f_1, e_2 \otimes f_2\} \). We now solve for an element in \(H := (A \otimes X + B \otimes Y + C \otimes Z) \cap (D \otimes W) \) which gives the \( (p,q,r,s) \) counterexample.

Confirmation: A Magma calculation (i.e. paste here to here) concludes that \(H\) is spanned by the following matrices. Notice every nonzero element in the span has rank 2.

\[ \left\langle \begin{bmatrix} 1&0&0&0\\ 0&1&0&0\\ 0&0&0&0\\ 0&0&0&0 \end{bmatrix}, \begin{bmatrix} 0&1&0&0\\ 4&1&0&0\\ 0&0&0&0\\ 0&0&0&0 \end{bmatrix} \right\rangle \]

Recall

  • \( S = (I_2-J^2)(R) \)
  • \( 2P = -(RJ + RJ^2) \)
  • \(2Q = RJ - RJ^2 \)

Let \(S = e_1 \otimes f_1 + e_2 \otimes f_2 \); it corresponds to \(I_2\). We solve for \(P,Q,R\)

  • \(P = -\frac{1}{2}(RJ + RJ^2) = \begin{bmatrix}1 & 1 \\ 4 & 2\end{bmatrix}\)
  • \(Q = \frac{1}{2}(RJ - RJ^2) = \begin{bmatrix} 3 & 2 \\ 3 & 0 \end{bmatrix}\)
  • \(R = S(I_2-J^2)^{-1} = \begin{bmatrix}2 & 2 \\ 3 & 4\end{bmatrix}\)
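These three matrices can be recomputed from the formulas. A sketch in Python, working mod 5 and computing \((I_2-J^2)^{-1}\) via the adjugate (an illustration, not the Magma code):

```python
import numpy as np

F = 5
I2 = np.eye(2, dtype=int)
J = np.array([[0, 4], [1, 4]])
inv2 = pow(2, -1, F)  # 1/2 = 3 mod 5

# S = I_2, so R = S (I - J^2)^{-1} = (I - J^2)^{-1}.
A = (I2 - J @ J) % F
detA = (A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]) % F
adjA = np.array([[A[1, 1], -A[0, 1]], [-A[1, 0], A[0, 0]]])
R = (pow(int(detA), -1, F) * adjA) % F

P = (-inv2 * (R @ J + R @ J @ J)) % F
Q = (inv2 * (R @ J - R @ J @ J)) % F

assert np.array_equal(R, np.array([[2, 2], [3, 4]]))
assert np.array_equal(P, np.array([[1, 1], [4, 2]]))
assert np.array_equal(Q, np.array([[3, 2], [3, 0]]))
```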

What's proven: Unlike the equation \(p+q=r\) in \(U \otimes V\), the equation \(p+q+r=s\) need not admit a compatible local decomposition. We've given an explicit counterexample above.

pqr-and-pqrs

By Chris Liu
