James B. Wilson
Colorado State
University
Graphic Design by Ezra Wilson
> V1 := ... // A vector space
> V2 := ... // Another vector space
> BaseRing(V1) eq BaseRing(V2);
true
> Dimension(V1);
1
> Dimension(V2);
1
> IsIsomorphic(V1,V2);
false
Done in the flagship Computer Algebra System (CAS) Magma.
IsIsomorphic for vector spaces is an ancient, core, highly tested routine.
How does a bug like this happen? Is it a bug?
Spotted as a problem in the 1960s; today it is being taken seriously in functional programming.
public double f(double y) {
return y + Math.random();
}
f(2) != f(2) // almost always.
"Side-effects", specifically hidden variables.
public int f(int y) throws java.io.IOException {
    if (System.in.read() >= 97)
        return y - 1;
    else
        return y + 1;
}
f(2) != f(2) // if the user types lowercase vs. uppercase
class Database {
int count;
public int uses() {
return ++count;
}
}
Database db = new Database();
db.uses() != db.uses()
Another "side-effect", a.k.a. mutated state/hidden outcome.
"In languages with no side-effects, like Haskell, we can substitute equals for equals"
-Referential Transparency
Wikipedia Oct. 22, 2019
module Main (main) where

data Floop = One | Two | Three

instance Eq Floop where
  One   == One   = True
  One   == Two   = False
  One   == Three = False
  Two   == One   = False
  Two   == Two   = True
  Two   == Three = True   -- 2 == 3
  Three == One   = False
  Three == Two   = True   -- 3 == 2
  Three == Three = True

shuffle :: Floop -> Floop
shuffle One   = Two
shuffle Two   = Two   -- fix 2
shuffle Three = One   -- move 3

main = print ((Two == Three) && (shuffle Two /= shuffle Three))
-- prints "True" proving Haskell violates Leibniz's Law
(Should someone correct Wikipedia?)
\(F\in\mathbb{M}_{2a\times 2b}(k)\) is \(K\)-linear if
\(\forall X\in K:\ (X\otimes I_{a}) F=F(X\otimes I_b)\)
Suffices to test on generators:
\(K=\left\langle I_2,\begin{bmatrix} 0 & 1\\ -1 & 0 \end{bmatrix}\right\rangle\)
\((I_2\otimes I_a)F=F(I_2\otimes I_b)\)
\(\left(\begin{bmatrix} 0 & 1 \\ -1 & 0 \end{bmatrix}\otimes I_a\right)F=F\left(\begin{bmatrix} 0 & 1\\ -1 & 0 \end{bmatrix}\otimes I_b\right)\)
Assume \(K/k\) is a quadratic field extension, e.g. \(\mathbb{C}/\mathbb{R}\) or \(\mathbb{F}_9/\mathbb{F}_3\).
Given: \(V_1,V_2\) K-vector spaces,
Find: K-linear maps \(V_1\to V_2\)
1. given by a \(2a\times 2b\)-matrix \(F\) over k.
2. where for each generator \(X\) of \(K\),
\((X\otimes I_a)F=F(X\otimes I_b).\)
3. Sets up a system of k-linear equations to solve for \(F\)'s.
4. Isomorphism selects \(F\) invertible (Ronyai, Ivanyos, Brooksbank-Luks).
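Steps 1–4 can be sketched in a few lines of Python with numpy (an illustration over \(k=\mathbb{R}\), not Magma's implementation; the names are my own). It uses the row-major vec identity \({\rm vec}(XF-FY)=(X\otimes I_b - I_a\otimes Y^{\top})\,{\rm vec}(F)\):

```python
import numpy as np

def hom_basis(Xs, Ys, a, b, tol=1e-9):
    """Basis of {F in M_{a x b} : X_i F = F Y_i for all i} over the reals.
    Row-major vec identity: vec(X F - F Y) = (X kron I_b - I_a kron Y^T) vec(F)."""
    blocks = [np.kron(X, np.eye(b)) - np.kron(np.eye(a), Y.T) for X, Y in zip(Xs, Ys)]
    _, s, Vt = np.linalg.svd(np.vstack(blocks))
    rank = int((s > tol).sum())
    return [v.reshape(a, b) for v in Vt[rank:]]  # nullspace vectors as a x b matrices

# Demo: K = R[J] with J^2 = -I; the K-linear maps K -> K form a 2-dimensional space (K itself).
J = np.array([[0., 1.], [-1., 0.]])
basis = hom_basis([np.eye(2), J], [np.eye(2), J], 2, 2)
print(len(basis))  # 2
```

Step 4 then amounts to searching the returned basis for an invertible combination, e.g. by testing determinants of random linear combinations.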
Given a ring K generated by X, the K-module V is input by the image of X in \({\rm End}(V)\).
Concretely, \(V=k^a\) and we give
\(X_1,\ldots,X_m\in\mathbb{M}_a(k)\)
(à la 50 years of history)
Given: \(X_1,\ldots,X_m\in\mathbb{M}_{a}(k)\) and \(Y_1,\ldots,Y_m\in\mathbb{M}_{b}(k)\)
Find: F invertible and \((\forall i)(X_i F=FY_i)\)
Same algebra, different generators.
\(K_1=\left\langle I_2,\begin{bmatrix} 0 & 1\\ -1 & 0 \end{bmatrix}\right\rangle\)
\(K_2=\left\langle I_2,\begin{bmatrix} 1 & 1\\ -1 & 1 \end{bmatrix}\right\rangle\)
So \(K_1=K_2\) and every \(K_1\)-vector space is a \(K_2\)-vector space, but
\(\left(\begin{bmatrix} 0 & 1\\ -1 & 0\end{bmatrix}\otimes I_a\right)F=F\left(\begin{bmatrix} 1 & 1\\-1 & 1\end{bmatrix}\otimes I_b\right)\)
has no nonzero solutions!
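The claim can be verified numerically for \(a=b=1\) (a sketch: \(J\) has eigenvalues \(\pm i\) while \(I+J\) has eigenvalues \(1\pm i\), so the Sylvester equation \(JF=FM\) forces \(F=0\)):

```python
import numpy as np

J = np.array([[0., 1.], [-1., 0.]])  # generator of K1; J^2 = -I
M = np.array([[1., 1.], [-1., 1.]])  # generator of K2; note M = I + J, so K1 = K2

# J F = F M  <=>  (J kron I - I kron M^T) vec(F) = 0  (row-major vec)
A = np.kron(J, np.eye(2)) - np.kron(np.eye(2), M.T)
print(4 - np.linalg.matrix_rank(A))  # 0: the nullity is zero, so only F = 0 intertwines
```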
(A form of Hidden Partition Problem.)
\(K_1=K_2\not\rightarrow {_{K_1} K_1}={_{K_2} K_2}\)
In fact in this model of modules these are not even comparable by isomorphism.
\(A,B: \mathbb{M}_{a\times b}(\mathbb{Z}/2[t])\)
Misunderstanding == at any level destroys the test, e.g.:
Equality is tiered; often the mistake/limits were made deep down inside something you don't control.
Order of basis matters. Both spaces were designed to use "lex-least", but one was right-to-left, the other left-to-right.
(Real bug was with \(\mathbb{F}^n\wedge \mathbb{F}^n\) and the skew matrices.)
Fix? Have to trace maps from the domain -- slower, awkward, easy to get wrong.
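The ordering trap is easy to reproduce (a toy illustration; the two "lex-least" conventions here are stand-ins for the real ones):

```python
import numpy as np

a, b = 2, 3

def e(n, i):
    v = np.zeros(n)
    v[i] = 1.0
    return v

# Left-to-right convention: e_i kron e_j sits at coordinate i*b + j (numpy's kron order).
# Right-to-left convention: the same e_i kron e_j would sit at coordinate j*a + i.
for i in range(a):
    for j in range(b):
        assert np.kron(e(a, i), e(b, j))[i * b + j] == 1.0

# Same basis vector, different coordinates under the two conventions, e.g. (i, j) = (1, 0):
print(1 * b + 0, 0 * a + 1)  # 3 1
```

Identical spaces enumerated under the two conventions get permuted coordinates, so a coordinate-wise equality or isomorphism test silently fails.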
By fixing coordinate bases \(\mathbb{F}^a\otimes \mathbb{F}^b=\langle e_i\otimes e_j \mid i\in [a], j\in [b]\rangle\) and \(\mathbb{M}_{a\times b}(\mathbb{F})=\langle E_{ij}\mid i\in [a],j\in [b]\rangle\) we identify these two maps.
But it failed! Why?
These are equal 1-dimensional vector spaces, but still non-isomorphic?! How!
Magma System
> C1 := ...; C2 := ...;
> C1 eq C2;
true
> IsIsomorphic(C1, C2);
false
Essential: Lets us substitute reliably
Motivated by the contrapositive: \(x\neq y\) must have a proof: \((\exists f)(f(x)\neq f(y))\). Constructivists need to be careful here.
Referential Transparency? See Below.
When serious, ask real experts -- philosophers; start here:
https://www.iep.utm.edu/referential-opacity/
...and of course all this is predicated on understanding what a function is.
Let's assume \(\lambda\)-calculus or combinators. In particular not set-theoretic functions, as we will need higher-order functions soon.
What?!
Functions are substitution rules.
\(\lambda x.M\) indicates what letter in \(M\) is to be substituted.
Some drift to notation \(x\mapsto M\)
\(M[5/x]\) result of substituting \(x\) with 5.
Care taken to avoid confusion with reuse of variable name.
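That care is what a substitution function must implement; a minimal Python sketch (toy term encoding of my own; full capture-avoiding substitution would also rename bound variables):

```python
# λx.M as a substitution rule: toy terms are
# ('var', name), ('con', constant), ('lam', name, body), ('app', fun, arg).

def subst(term, x, value):
    """Compute M[value/x]: replace free occurrences of variable x in term by value."""
    kind = term[0]
    if kind == 'var':
        return value if term[1] == x else term
    if kind == 'con':
        return term
    if kind == 'lam':
        y, body = term[1], term[2]
        if y == x:
            return term               # x is rebound here: inner occurrences are a different x
        return ('lam', y, subst(body, x, value))
    return ('app', subst(term[1], x, value), subst(term[2], x, value))

M = ('app', ('var', 'f'), ('var', 'x'))
print(subst(M, 'x', ('con', 5)))      # ('app', ('var', 'f'), ('con', 5)), i.e. M[5/x]
```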
No Domains, No codomains.
Fixed list of axiomatic primitive functions, usually \(I, K, S\).
Functions are strings of letters from the primitive list.
No Domains, No codomains.
No formulas, instead the model for primitives gives meaning.
Functions are substitution rules.
Computation: \(\beta\)-reduction, i.e.
given a list of substitution rules,
rewrite into a normal form, e.g. left-most, outer-most.
Functions are strings of letters from the primitive list.
Computation: Given a model for the primitives, evaluate the primitives.
Canonical model: I, K, S can be given with \(\lambda\)-calculus.
Practical model: list of Intel/AMD etc. Chipset instructions.
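The canonical model can itself be sketched with closures (Python standing in for the \(\lambda\)-calculus model; \(I\) is definable as \(S\,K\,K\)):

```python
S = lambda f: lambda g: lambda x: f(x)(g(x))   # S f g x = f x (g x)
K = lambda x: lambda y: x                      # K x y = x
I = lambda x: x                                # primitive I, for comparison

skk = S(K)(K)                                  # S K K x = K x (K x) = x
print(skk(42) == I(42))                        # True
```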
Functions as substitutions/abstractions make sense even in high-order logic, e.g. functors between categories that are not small.
...and we will need types.
Simple Types (polymorphic Curry style):
Just a meta-theory of rules to impose on consistent type annotation. A model of these rules, say in \(\lambda\)-calculus, gives semantic meaning to "function".
Many other types needed
\(\prod_{a:A}B_a\)
\(A+B\)
W-types
Problem: the function \(\Gamma:K\mapsto {_K {\rm Mod}}\) disobeys Leibniz.
\(\Gamma:{\sf Field}\to {\sf Cat}\), i.e. a map of a category to a 2-category...a higher order function.
This universe problem remains somewhat unsettled, but we will ignore it for now.
== : (a,b:type) -> (x:a) -> (y:a) -> bool
x == x = true
_ == _ = false
Semantics: true if equal, false if not
== : {a,b:type} -> (x:a) -> (y:b ) -> type
MakeEq : (x:a) -> x==x
Semantics: x==y is the class of evidence accepted to believe x=y
== : {a,b:type} -> (x:a) -> (y:a) -> bool
== : {a,b:type} -> (x:a) -> (y:b ) -> type
Usage
if ( x == y ) ...
so at run time decide where to go
Usage
f : (K1:Ring) -> (K2:Ring) -> (proof:K1==K2) -> Module
can't even call f without equality, and the proof could be used to fashion the module consistently
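A toy Python sketch of this signature (names hypothetical; real proof-carrying equality needs dependent types): the only constructor is refl, and the proof's mapping is what fashions the module consistently.

```python
class Eq:
    """Evidence of equality; construct only via refl. `mapping` witnesses the identification."""
    def __init__(self, lhs, rhs, mapping):
        self.lhs, self.rhs, self.mapping = lhs, rhs, mapping

def refl(x):
    return Eq(x, x, lambda v: v)               # the canonical proof x == x

def module_over(K1, K2, proof):
    # cannot even be called without a proof : K1 == K2; its mapping translates
    # scalars written over K1 into scalars written over K2, consistently
    assert proof.lhs is K1 and proof.rhs is K2
    return lambda scalar: proof.mapping(scalar)

K = "k[t]/(t^2+1)"
m = module_over(K, K, refl(K))
print(m(7))                                    # 7: refl's mapping is the identity
```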
Need to transfer a proof \(p:K_1==K_2\) to a proof \({_{K_1} K_1}={_{K_2} K_2}\)
The relevant type family is
\(\hom({_{K_1} V_1},{_{K_2} V_2}):{\sf type}\)
which really means
\(\hom:\prod_{K_1,K_2:{\sf Field}}\prod_{p:K_1=K_2} {\sf type}^{K\text{-}{\sf Vect}\times K\text{-}{\sf Vect}}\)
And we know
\(\hom({_K V_1}, {_K V_2})=\{F\in\mathbb{M}_{a\times b}(k)\mid (\forall i)(X_iF=FY_i)\}\)
I.e. We have
\(C:\prod_{x,y:A}\prod_{p:x=y} type\)
and we know
\(c:C(x,x,Reflexive(x))\)
Martin-Lof Identity type says this inductively defines the whole type family, namely
\(\hom(_{K1} V1,_{K2} V2)=C(K1,K2,p:K1=K2)\)
is determined so as to agree when \(p=refl(K)\).
It is clear what it will do
\(\hom(_{K1} V1,_{K2} V2)=C(K1,K2,p:K1=K2)= \{ F\mid XF=Fp(X)\}\)
where \(p:K_1\to K_2\) is an explicit mapping.
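The transport this slide describes is literally the eliminator of the identity type; a minimal Lean 4 sketch (`transport` specializes the J rule to the family C):

```lean
-- A family C over A is carried along any proof p : x = y;
-- by the J rule its action is determined by the case p = rfl.
def transport {A : Type} (C : A → Type) {x y : A} (p : x = y) (c : C x) : C y :=
  p ▸ c

-- At p = rfl this is the identity, matching "agree when p = refl(K)".
example {A : Type} (C : A → Type) (x : A) (c : C x) :
    transport C rfl c = c := rfl
```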
Identity types generalize to any equivalence relation, e.g. isomorphism.
Generalized Module Iso Problem.
Given: \(X_1,\ldots,X_m\in \mathbb{M}_a(k)\) and \(Y_1,\ldots,Y_n\in\mathbb{M}_b(k)\)
Find: F, \(F^{-1}\langle X_1,\ldots,X_m\rangle F=\langle Y_1,\ldots,Y_n\rangle\)
Thm. (Brooksbank-W.) Solved module isomorphism with no distinguished generating set for rings that are cyclic, i.e. \(K\cong k[t]/(a(t))\).
Builds on work of Kayal, Lenstra, Kantor, and Brooksbank-Luks.
Coro. (Brooksbank-W.) Can now properly solve isomorphism of finite-dimensional vector spaces over finite fields and number fields.
What of other contexts, like Lie?
Thm. (Grochow) Conjugacy of Lie matrix algebras is Graph Isomorphism hard.
Even the associative case is limited to cyclic rings and nontrivial.
\(x=y\to x\cong y\) is an isomorphism.
I.e. why make some isos "=" and others not? Seems arbitrary.
One reason: no great models of this yet, possibly one called "cubical" but very slow.
Axiom of "Univalence"
\(x=y\to x\cong y\) is an isomorphism.
Perhaps a distinction between the two that may matter: complexity!
Consider instead hierarchy. Start with judgemental =
\(x=y\hookrightarrow_P x\cong_P y\hookrightarrow_{NP} x\cong_{NP} y\hookrightarrow_{\Sigma_2} x\cong_{\Sigma_2} y... \)
This could axiomatically imply:
\(x\cong_{\Sigma_2} y\rightarrow_{\Sigma_2} x\cong_{NP} y\rightarrow x\cong_{P} y\rightarrow x=y\)
I.e. you have an axiom schema filtered through complexity
With a better type system the problem never exists, even Haskell's problem would vanish.
And this should all be done by the compiler,
but something else is missing.
We should want an identity package because...