Householder-based algorithm for the general eigenvalue problem
Luis Manuel Román García
ITAM, 2018
Presentation Overview
- Problem description
- QZ traps and pitfalls
- The Hessenberg reduction step
- Conclusion
Problem description
What is a hard problem?
A quick glance through SIMAX, which publishes on the order of 60 papers per year, shows that roughly 40% of them concern eigenvalue problems, and this proportion likely holds, more or less, across numerical linear algebra venues generally.
A harder problem
The generalized eigenvalue problem
Given A, B ∈ ℝ^(n×n), find scalars λ and nonzero vectors x such that Ax = λBx.
Steps:
1.- Reduce B to upper triangular (QR)
2.- Reduce A to Hessenberg form
3.- Reduce A to quasitriangular form
4.- Compute the eigenvalues
5.- Compute the eigenvectors
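The five steps above are what LAPACK's QZ driver carries out internally; as a sketch, the whole sequence can be exercised through SciPy (the function names here are SciPy's, not this thesis' code):

```python
import numpy as np
from scipy.linalg import qz, eig

rng = np.random.default_rng(7)
n = 5
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# Steps 1-3 collapsed into one LAPACK-backed call:
# Q^T A Z = AA (quasitriangular), Q^T B Z = BB (upper triangular)
AA, BB, Q, Z = qz(A, B, output='real')

# Steps 4-5: eigenvalues and right eigenvectors of the pencil (A, B)
w, V = eig(A, B)
```

Each computed pair satisfies A v = λ B v; when B is nonsingular this agrees with the eigenvalues of B⁻¹A, but QZ never forms that inverse.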
QZ traps and pitfalls
1.- Infinite eigenvalues
2.- Vanishing entries
3.- Computationally intensive
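Pitfall 1 is easy to provoke: whenever B is singular, the characteristic polynomial det(A − λB) drops degree and the missing eigenvalues move to infinity. A minimal SciPy illustration (the matrices are chosen here purely for the example):

```python
import numpy as np
from scipy.linalg import eig

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[1.0, 0.0],
              [0.0, 0.0]])   # singular B

# det(A - lam*B) = -4*lam - 2 has degree 1, not 2, so the pencil has
# one finite eigenvalue (lam = -1/2) and one infinite eigenvalue
w = eig(A, B, right=False)
```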
The Hessenberg reduction step
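The reduction itself builds H = Qᵀ A Q by zeroing one column at a time below the first subdiagonal with Householder reflectors. A minimal NumPy sketch, with the helper name my own rather than this thesis' code (Q is not accumulated here, only H is formed):

```python
import numpy as np

def hessenberg_householder(A):
    """Reduce A to upper Hessenberg form H = Q^T A Q via Householder reflectors."""
    H = np.array(A, dtype=float, copy=True)
    n = H.shape[0]
    for k in range(n - 2):
        x = H[k+1:, k]
        v = x.copy()
        v[0] += np.copysign(np.linalg.norm(x), x[0])  # avoid cancellation
        norm_v = np.linalg.norm(v)
        if norm_v == 0.0:
            continue                       # column already reduced
        v /= norm_v
        # Apply P = I - 2 v v^T from the left (zeroes H[k+2:, k]) ...
        H[k+1:, k:] -= 2.0 * np.outer(v, v @ H[k+1:, k:])
        # ... and from the right, preserving the eigenvalues (similarity)
        H[:, k+1:] -= 2.0 * np.outer(H[:, k+1:] @ v, v)
    return H
```

Because each reflector is applied on both sides, H is orthogonally similar to A, so the eigenvalues are untouched while all entries below the first subdiagonal vanish.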
Second order methods
Under some regularity assumptions, the best we can expect is superlinear to quadratic convergence.
Best case scenario
In an online setting, regret grows as O(log T).
The trade-offs of second order methods:
- Faster convergence rate
- Embarrassingly parallel
- Exploit curvature information
- Well suited to highly varying functions
- Higher per-iteration cost
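The convergence-rate bullet is easy to see on a one-dimensional problem. A sketch, with the objective f(x) = eˣ − 2x chosen here only for illustration (its unique minimizer is x* = ln 2):

```python
import math

def newton_minimize(x, iters=6):
    """Newton's method on f(x) = exp(x) - 2x; returns final iterate and error history."""
    errors = []
    for _ in range(iters):
        grad = math.exp(x) - 2.0   # f'(x)
        hess = math.exp(x)         # f''(x) — the curvature a first-order method ignores
        x -= grad / hess           # Newton step
        errors.append(abs(x - math.log(2.0)))
    return x, errors

x_star, errs = newton_minimize(1.0)
```

Each error is roughly the square of the previous one, the quadratic convergence referred to above, at the price of evaluating (and in higher dimensions, factoring) the Hessian every iteration.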
2010-2016
- Martens is the first to successfully train a deep convolutional neural network with L-BFGS.
- Sutskever successfully trains a recurrent neural network with a generalized Gauss-Newton algorithm.
- Bengio achieves state-of-the-art results training recurrent networks with second order methods.
Conclusion
Cache Friendly QZ
By Luis Roman