Russ Tedrake
Roboticist at MIT and TRI
Galileo, Kepler, Newton, Coulomb, Hooke...
were data scientists.
They fit very simple models to very noisy data.
They gave us a rich class of parametric models that we can fit to new data.
What if Newton had deep learning...?
Galileo's notes on projectile motion
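A minimal sketch (my own illustration, not from the slides) of what "fitting a very simple model to very noisy data" looks like: recovering the single parameter of Galileo's law of falling bodies, d = ½gt², from noisy measurements by least squares.

```python
import numpy as np

# Hypothetical illustration: fit the parabolic law d = (1/2) g t^2
# to noisy distance measurements via least squares.
rng = np.random.default_rng(0)
t = np.linspace(0.1, 2.0, 50)                 # measurement times [s]
g_true = 9.81                                 # the "unknown" parameter
d = 0.5 * g_true * t**2 + rng.normal(0, 0.05, t.shape)  # noisy distances

# The model is linear in g: solve min_g || (t^2 / 2) g - d ||^2
A = (t**2 / 2).reshape(-1, 1)
g_hat, *_ = np.linalg.lstsq(A, d, rcond=None)
print(f"estimated g = {g_hat[0]:.2f} m/s^2")
```

One simple parameter, fifty noisy data points, and the estimate lands very close to 9.81, which is the sense in which these early physicists were data scientists.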
System
State-space
Auto-regressive (e.g. ARMAX)
input
output
state
noise/disturbances
parameters
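Using the symbols on this slide (u input, y output, x state, w noise/disturbances, θ parameters), the two model classes can be sketched in the standard form, with the same f and g that appear on the later slides:

```latex
% State-space:
x_{n+1} = f(x_n, u_n, w_n, \theta), \qquad y_n = g(x_n, u_n, \theta)
% Auto-regressive (e.g. ARMAX):
y_{n+1} = f(y_n, \ldots, y_{n-k}, u_n, \ldots, u_{n-k}, w_n, \theta)
```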
System
Auto-regressive (e.g. ARMAX)
Lagrangian mechanics,
Recurrent neural networks (e.g. LSTM), ...
Feed-forward networks (e.g. \(y_n\)= image)
input
output
State-space
input
cost-to-go
Q-functions and value functions are models, too. They try to predict only one output (the cost-to-go).
\[ Q^{\pi}(n, x_n, u_n, \theta) \]
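A hedged sketch of the point that Q-functions are models too: fit a quadratic model of Q^π(x, u; θ) to realized cost-to-go data from rollouts of a fixed policy. The scalar dynamics, cost, and policy below are my own toy choices, not from the talk.

```python
import numpy as np

# Toy setup (assumed): stable scalar dynamics, quadratic cost, fixed policy pi.
rng = np.random.default_rng(1)

def step(x, u):
    return 0.9 * x + u                 # simple stable scalar dynamics

def cost(x, u):
    return x**2 + 0.1 * u**2

def rollout_cost_to_go(x0, u0, pi, horizon=50):
    """Realized cost-to-go from taking u0 in x0, then following pi."""
    total, x, u = 0.0, x0, u0
    for _ in range(horizon):
        total += cost(x, u)
        x = step(x, u)
        u = pi(x)
    return total

pi = lambda x: -0.5 * x                # the fixed policy being evaluated

# Collect (x, u, cost-to-go) data and fit Q(x, u) = [x^2, x*u, u^2] @ theta,
# i.e. treat the Q-function as one more parametric model with one output.
X = rng.uniform(-1, 1, 200)
U = rng.uniform(-1, 1, 200)
J = np.array([rollout_cost_to_go(x, u, pi) for x, u in zip(X, U)])
Phi = np.stack([X**2, X * U, U**2], axis=1)
theta, *_ = np.linalg.lstsq(Phi, J, rcond=None)
print("fitted Q parameters:", theta)
```

Because this toy problem is linear-quadratic, the quadratic features fit the cost-to-go essentially exactly; the same regression viewpoint carries over when the features are replaced by a deep network.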
Today's discussion is about model class:
Should we prefer writing \(f\) and \(g\) using physics or deep networks?
Maybe not so different from
Our physics models are (and have always been) differentiable.
You don't need neural networks. Just the chain rule.
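A hedged sketch (my own toy example, not from the slides) of that claim: "our physics models are differentiable" just means the chain rule applies to the simulator. Below, the sensitivity dx/db of a damped spring-mass simulation with respect to the damping coefficient b is propagated alongside the state, by hand, with no neural-network framework involved.

```python
# Forward-Euler simulation of a unit-mass spring-damper, carrying the
# sensitivity d(state)/db through every step via the chain rule.
def simulate_with_sensitivity(b, k=4.0, x0=1.0, v0=0.0, dt=0.01, steps=200):
    x, v = x0, v0
    dx_db, dv_db = 0.0, 0.0                  # d(state)/db, zero at t=0
    for _ in range(steps):
        a = -k * x - b * v                   # acceleration (unit mass)
        da_db = -k * dx_db - v - b * dv_db   # chain rule through the dynamics
        x, dx_db = x + dt * v, dx_db + dt * dv_db
        v, dv_db = v + dt * a, dv_db + dt * da_db
    return x, dx_db

x_final, grad = simulate_with_sensitivity(b=0.5)

# Sanity check against a finite difference on the same simulator
eps = 1e-6
x_plus, _ = simulate_with_sensitivity(b=0.5 + eps)
print(grad, (x_plus - x_final) / eps)
```

The hand-propagated sensitivity matches the finite difference, which is all "differentiable simulation" requires; autodiff tools merely automate this bookkeeping.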
"All models are wrong, but some are useful" -- George Box
e.g., for
What makes a model useful?
State-space models tend to be more efficient/compact, but require state estimation.
vs.
Auto-regressive
State-space
Perhaps the biggest philosophical difference between traditional physics models and "universal approximators".
The failings of our physics-based models are mostly due to the unreasonable burden of estimating the "Lagrangian state" and parameters.
e.g., for onions, laundry, peanut butter, ...
The failings of our deep models are mostly due to our inability to do efficient/reliable planning, control design, and analysis.
I want the next Newton to come around and to work on onions, laundry, peanut butter...
For participation in the panel of the IFRR Colloquium on the Roles of Physics-Based Models and Data-Driven Learning in Robotics http://ifrr.org/physics-based-data-driven-robotics