Fully data-driven ROM approach: Dynamic Mode Decomposition (DMD)
Presentation by Pavel Temirchev
Assume we have a dynamical system: $\frac{dx}{dt} = f(x, t), \quad x \in \mathbb{R}^n$
The dynamics $f$ is either unknown or given as a black box.
The dimensionality $n$ is huge.
We are interested in: predictions of the future states $x(t)$
We can obtain data from the process
Approximate the continuous dynamics by a discrete-time linear process: $x_{t+1} = A x_t$
We call $x_t$ a snapshot.
Multidimensional data can be unraveled into a vector of size $n$.
Let's collect the snapshot matrices as the data from the observed process: $X = [x_0, x_1, \ldots, x_{m-1}]$, $X' = [x_1, x_2, \ldots, x_m]$
So, the dynamics becomes: $X' \approx A X$
One can compute $A$ as follows: $A = X' X^{\dagger}$,
where $X^{\dagger}$ is the pseudo-inverse of $X$.
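For a problem small enough to form $A$ explicitly, this least-squares fit is a couple of lines of numpy. A minimal sketch (the array name `snapshots` and the toy data are assumptions, not from the slides):

```python
import numpy as np

# Assumed: `snapshots` is an (n, m+1) array whose columns are the states x_0 ... x_m.
snapshots = np.random.rand(50, 100)   # toy data, n = 50, just for illustration

X = snapshots[:, :-1]        # columns x_0 ... x_{m-1}
X_prime = snapshots[:, 1:]   # columns x_1 ... x_m

# Naive estimate A = X' X^+ (only feasible when n is small).
A = X_prime @ np.linalg.pinv(X)
print(A.shape)               # (n, n)
```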
However, $A$ is an $n \times n$ matrix, so for huge $n$ it cannot be computed or even stored explicitly.
Let us compute the Singular Value Decomposition of $X$: $X = U \Sigma V^*$
(also known as Proper Orthogonal Decomposition in this snapshot setting)
Truncate to the first $r$ largest singular values: $X \approx U_r \Sigma_r V_r^*$
Instead of computing the full $A$,
we can compute its projection onto the low-rank POD basis: $\tilde{A} = U_r^* A U_r = U_r^* X' V_r \Sigma_r^{-1}$
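Continuing the numpy sketch above, the reduced operator can be formed without ever building the full $A$ (the truncation rank `r` is an assumed choice):

```python
# Truncated SVD of X (POD basis) and the projected operator A_tilde.
r = 10
U, s, Vh = np.linalg.svd(X, full_matrices=False)
U_r, s_r, V_r = U[:, :r], s[:r], Vh[:r, :].conj().T

# A_tilde = U_r^* X' V_r Sigma_r^{-1}; dividing by s_r scales each column.
A_tilde = (U_r.conj().T @ X_prime @ V_r) / s_r
print(A_tilde.shape)   # (r, r)
```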
So, now we are interested in predictions of future states: $x_t \approx A^t x_0$
But we can't do it with the truncated $\tilde{A}$ directly: it acts on the $r$-dimensional POD coordinates, not on the full state!
Luckily, there is a way to make such predictions without ever computing the full $A$.
Reminder: the eigenvalue decomposition of $\tilde{A}$ is $\tilde{A} W = W \Lambda$, where $W$ holds the eigenvectors and $\Lambda$ is the diagonal matrix of eigenvalues.
Both the eigenvalues $\Lambda$ and the eigenvectors $\Phi = X' V_r \Sigma_r^{-1} W$ of the full operator $A$
can be computed
relatively easily.
We can make predictions as follows: $x_t \approx \Phi \Lambda^t b$, where $b = \Phi^{\dagger} x_0$.
$\Lambda$ is diagonal, so $\Lambda^t$ is just element-wise exponentiation!
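Continuing the sketch, the eigendecomposition of the small operator gives the modes and a cheap way to roll the system forward (recovering the amplitudes $b$ from $x_0$ via a pseudo-inverse is one common choice, an assumption here):

```python
# Eigendecomposition of the small (r x r) operator: A_tilde W = W Lambda.
eigvals, W = np.linalg.eig(A_tilde)

# Exact DMD modes: Phi = X' V_r Sigma_r^{-1} W (columns are the dynamic modes).
Phi = (X_prime @ V_r / s_r) @ W

# Predict the state t steps ahead from x_0; Lambda^t is an element-wise power.
x0 = snapshots[:, 0]
b = np.linalg.pinv(Phi) @ x0           # mode amplitudes
t = 5
x_t = (Phi @ (eigvals ** t * b)).real  # imaginary parts should be ~0 for real data
```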
These eigenvectors are called Dynamic Modes.
Dynamic modes (columns of $\Phi$) represent spatio-temporal patterns of your data.
You can plot them:
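For example, if each snapshot is a flattened 2D field, a mode can be reshaped back to the grid and shown as an image (the grid shape below is a hypothetical choice, continuing the sketch above):

```python
import matplotlib.pyplot as plt

# Reshape the leading dynamic mode back to an assumed 2D grid and plot it.
nx, ny = 5, 10                  # hypothetical grid with nx * ny == n
mode = Phi[:, 0].real.reshape(nx, ny)
plt.imshow(mode, cmap="RdBu")
plt.title("Dynamic mode 0")
plt.colorbar()
plt.show()
```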
Eigenvalues can be used for the
stability analysis of the corresponding dynamic mode.
We can plot them too:
if $|\lambda| < 1$, the mode is decaying
if $|\lambda| = 1$, the mode is stable
if $|\lambda| > 1$, the mode is growing
if $\mathrm{Im}(\lambda) \neq 0$, the mode oscillates
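Continuing the sketch, plotting the eigenvalues against the unit circle makes the decaying / stable / growing split visible at a glance:

```python
# Plot DMD eigenvalues in the complex plane together with the unit circle.
theta = np.linspace(0.0, 2.0 * np.pi, 200)
plt.plot(np.cos(theta), np.sin(theta), "k--", label="unit circle")
plt.scatter(eigvals.real, eigvals.imag, color="tab:red", label="eigenvalues")
plt.xlabel("Re($\\lambda$)")
plt.ylabel("Im($\\lambda$)")
plt.axis("equal")
plt.legend()
plt.show()
```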
No way I can represent wells in this model...
But you can
You need DMD with control: $x_{t+1} = A x_t + B u_t$
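A minimal sketch of the idea behind DMD with control for small $n$, where the state matrix $A$ and the control matrix $B$ are fit jointly from data; the control snapshots `U_ctrl` (e.g. well controls) are a hypothetical input here:

```python
# DMD with control (naive, small-n version): X' ≈ A X + B U  =>  X' ≈ [A B] [X; U].
U_ctrl = np.random.rand(3, X.shape[1])      # hypothetical control inputs u_0 ... u_{m-1}
Omega = np.vstack([X, U_ctrl])              # stacked state-control snapshot matrix
G = X_prime @ np.linalg.pinv(Omega)         # G = [A B]
A_c, B_c = G[:, :X.shape[0]], G[:, X.shape[0]:]
# One-step prediction under a given control: x_{t+1} ≈ A_c @ x_t + B_c @ u_t
```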
| Property | Neural Network ROM | Dynamic Mode Decomposition | POD-Galerkin projection |
|---|---|---|---|
| Parametric dynamical systems | yes | not really | not really |
| Can work with unknown dynamics | yes | yes | no |
| Easy to implement | no | yes | not really |
| Training speed | slow | moderate | moderate |
| Scalability with spatial dimensionality | yes | no | no |
| Interpretability | not really | yes | not really |