Fast and scalable metamodelling of reservoir flows via machine learning techniques
Presenter:
Pavel Temirchev
Our team:
Dmitry Koroteev, Maxim Simonov, Denis Orlov,
Pavel Temirchev, Egor Illarionov, Dmitry Voloskov,
Ruslan Kostoev, Anna Gubanova, Mohammad Ebadi
Objectives and Tasks
Objectives:
- Create a fast and scalable reservoir model based on machine learning algorithms:
- 3D, 3-phase flow
- Arbitrary reservoir and wells' geometry
- Generalizable between reservoirs
- Develop history-matching and schedule optimization modules suitable for the model.
Tasks:
- Formulate, generate and calculate training data
- Develop load / store / dump framework for hydrodynamical models
- Develop the architecture of the ML reservoir model, implement several algorithms
- Train and test different models
- Develop History-Matching procedure as a gradient descent process
- Develop schedule optimization for the model in a Reinforcement-Learning-like way
Hydrodynamical Reservoir Simulation
Finite-Differences modelling
The standard approach (ECLIPSE, tNavigator, OPM Flow) advances the solution forward in time given:
- initial reservoir state: pore pressure and saturation fields
- porosity, permeability, relative permeability and PVT tables
- control applied on wells: BHP, injection rates
The computational complexity grows with the number of computational cells and is dominated by the cost of matrix inversion at each time step (see the sketch below).
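A minimal sketch of where this cost comes from, assuming the standard fully-implicit formulation: each time step solves the nonlinear mass-balance system \(R(x^{n+1}) = 0\) by Newton iterations, and every iteration requires a sparse linear solve
\[ J \, \delta x = -R(x^{(k)}), \qquad J \in \mathbb{R}^{N_c N_p \times N_c N_p}, \]
where \(N_c\) is the number of active cells and \(N_p\) is the number of phases, so the cost of the solve grows superlinearly with the cell count.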
Prior art
ROMs:
- Dynamic Mode Decomposition
  P.J. Schmid (2010). "Dynamic mode decomposition of numerical and experimental data"
  J.L. Proctor, S.L. Brunton, J.N. Kutz (2014). "Dynamic mode decomposition with control"
- Galerkin-POD projection
  T. Lassila, A. Manzoni, A. Quarteroni, G. Rozza (2014). "Model order reduction in fluid dynamics: challenges and perspectives"
  S. Chaturantabut, D.C. Sorensen (2010). "Nonlinear model reduction via discrete empirical interpolation"
- Deep Residual Recurrent Neural Networks
  J.N. Kani, A.H. Elsheikh (2018). "Reduced order modeling of subsurface multiphase flow models using deep residual recurrent neural networks"
- Embed to Control
  M. Watter, J. Springenberg, J. Boedecker, M. Riedmiller (2015). "Embed to control: A locally linear latent dynamics model for control from raw images"
  Z.L. Jin, Y. Liu, L.J. Durlofsky (2020). "Deep-learning-based surrogate model for reservoir simulation with time-varying well controls"
- LSTM + Variational Autoencoder model
  P. Temirchev, M. Simonov, R. Kostoev, E. Burnaev, I. Oseledets, A. Akhmetov, A. Margarit, A. Sitnikov, D. Koroteev (2020). "Deep neural networks predicting oil movement in a development unit"
Supervised Machine Learning
"Cat"
"Cat"
"Dog"
"Giraffe"
Object
Target variable
The training set of reservoirs
Object - a reservoir, described by:
- initial pore pressure and saturations (initial State)
- porosity and permeability (Rock)
- PVT, RPP (Tables)
- reservoir geometry (Grid)
- wells and their working schedule (Wells)
Target variable:
- forecast of pore pressure and saturations (States)
- forecast of the production rates (Rates)
Problem: how to find the target variable for an object?
Solution: let us compute it with the commercial simulator (tNavigator); a minimal data-layout sketch follows.
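One way such a training pair could be laid out in the dataset framework is sketched below; the names and array shapes are illustrative assumptions, not the actual implementation:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class ReservoirObject:
    """Input features of one training reservoir (illustrative layout)."""
    initial_state: np.ndarray  # pore pressure and saturations, shape (n_fields, nx, ny, nz)
    rock: np.ndarray           # porosity and permeability, shape (n_props, nx, ny, nz)
    tables: dict               # PVT and relative-permeability tables
    grid: np.ndarray           # reservoir geometry / active-cell mask, shape (nx, ny, nz)
    wells: list                # well positions and working schedule (BHP, injection rates)

@dataclass
class ReservoirTarget:
    """Target variables computed with the commercial simulator (tNavigator)."""
    states: np.ndarray         # pressure/saturation forecast, shape (T, n_fields, nx, ny, nz)
    rates: np.ndarray          # production rates per well, shape (T, n_wells, n_phases)
```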
Expanding the training set using reservoir randomization
Randomized inputs: porosity fields and wells' control schedules.
Standard transformations are applied during sampling (a sketch follows).
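One plausible randomization step is sketched below; the actual distributions and transformations used in the project are not specified here, so everything in this snippet is an assumption:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def sample_porosity(nx, ny, nz, mean=0.2, std=0.05, corr_len=5.0, rng=None):
    """Sample a spatially correlated porosity field by smoothing white noise."""
    rng = rng or np.random.default_rng()
    field = gaussian_filter(rng.standard_normal((nx, ny, nz)), sigma=corr_len)
    field = (field - field.mean()) / (field.std() + 1e-8)  # normalize the smoothed noise
    return np.clip(mean + std * field, 0.01, 0.4)           # keep physically plausible values

def sample_bhp_schedule(n_wells, n_steps, low=150.0, high=300.0, rng=None):
    """Sample piecewise-constant bottom-hole pressure controls for each well."""
    rng = rng or np.random.default_rng()
    return rng.uniform(low, high, size=(n_wells, n_steps))
```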
NDE-b-ROM metamodel:
Neural Differential Equations based Reduced Order Model
[Scheme: the reservoir state is encoded into a low-dimensional latent space, latent derivatives are integrated forward in time, and the latent trajectory is decoded back to the full-resolution state.]
Encoder (VGG-like autoencoder, LINK):
- Conv3d_3x3, 8ch
- Conv3d_3x3, 16ch, str=2
- Conv3d_3x3, 32ch
- Conv3d_3x3, 32ch, str=2
- Conv3d_3x3, 64ch
- Conv3d_3x3, 64ch
Decoder (transposed convolutions):
- Transp3d_3x3, 64ch
- Transp3d_3x3, 64ch, str=2
- Transp3d_3x3, 32ch
- Transp3d_3x3, 32ch, str=2
- Transp3d_3x3, 16ch
- Transp3d_3x3, 8ch
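A minimal PyTorch sketch of an autoencoder with the channel counts listed above; the number of input fields, padding and activation placement are assumptions:

```python
import torch.nn as nn

def conv(cin, cout, stride=1):
    return nn.Sequential(nn.Conv3d(cin, cout, 3, stride=stride, padding=1), nn.ReLU())

def tconv(cin, cout, stride=1):
    # output_padding restores the spatial size halved by a stride-2 convolution
    return nn.Sequential(
        nn.ConvTranspose3d(cin, cout, 3, stride=stride, padding=1,
                           output_padding=stride - 1),
        nn.ReLU())

n_fields = 4  # assumed number of state fields (e.g. pressure and saturations)

encoder = nn.Sequential(
    conv(n_fields, 8), conv(8, 16, stride=2),
    conv(16, 32),      conv(32, 32, stride=2),
    conv(32, 64),      conv(64, 64))

decoder = nn.Sequential(
    tconv(64, 64),     tconv(64, 64, stride=2),
    tconv(64, 32),     tconv(32, 32, stride=2),
    tconv(32, 16),     tconv(16, 8),
    nn.Conv3d(8, n_fields, 3, padding=1))
```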
Derivatives network (Neural Ordinary Differential Equations, LINK):
- ReLU
- Conv3d_3x3, 32ch
- ReLU
- Conv3d_3x3, 4ch
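A minimal sketch of the latent dynamics idea in the spirit of Neural ODEs; the latent channel count, the concatenation of controls and the use of explicit Euler integration are all assumptions:

```python
import torch
import torch.nn as nn

class LatentDerivative(nn.Module):
    """Convolutional right-hand side dz/dt = f(z, u) acting on the latent grid."""
    def __init__(self, z_channels, u_channels):
        super().__init__()
        self.net = nn.Sequential(
            nn.ReLU(),
            nn.Conv3d(z_channels + u_channels, 32, 3, padding=1),
            nn.ReLU(),
            nn.Conv3d(32, z_channels, 3, padding=1))

    def forward(self, z, u):
        return self.net(torch.cat([z, u], dim=1))

def integrate(f, z0, controls, dt=1.0):
    """Explicit Euler integration of the latent state over the well-control schedule."""
    z, trajectory = z0, []
    for u in controls:              # one control tensor per time step
        z = z + dt * f(z, u)
        trajectory.append(z)
    return torch.stack(trajectory)  # shape (T, batch, z_channels, nx, ny, nz)
```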
Neural Networks training
Minimisation problem: find the parameters that minimise the discrepancy between the metamodel forecast and the reference simulation (written out below).
\(\phi\) - a vector of neural network parameters
\(\hat{s}_{0:T}\) - the solution obtained as follows:
1. encoding of the initial state into the latent space
2. latent space forecast: integration of the latent derivatives in time
3. decoding of the latent trajectory back to the full-resolution state
The minimisation is carried out by stochastic optimization: backpropagation + ADAM; the latent derivatives are given by a neural network.
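Written out explicitly, with \(E_\phi\), \(f_\phi\) and \(D_\phi\) denoting the encoder, the latent derivative network and the decoder (this notation is assumed here):
\[ \phi^{*} = \arg\min_{\phi} \sum_{t=0}^{T} \big\| \hat{s}_{t} - s_{t} \big\|^{2}, \qquad z_{0} = E_{\phi}(s_{0}), \quad \dot{z}_{t} = f_{\phi}(z_{t}, u_{t}), \quad \hat{s}_{t} = D_{\phi}(z_{t}), \]
where \(s_{t}\) are the reference states computed by tNavigator and \(u_{t}\) are the well controls.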
- Almost as fast as the standard convolutional layer
- Only active cells are stored in memory
- Non-active cells participate in the calculations only partially
- The distances between cells can be taken into account by adding them to the layer's weights
Convolutions on regular graphs
[Illustration: an irregular reservoir grid in which only the active cells (marked A) are stored and convolved.]
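One simple way to obtain this behaviour is a mask-aware convolution, sketched below; the actual layer used in the project may differ:

```python
import torch.nn as nn
import torch.nn.functional as F

class MaskedConv3d(nn.Module):
    """3D convolution that ignores non-active cells and renormalizes by local coverage."""
    def __init__(self, cin, cout):
        super().__init__()
        self.conv = nn.Conv3d(cin, cout, 3, padding=1, bias=False)

    def forward(self, x, active):          # active: (batch, 1, nx, ny, nz), values in {0, 1}
        y = self.conv(x * active)          # zero out non-active cells before convolving
        coverage = F.avg_pool3d(active, 3, stride=1, padding=1)  # active fraction per window
        y = y / coverage.clamp(min=1e-6)   # renormalize so partial windows are not damped
        return y * active                  # keep outputs only on active cells
```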
Computation of production rates
We use a fully physics-based approach for the production rates calculation: rates are computed from the predicted pressure and saturation fields and the well controls (the inflow relation is sketched below).
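The exact inflow formula is not reproduced in this text; a standard physics-based choice, assumed here, is a Peaceman-type relation
\[ q_{p,j} = \mathrm{WI}_{j} \, \lambda_{p}(s_{j}) \, \big( p_{j} - p_{\mathrm{bh}} \big), \]
where \(q_{p,j}\) is the rate of phase \(p\) in well block \(j\), \(\mathrm{WI}_{j}\) is the well index, \(\lambda_{p}\) is the phase mobility at the block saturation, \(p_{j}\) is the block pressure predicted by the metamodel, and \(p_{\mathrm{bh}}\) is the bottom-hole pressure.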
[Figure: the metamodel tightly approximates commercial solutions on oil field X: pressure and saturation maps (view from above, averaged along the Z axis, well locations marked) and production rates.]
Experimental results (on toy example)
[Figure: pressure and oil saturation fields on a toy example, tNavigator vs. NDE-b-ROM.]
Enhanced History Matching
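As stated in the tasks, the history-matching procedure is developed as a gradient descent process: because the metamodel is differentiable end-to-end, uncertain reservoir properties can be adjusted directly by backpropagation. A minimal sketch, with hypothetical function and variable names:

```python
import torch

def history_match(metamodel, initial_rock, wells, observed_rates, n_iters=200, lr=1e-2):
    """Adjust rock properties so the metamodel reproduces observed production rates."""
    rock = initial_rock.clone().requires_grad_(True)  # e.g. a log-permeability field
    optimizer = torch.optim.Adam([rock], lr=lr)
    for _ in range(n_iters):
        optimizer.zero_grad()
        predicted_rates = metamodel(rock, wells)      # differentiable forward run
        misfit = torch.mean((predicted_rates - observed_rates) ** 2)
        misfit.backward()                             # gradients via backpropagation
        optimizer.step()
    return rock.detach()
```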
Technical details
Well-known API: the model is set up from .DATA formatted files (RUNSPEC, GRID, PROPS, SOLUTION, SCHEDULE sections) created either by hand or in a model designer.
Can be used as:
- a python library (usage sketch below)
- separate software with a GUI
Supports CUDA and PyTorch computations.
Provides output in diverse formats:
- ECLIPSE binary
- PyTorch / NumPy
- other...
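A purely illustrative sketch of the python-library mode; every identifier below is a hypothetical placeholder, not the framework's actual API:

```python
# Hypothetical usage: load an ECLIPSE-style .DATA model, run the metamodel on GPU,
# and export the forecast. All names are illustrative assumptions.
from nde_b_rom import Metamodel   # hypothetical package and class names

model = Metamodel.from_data_file("MODEL.DATA", device="cuda")
forecast = model.run(years=40)

forecast.to_eclipse_binary("RESULTS")   # ECLIPSE binary output
arrays = forecast.to_numpy()            # PyTorch / NumPy arrays for further analysis
```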
Computational time results
Compared with the commercial simulator tNavigator:
- NDE-b-ROM: 1 GPU, 20 sec
- tNavigator: 40 CPU, 2400 sec
Tested on:
- a large real oil reservoir with more than 3,000,000 active cells
- non-linear well trajectories, fish-bones
- 40 years of simulation
Publications
Will participate in the SPE RPT Conference this year as a co-author
Conclusions
- We constructed the scalable reservoir model NDE-b-ROM based on Deep Learning techniques
- The speed-up is around 100x
- A framework for working with datasets was developed
- An API similar to that of conventional hydrodynamical simulators was developed
- History-matching and optimization procedures are under research