Davide Murari
davide.murari@ntnu.no
Theoretical and computational aspects of dynamical systems
HB60
What are neural networks?

They are compositions of parametric functions:
$\mathcal{N}(x) = f_{\theta_k} \circ \cdots \circ f_{\theta_1}(x)$
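The composition above can be sketched in a few lines; a minimal sketch, assuming each $f_\theta$ is an affine map followed by tanh (an illustrative parametrisation, not fixed by the slides):

```python
import numpy as np

def layer(theta, x):
    # One parametric map f_theta(x) = tanh(W x + b); tanh is an
    # illustrative choice of parametrisation, not fixed by the slides.
    W, b = theta
    return np.tanh(W @ x + b)

def network(thetas, x):
    # N(x) = f_{theta_k} o ... o f_{theta_1}(x): apply layers in order
    for theta in thetas:
        x = layer(theta, x)
    return x
```

For example, three random layers on $\mathbb{R}^4$ compose into a map $\mathbb{R}^4 \to \mathbb{R}^4$.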
ResNets
Σ(z)=[σ(z1),...,σ(zn)],
σ:R→R
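A ResNet layer adds a residual update on top of such a componentwise activation; a minimal sketch, assuming the residual block has the form $x + h\,\Sigma(Ax + b)$ (the exact parametrisation is an assumption for illustration):

```python
import numpy as np

def Sigma(z, sigma=np.tanh):
    # Componentwise activation: Sigma(z) = [sigma(z_1), ..., sigma(z_n)]
    return sigma(z)

def resnet_layer(x, A, b, h=0.1):
    # One residual update x_{i+1} = x_i + h * Sigma(A x_i + b);
    # the form of the residual block is an illustrative assumption.
    return x + h * Sigma(A @ x + b)
```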
Neural networks motivated by dynamical systems
$\dot{x}(t) = h(x(t), \theta(t)) =: h_{s(t)}(x(t))$

where $f_i(x) = f(x, \theta_i)$.
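Discretising this ODE with explicit Euler steps and piecewise-constant parameters recovers the ResNet recursion; a minimal sketch, assuming $f(x,\theta) = \tanh(Wx + b)$ as an illustrative vector field:

```python
import numpy as np

def f(x, theta):
    # Vector field f(x, theta) = tanh(W x + b) (illustrative choice)
    W, b = theta
    return np.tanh(W @ x + b)

def euler_flow(x, thetas, dt):
    # Explicit Euler steps of x'(t) = f(x(t), theta(t)) with
    # piecewise-constant parameters theta_i give the ResNet
    # recursion x_{i+1} = x_i + dt * f_i(x_i).
    for theta in thetas:
        x = x + dt * f(x, theta)
    return x
```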
What if I want a network with a certain property?

General idea (with a running example):
1. Choose a property P. Example: P = volume preservation.
2. Find a family $\mathcal{F} = \{X_\theta : \theta \in \mathcal{A}\}$ of vector fields that satisfy P. Example: $X_\theta(x, v) = \begin{bmatrix} \Sigma(Av + a) \\ \Sigma(Bx + b) \end{bmatrix}$.
3. Choose an integrator $\Psi_h$ that preserves P.
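The example vector field is partitioned: $\dot{x}$ depends only on $v$ and $\dot{v}$ only on $x$, so each half-update of a splitting integrator is a shear with Jacobian determinant 1, and their composition preserves volume. A minimal sketch, with $\tanh$ as an illustrative choice of $\sigma$:

```python
import numpy as np

def Sigma(z):
    # Componentwise activation (tanh as an illustrative choice)
    return np.tanh(z)

def Psi_h(x, v, A, a, B, b, h):
    # Splitting integrator for x' = Sigma(A v + a), v' = Sigma(B x + b).
    # Each substep is a shear (Jacobian determinant 1), so their
    # composition is exactly volume preserving.
    x = x + h * Sigma(A @ v + a)   # x-update: v is frozen
    v = v + h * Sigma(B @ x + b)   # v-update: the new x is frozen
    return x, v
```

A finite-difference Jacobian of the map $(x, v) \mapsto \Psi_h(x, v)$ has determinant 1 up to rounding, which is the defining property P.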
Lipschitz-constrained networks

[Figure: panels for different values of m]

$\Sigma(x) = \max\{x, 2x\}$, applied componentwise.

We impose orthogonal weight matrices.
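With an orthogonal weight matrix $Q$ (so $\|Qx - Qy\| = \|x - y\|$) and the 2-Lipschitz activation above, each layer $x \mapsto \Sigma(Qx + b)$ is 2-Lipschitz by construction. A minimal sketch, assuming the orthogonal matrix is drawn via a QR decomposition (the slides do not fix the parametrisation):

```python
import numpy as np

def Sigma(z):
    # Activation from the slide, componentwise; slopes 1 and 2,
    # so it is 2-Lipschitz
    return np.maximum(z, 2 * z)

def orthogonal(n, rng):
    # One common way to draw an orthogonal weight matrix:
    # QR decomposition of a Gaussian matrix (an assumption here)
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return Q

def lip_layer(x, Q, b):
    # Q is an isometry and Sigma is 2-Lipschitz, hence
    # ||lip_layer(x) - lip_layer(y)|| <= 2 ||x - y||
    return Sigma(Q @ x + b)
```

Composing $k$ such layers gives a network whose Lipschitz constant is bounded by $2^k$.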
Adversarial examples

[Figure: image $X$, label: Plane; perturbed image $X + \delta$ with $\|\delta\|_2 = 0.3$, label: Cat]
Then $\mathcal{F}$ can be approximated with flow maps of gradient and sphere-preserving vector fields.
Can we still accurately approximate functions?