
1. Differentiate

  • Fisher analysis / inference accelerated through gradients (HMC, variational inference)
  • Gradient descent (fast optimization)
  • Differential equations

Finite differences: slow and error-prone -> O(n) function evaluations for an n-dimensional gradient!

How to differentiate automatically?

A) Finite differences

f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}
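As a rough sketch (not on the slide), a forward-difference gradient shows where the O(n) cost and the error come from: one perturbed call to f per input dimension, plus truncation/round-off error from the finite step. The helper name, step size h, and test function are illustrative choices:

import numpy as np

def finite_diff_grad(f, x, h=1e-5):
    """Forward-difference gradient: one extra evaluation of f per dimension."""
    x = np.asarray(x, dtype=float)
    grad = np.zeros_like(x)
    f0 = f(x)
    for i in range(x.size):            # O(n) evaluations for n dimensions
        x_h = x.copy()
        x_h[i] += h
        grad[i] = (f(x_h) - f0) / h    # finite step -> truncation error
    return grad

print(finite_diff_grad(lambda x: np.sum(x**2), np.array([1.0, 2.0, 3.0])))
# ~[2., 4., 6.], up to O(h) error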

How to differentiate automatically?

B) Symbolic differentiation

h(x) = f(x) g(x)

h'(x) = f'(x) g(x) + f(x) g'(x)

Expression swell: duplicated computation
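A short SymPy sketch (not in the original deck) of both points above: sp.diff reproduces the product rule, and differentiating a nested product duplicates subexpressions, which is the expression swell problem. The example expression is an illustrative choice:

import sympy as sp

x = sp.symbols('x')
f, g = sp.Function('f'), sp.Function('g')

# The product rule from the slide, derived symbolically
print(sp.diff(f(x) * g(x), x))
# f(x)*Derivative(g(x), x) + g(x)*Derivative(f(x), x)

# Expression swell: the derivative of a nested product repeats subexpressions
expr = (sp.sin(x) * sp.exp(x))**3
d_expr = sp.diff(expr, x)
print(d_expr)                 # sin(x), cos(x), exp(...) appear in several terms
print(sp.count_ops(d_expr))   # operation count grows quickly with nesting depth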

How to differentiate automatically?

C) Autodiff

import numpy as np

def f(x1, x2):
    a = x1 / x2      # a and b are intermediate values: autodiff differentiates
    b = np.exp(x2)   # each primitive op and chains the results together
    return (np.sin(a) + a - b) * (a - b)
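The slide stops at the plain-NumPy definition. As a hedged sketch of what autodiff gives you, here is the same f ported to JAX (an assumed choice, though the jit/pmap/vmap bullets in the next section are JAX features): jax.grad traces the primitive operations and applies the chain rule to them, yielding exact derivatives without an O(n) finite-difference loop and without symbolic expression swell.

import jax
import jax.numpy as jnp

def f(x1, x2):
    a = x1 / x2
    b = jnp.exp(x2)
    return (jnp.sin(a) + a - b) * (a - b)

# Exact gradient with respect to both arguments
df = jax.grad(f, argnums=(0, 1))
print(df(1.5, 0.5))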

2. Accelerate

  • XLA (Accelerated Linear Algebra): a specialized compiler that fuses operations to avoid writing intermediate results to memory
  • Run "numpy" on GPUs/TPUs
  • jit (just-in-time compilation)
  • pmap to execute code in parallel (even across GPUs)
  • vmap: automatic vectorization (jit and vmap are sketched below)
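A minimal sketch of the jit and vmap bullets, assuming JAX (the library is not named on the slide, but jit/pmap/vmap are its API); the function reuses f from the autodiff example and the batch size is an illustrative choice:

import jax
import jax.numpy as jnp

def f(x1, x2):
    a = x1 / x2
    b = jnp.exp(x2)
    return (jnp.sin(a) + a - b) * (a - b)

# jit: trace the function once, compile it with XLA (which fuses the ops),
# and reuse the compiled kernel on CPU/GPU/TPU
f_fast = jax.jit(f)

# vmap: turn the scalar function into one that maps over a batch axis,
# without writing any loops by hand
f_batched = jax.vmap(f_fast)

x1 = jnp.linspace(0.1, 1.0, 1024)
x2 = jnp.linspace(1.0, 2.0, 1024)
print(f_batched(x1, x2).shape)   # (1024,)

# pmap has a vmap-like interface but shards the mapped axis across devices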


By Carol Cuesta
