A Brief Introduction
Google releases an open source version to the public
Designed with massive neural nets in mind
I start geeking out about machine learning after reading a blog post on recurrent neural networks.
I have no idea how any of this works, but it's exciting!
I still don't even know which questions to ask
Data Science guild is forming
I know a few things about TensorFlow
I'll share what I know and hand-wave the rest
How It Works™
Models are constructed as computational graphs
Nodes on the graph are "Ops"
Edges represent Tensors (i.e. typed multi-dimensional arrays)
Sessions
A Session manages the environment for executing ops in a graph
Specifies which device executes part or all of a graph
Each session has one graph
Each graph may be run in multiple sessions
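The graph/session split above can be sketched in plain Python. This is a toy illustration of the concept only, not the actual TensorFlow API: nodes ("Ops") hold a computation, edges carry values between them, and a session walks the graph to execute it.

```python
# Toy sketch of a computational graph: nodes are ops, edges carry values.
class Op:
    def __init__(self, fn, *inputs):
        self.fn = fn          # the computation this node performs
        self.inputs = inputs  # edges: upstream ops (or constants) feeding it

class Session:
    """Manages execution: evaluates an op's inputs, then the op itself."""
    def run(self, op):
        if not isinstance(op, Op):
            return op  # a constant leaf on the graph
        args = [self.run(i) for i in op.inputs]
        return op.fn(*args)

# Build a graph for (2 + 3) * 4, then execute it in a session.
add = Op(lambda a, b: a + b, 2, 3)
mul = Op(lambda a, b: a * b, add, 4)
print(Session().run(mul))  # 20
```

Note that building the graph and running it are separate phases, which is the same split TensorFlow makes between graph construction and `Session.run`.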
Building a Graph
Done in some "front-end" language
Python (the usual choice)
C++ (Yeah, right)
Running a Graph
Model inputs typically "fed" in via placeholder ops
Model outputs specified by "fetching" certain ops
Trained values are stored as Variable ops, which must be initialized before execution
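The feed/fetch/initialize pattern above can be mimicked in a few lines of plain Python. Again, this is a conceptual toy with made-up names, not TensorFlow's API: a placeholder gets its value fed in at run time, a variable holds trained state that must be initialized first, and "fetching" names the output you want back.

```python
# Conceptual toy: placeholders are fed at run time, variables hold trained
# state that must be initialized, and "fetching" names the desired output.
class Placeholder:
    pass  # no value of its own; supplied via feed_dict at run time

class Variable:
    def __init__(self):
        self.value = None  # uninitialized until explicitly set
    def initialize(self, value):
        self.value = value

def run(fetch, feed_dict):
    """Evaluate the fetched expression (here, w * x), resolving
    placeholders from feed_dict."""
    w, x = fetch
    if w.value is None:
        raise RuntimeError("Variable used before initialization")
    return w.value * feed_dict[x]

x = Placeholder()   # model input, fed per run
w = Variable()      # trained weight, must be initialized first
w.initialize(3.0)

print(run((w, x), feed_dict={x: 2.0}))  # 6.0
```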
Dataset containing images of handwritten digits. The classification problem of converting images to digits is a common benchmark for machine learning models.
1. Aggregate evidence in support of each class ("digit")
2. Convert the evidence into a likelihood for each given class
3. Train a set of weights to optimize the output probabilities
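The three steps map onto a softmax regression. Here is a minimal sketch of the forward pass using only the standard library; the inputs and weights are made up for illustration, not trained values:

```python
import math

def softmax(evidence):
    """Step 2: convert raw evidence into probabilities that sum to 1."""
    exps = [math.exp(e) for e in evidence]
    total = sum(exps)
    return [e / total for e in exps]

# Step 1: evidence for each class is a weighted sum of inputs plus a bias.
# Tiny made-up example: 3 "pixel" values, 2 classes.
x = [0.0, 0.5, 1.0]                       # flattened "image"
W = [[0.2, -0.1, 0.4], [-0.3, 0.8, 0.1]]  # one weight row per class
b = [0.1, -0.2]                           # one bias per class

evidence = [sum(w_i * x_i for w_i, x_i in zip(row, x)) + b_k
            for row, b_k in zip(W, b)]
probs = softmax(evidence)

# Step 3 (training) would adjust W and b to push probs toward the true
# label, e.g. by gradient descent on a cross-entropy loss.
print(probs)
```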
Checked into the guild repo under tensorflow-mnist-examples
The TensorFlow project has a ton of additional docs and tutorials. This is based on the introductory tutorial.
- Open source community
- The API is pretty extensive and built to be extended
- Graph structure makes it easy to reason about models
- Designed for deep learning, but also very generalized
- Built for performance
- TensorBoard seems promising
- You still need to know what you're doing
- There is a fairly large framework to learn
- Separation of concerns seems challenging
Next Steps (for me)
- Find a slightly more practical model to build
- Try to scale up the training
- Learn to Python more better
TensorFlow: A Brief Introduction
By Jared Stilwell