Progress Report

Internship @ Yıldız Teknik Üniversitesi

by Srikote Naewchampa (Bamboo)

18/07/17

TensorFlow

Basic Components

  • Tensor
  • Computational Graph
  • Session
  • Placeholder
  • Variables
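
A minimal sketch of how these pieces fit together, assuming the TensorFlow 1.x graph-and-session API (the shapes and values here are made up purely for illustration):

  import tensorflow as tf

  # Placeholder: a tensor whose value is fed in at run time
  x = tf.placeholder(tf.float32, shape=[None, 3])

  # Variables: tensors whose values are updated during training
  W = tf.Variable(tf.ones([3, 2]))
  b = tf.Variable(tf.zeros([2]))

  # Building ops only adds nodes to the computational graph;
  # nothing is evaluated yet
  y = tf.matmul(x, W) + b

  # Session: runs the graph, feeding the placeholder a concrete value
  with tf.Session() as sess:
      sess.run(tf.global_variables_initializer())
      print(sess.run(y, feed_dict={x: [[1.0, 2.0, 3.0]]}))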

Neural Network

Linear Classifier = Single-Layer NN

Trying to Understand (Intuitively)

  • Each input in each neuron
  • Each input in all neurons
  • A batch of inputs in all neurons
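
One way to make those three bullets concrete is to look at the shapes involved; a NumPy sketch with made-up sizes (4 features, 3 neurons, batch of 5), where the last line is exactly the score computation of a linear classifier, i.e. a single-layer network:

  import numpy as np

  x = np.random.randn(4)       # one input with 4 features
  W = np.random.randn(4, 3)    # one weight column per neuron (3 neurons)
  b = np.random.randn(3)

  # Each input in one neuron: a dot product plus a bias
  neuron_0 = x.dot(W[:, 0]) + b[0]          # a single number

  # Each input in all neurons: one matrix-vector product
  scores = x.dot(W) + b                     # shape (3,)

  # A batch of inputs in all neurons: one matrix-matrix product
  X = np.random.randn(5, 4)                 # batch of 5 inputs
  batch_scores = X.dot(W) + b               # shape (5, 3)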

Training

  • Cost/loss function
  • Step size or learning rate
  • Gradient Descent
  • Gradient Check
  • Back propagation
  • Hyperparameters
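
A toy gradient-descent loop under the same TF 1.x assumptions; the squared-error loss, the learning rate of 0.1, the 100 steps, and the fake data are hyperparameters and inputs invented only for this sketch:

  import numpy as np
  import tensorflow as tf

  x  = tf.placeholder(tf.float32, [None, 2])
  y_ = tf.placeholder(tf.float32, [None, 1])
  W  = tf.Variable(tf.zeros([2, 1]))
  b  = tf.Variable(tf.zeros([1]))
  y  = tf.matmul(x, W) + b

  # Cost/loss function: mean squared error between prediction and target
  loss = tf.reduce_mean(tf.square(y - y_))

  # Gradient descent with a fixed step size (learning rate);
  # backpropagation computes the gradients behind minimize()
  train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

  X_fake = np.random.randn(32, 2).astype(np.float32)
  Y_fake = X_fake.sum(axis=1, keepdims=True)    # target: sum of the two features

  with tf.Session() as sess:
      sess.run(tf.global_variables_initializer())
      for step in range(100):
          _, l = sess.run([train_step, loss],
                          feed_dict={x: X_fake, y_: Y_fake})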

Activation Functions

  • Non-linearity
  • Sigmoid
  • ReLU (Recommended)
  • Others (Maxout, Tanh, ...)
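
The non-linearity is just an elementwise function applied after the linear part; a small TF 1.x sketch over a few arbitrary values:

  import tensorflow as tf

  z = tf.constant([-2.0, -0.5, 0.0, 0.5, 2.0])

  sig  = tf.nn.sigmoid(z)   # squashes values into (0, 1), saturates for large |z|
  relu = tf.nn.relu(z)      # max(0, z), the usual default choice
  tanh = tf.nn.tanh(z)      # squashes values into (-1, 1)

  with tf.Session() as sess:
      print(sess.run([sig, relu, tanh]))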

Regularization

  • Prevent overfitting
  • L1 and L2
  • Dropout
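
A TF 1.x sketch of both ideas on a small two-layer net; the 100/50/10 sizes, the L2 strength of 1e-3, and the keep probability of 0.5 are placeholder values, not recommendations:

  import tensorflow as tf

  x  = tf.placeholder(tf.float32, [None, 100])
  y_ = tf.placeholder(tf.float32, [None, 10])

  W1 = tf.Variable(tf.truncated_normal([100, 50], stddev=0.1))
  b1 = tf.Variable(tf.zeros([50]))
  W2 = tf.Variable(tf.truncated_normal([50, 10], stddev=0.1))
  b2 = tf.Variable(tf.zeros([10]))

  # Dropout: randomly zero hidden activations during training
  keep_prob = tf.placeholder(tf.float32)    # e.g. 0.5 for training, 1.0 at test time
  h = tf.nn.relu(tf.matmul(x, W1) + b1)
  h_drop = tf.nn.dropout(h, keep_prob)
  logits = tf.matmul(h_drop, W2) + b2

  # L2 regularization: add a penalty on large weights to the data loss
  data_loss = tf.reduce_mean(
      tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=logits))
  reg_loss = 1e-3 * (tf.nn.l2_loss(W1) + tf.nn.l2_loss(W2))
  total_loss = data_loss + reg_loss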

"Use as many layers as possible, and use regularization to control overfitting"

Convolutional Neural Network

Convolutional Layer

  • Extract features
  • Common hyperparameter values
  • Receptive field = 3
  • Stride = 1
  • Zero padding = 1
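
A single convolutional layer with those common values (3x3 receptive field, stride 1, zero padding of 1) in TF 1.x; the 32x32x3 input and the 16 filters are example sizes:

  import tensorflow as tf

  images = tf.placeholder(tf.float32, [None, 32, 32, 3])   # batch of 32x32 RGB images

  # 16 filters, each with a 3x3 receptive field over the 3 input channels
  filters = tf.Variable(tf.truncated_normal([3, 3, 3, 16], stddev=0.1))
  biases  = tf.Variable(tf.zeros([16]))

  # Stride 1 in both spatial dimensions; 'SAME' zero-pads by 1 pixel here,
  # so the 32x32 spatial size is preserved
  conv = tf.nn.conv2d(images, filters, strides=[1, 1, 1, 1], padding='SAME')
  features = tf.nn.relu(conv + biases)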

Pooling Layer

  • Downsample the feature maps, keeping the strongest activations
  • Common hyperparameter values
  • Receptive field = 2
  • Stride = 2
  • Overlapping pooling: F = 3, S = 2
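
Both pooling variants in TF 1.x, applied to a hypothetical stack of 16 feature maps:

  import tensorflow as tf

  feature_maps = tf.placeholder(tf.float32, [None, 32, 32, 16])

  # Standard max pooling: 2x2 window, stride 2 -> spatial size is halved
  pooled = tf.nn.max_pool(feature_maps, ksize=[1, 2, 2, 1],
                          strides=[1, 2, 2, 1], padding='VALID')

  # Overlapping pooling: 3x3 window, stride 2 (F = 3, S = 2)
  overlap = tf.nn.max_pool(feature_maps, ksize=[1, 3, 3, 1],
                           strides=[1, 2, 2, 1], padding='VALID')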

Fully-Connected Layer

  • Looks like a linear classifier
  • Fully connected: every output unit sees every input
  • Equivalent to a convolutional layer with receptive field = the original input size
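
A fully-connected layer is the same scores = xW + b computation as the linear classifier, applied to the flattened feature maps; a TF 1.x sketch with made-up sizes:

  import tensorflow as tf

  feature_maps = tf.placeholder(tf.float32, [None, 8, 8, 16])

  # Flatten each example, then connect every input to every output unit
  flat   = tf.reshape(feature_maps, [-1, 8 * 8 * 16])
  W_fc   = tf.Variable(tf.truncated_normal([8 * 8 * 16, 10], stddev=0.1))
  b_fc   = tf.Variable(tf.zeros([10]))
  scores = tf.matmul(flat, W_fc) + b_fc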

Layer Patterns

INPUT -> [(CONV -> RELU)*N -> POOL]*M -> [FC -> RELU]*K -> FC

Commonly, N <= 3 and K < 3
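
A sketch of that pattern with N = 2, M = 2, K = 1, written with the higher-level tf.layers wrappers from TF 1.x; the input size, filter counts, and unit counts are arbitrary:

  import tensorflow as tf

  x = tf.placeholder(tf.float32, [None, 32, 32, 3])

  net = x
  for m in range(2):                          # [(CONV -> RELU)*N -> POOL] * M
      for n in range(2):
          net = tf.layers.conv2d(net, filters=32, kernel_size=3,
                                 padding='same', activation=tf.nn.relu)
      net = tf.layers.max_pooling2d(net, pool_size=2, strides=2)

  net = tf.reshape(net, [-1, 8 * 8 * 32])     # 32x32 pooled twice -> 8x8
  net = tf.layers.dense(net, 128, activation=tf.nn.relu)   # [FC -> RELU] * K
  scores = tf.layers.dense(net, 10)           # final FC layer gives class scores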

Suggestions

  • Try 3-channel (RGB) input images
  • Focus on accuracy and best practice first
  • Optimize performance after that

Thanks.
