Minor Project Presentation
Abhishek Kumar 2013ecs07
Department of Computer Science and Engineering
Shri Mata Vaishno Devi University
****We have covered each section of the article in detail in the file submitted.
Oh Yes, Machine Learning is here
Yes, A big Yes
(a simple neural network with 2 inputs and 5 hidden layers)
Each pixel in the image will be a feature (0 ~ 255)
Each image has three channels (R,G,B)
Each channel has 50*50 pixels
Total feature vector size = 50*50*3 = 7500
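As a rough illustration (assuming numpy; the image array here is hypothetical), flattening one image into its feature vector might look like:

    import numpy as np

    # Hypothetical 50x50 RGB image; each pixel value lies in 0..255
    image = np.random.randint(0, 256, size=(50, 50, 3))

    # Flatten the three channels into a single feature vector
    features = image.reshape(-1)
    print(features.shape)   # (7500,) = 50 * 50 * 3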
# of layers and # of neurons in each layer are hyperparameters
Need an activation function to learn non-linear boundaries
Choices for f: Sigmoid, Tanh, Rectified Linear Unit (f(x) = max(0, x)), ...
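For reference, all three activations can be written in a few lines of numpy (a minimal sketch, independent of any library):

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))   # squashes input to (0, 1)

    def tanh(x):
        return np.tanh(x)                 # squashes input to (-1, 1)

    def relu(x):
        return np.maximum(0.0, x)         # f(x) = max(0, x)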
The research work
Basically logistic regression for multiple classes (softmax)
Inputs are the hidden neurons of the previous layer
Cost function is based on the output layer
Outputs 9 probabilities, one for each of the 9 classes
Use back-propagation to calculate gradients and then update the weights
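A minimal numpy sketch of such an output layer (the hidden-layer size of 100 and the variable names are illustrative assumptions, not taken from the project):

    import numpy as np

    def softmax(z):
        e = np.exp(z - np.max(z))   # subtract max for numerical stability
        return e / e.sum()

    h = np.random.rand(100)             # activations from the previous layer
    W = 0.01 * np.random.randn(9, 100)  # weights: 9 classes x 100 inputs
    b = np.zeros(9)

    probs = softmax(W @ h + b)          # 9 probabilities that sum to 1
    print(probs.sum())                  # 1.0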
Can be viewed as a feature extractor
inputs: a 5x5 image and a 3x3 filter
output: (5 - 3 + 1) x (5 - 3 + 1) = a 3x3 feature map
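The arithmetic above comes from sliding the filter over every valid position. A direct (slow but clear) numpy sketch; note that, as in most CNN libraries, the filter is not flipped, so strictly this computes cross-correlation:

    import numpy as np

    def convolve2d(image, kernel):
        # 'Valid' convolution: output is (H - h + 1) x (W - w + 1)
        H, W = image.shape
        h, w = kernel.shape
        out = np.zeros((H - h + 1, W - w + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = np.sum(image[i:i + h, j:j + w] * kernel)
        return out

    image = np.random.rand(5, 5)
    kernel = np.random.rand(3, 3)
    print(convolve2d(image, kernel).shape)   # (3, 3)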
inputs: a 3x50x50 image and k filters of size 5x5
outputs = ?
We will have 3*(50 - 5 + 1)^2*k = 6,348*k features
if k = 25, convolution generates around 160,000 features!
Need pooling to reduce the # of features
Also known as Sub-Sampling
For 2 x 2 pooling, suppose input = 4 x 4
Output = 2 x 2
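A compact numpy sketch of non-overlapping 2x2 max pooling (assuming even input dimensions):

    import numpy as np

    def max_pool_2x2(x):
        H, W = x.shape
        # Group pixels into 2x2 blocks and keep the max of each block
        return x.reshape(H // 2, 2, W // 2, 2).max(axis=(1, 3))

    x = np.arange(16).reshape(4, 4)   # 4x4 input
    print(max_pool_2x2(x).shape)      # (2, 2)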
2 convolution layers with pooling
1 hidden layer
1 output layer (logistic regression)
Input
First Convolution Layer with 25 filters
Second Convolution Layer with 50 filters
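Putting the pieces together, the architecture above could be expressed in Keras roughly as follows (a sketch assuming 50x50 RGB inputs and 9 classes; the hidden-layer size and activations are illustrative, not taken from the project):

    from tensorflow.keras import Sequential
    from tensorflow.keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

    model = Sequential([
        # First convolution layer: 25 filters of size 5x5, then 2x2 pooling
        Conv2D(25, (5, 5), activation='relu', input_shape=(50, 50, 3)),
        MaxPooling2D((2, 2)),
        # Second convolution layer: 50 filters, then 2x2 pooling
        Conv2D(50, (5, 5), activation='relu'),
        MaxPooling2D((2, 2)),
        Flatten(),
        Dense(128, activation='relu'),    # the hidden layer
        Dense(9, activation='softmax'),   # output layer: 9-way logistic regression
    ])
    model.compile(optimizer='sgd', loss='categorical_crossentropy')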
Inception: The lifeline of Google
Neural Network: The lifeline of Machine Learning
***In our case we used a Convolutional Neural Network
Working of Inception
A neural network from scratch
And this is how it looked
Holla !!!!
Here is a screenshot of our model classifying a circle with an accuracy of 99.999%
What is a Hyperparameter?
Adjust the weights using the classification cost and back-propagation
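As an illustrative from-scratch sketch of one such update (a single sigmoid neuron with cross-entropy cost; not the project's actual code):

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    x = np.array([0.5, -1.2, 0.3])   # one training example (3 features)
    y = 1.0                          # its true label
    w = np.zeros(3)
    b = 0.0
    lr = 0.1                         # learning rate (a hyperparameter)

    for _ in range(100):
        p = sigmoid(w @ x + b)       # forward pass: predicted probability
        grad = p - y                 # dCost/dz for sigmoid + cross-entropy
        w -= lr * grad * x           # back-propagate: update weights
        b -= lr * grad               # ... and bias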