Kian Peymani
Computational Artificial Intelligence Course
Dr. Behrouz Minaee
import lasagne

# 30x30 grayscale input images (batch size left unspecified)
l_in = lasagne.layers.InputLayer(shape=(None, 1, 30, 30))
# 20% dropout on the input layer
l_in_drop = lasagne.layers.DropoutLayer(l_in, p=0.2)
# First hidden layer: 800 ReLU units,
# initializing weights with Glorot's scheme (which is the default anyway):
l_hid1 = lasagne.layers.DenseLayer(
    l_in_drop, num_units=800,
    nonlinearity=lasagne.nonlinearities.rectify,
    W=lasagne.init.GlorotUniform())
# 50% dropout after the first hidden layer
l_hid1_drop = lasagne.layers.DropoutLayer(l_hid1, p=0.5)
# Second hidden layer: another 800 ReLU units
l_hid2 = lasagne.layers.DenseLayer(
    l_hid1_drop, num_units=800,
    nonlinearity=lasagne.nonlinearities.rectify)
l_hid2_drop = lasagne.layers.DropoutLayer(l_hid2, p=0.5)
# Output layer: softmax over the CLASS_DIM alphabet classes
l_out = lasagne.layers.DenseLayer(
    l_hid2_drop, num_units=CLASS_DIM,
    nonlinearity=lasagne.nonlinearities.softmax)
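To train this network, the Lasagne layers are compiled into Theano functions. The following is a minimal sketch in the spirit of the standard Lasagne MNIST example; the learning rate, momentum value, and variable names are assumptions, not necessarily the exact settings behind the results below.

import theano
import theano.tensor as T
import lasagne

# Symbolic batch of 30x30 images and their integer class labels.
input_var = T.tensor4('inputs')
target_var = T.ivector('targets')

# Stochastic forward pass (dropout active) for training.
prediction = lasagne.layers.get_output(l_out, input_var)
loss = lasagne.objectives.categorical_crossentropy(prediction, target_var).mean()

# Update all trainable weights with Nesterov momentum SGD (hyperparameters assumed).
params = lasagne.layers.get_all_params(l_out, trainable=True)
updates = lasagne.updates.nesterov_momentum(
    loss, params, learning_rate=0.01, momentum=0.9)

# Deterministic forward pass (dropout disabled) for validation/test accuracy.
test_prediction = lasagne.layers.get_output(l_out, input_var, deterministic=True)
test_acc = T.mean(T.eq(T.argmax(test_prediction, axis=1), target_var),
                  dtype=theano.config.floatX)

train_fn = theano.function([input_var, target_var], loss, updates=updates)
val_fn = theano.function([input_var, target_var], test_acc)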
MLP network example:
- 200 epochs
- 4 hidden layers
- 1600 neurons per layer
- 79% accuracy on validation data
- 81% accuracy on test data
CNN network example:
- 400 epochs
- 81% accuracy on validation data
- 84% accuracy on test data
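For comparison with the MLP above, a convolutional network for the same 30x30 inputs can be built in Lasagne along the following lines. This is only an illustrative sketch: the filter counts, filter sizes, pooling, and dense-layer width are assumptions, not the exact configuration behind the numbers above.

import lasagne

# Illustrative CNN for 30x30 grayscale inputs (hyperparameters are assumptions).
cnn_in = lasagne.layers.InputLayer(shape=(None, 1, 30, 30))
cnn_conv1 = lasagne.layers.Conv2DLayer(
    cnn_in, num_filters=32, filter_size=(5, 5),
    nonlinearity=lasagne.nonlinearities.rectify,
    W=lasagne.init.GlorotUniform())
cnn_pool1 = lasagne.layers.MaxPool2DLayer(cnn_conv1, pool_size=(2, 2))
cnn_conv2 = lasagne.layers.Conv2DLayer(
    cnn_pool1, num_filters=32, filter_size=(5, 5),
    nonlinearity=lasagne.nonlinearities.rectify)
cnn_pool2 = lasagne.layers.MaxPool2DLayer(cnn_conv2, pool_size=(2, 2))
# Fully connected head with dropout, then softmax over the alphabet classes.
cnn_fc = lasagne.layers.DenseLayer(
    lasagne.layers.dropout(cnn_pool2, p=0.5), num_units=256,
    nonlinearity=lasagne.nonlinearities.rectify)
cnn_out = lasagne.layers.DenseLayer(
    lasagne.layers.dropout(cnn_fc, p=0.5), num_units=CLASS_DIM,
    nonlinearity=lasagne.nonlinearities.softmax)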
Results of different configurations (each MLP run trained for only 200 epochs):

Model   Hidden layers   Neurons/layer   Epochs   Accuracy
MLP     4               400             200      69%
MLP     4               200             200      64%
MLP     4               800             200      79%
MLP     4               1600            200      80%
MLP     2               800             200      78%
MLP     2               1600            200      82%
CNN     -               -               200      81%
CNN     -               -               400      86%
~ python mnist.py --gen
# Generates the entire Farsi alphabet using every font placed inside ./fonts
# a folder named `data` will be used for storing the images
# .npz files store the images as numpy arrays for faster loading
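As a rough illustration of what this generation step does, the sketch below renders each letter with every font in ./fonts using Pillow and stores the result as numpy arrays in an .npz file. The function name, letter ordering, output path, and rendering details are assumptions, not the actual implementation in mnist.py.

import glob
import numpy as np
from PIL import Image, ImageDraw, ImageFont

# Isolated forms of the Persian alphabet (illustrative ordering).
FARSI_LETTERS = u"ابپتثجچحخدذرزژسشصضطظعغفقکگلمنوهی"

def generate_dataset(font_dir="./fonts", out_file="data/letters.npz", size=30):
    """Render every letter with every .ttf font and save images plus labels."""
    images, labels = [], []
    for font_path in glob.glob(font_dir + "/*.ttf"):
        font = ImageFont.truetype(font_path, size - 6)
        for label, letter in enumerate(FARSI_LETTERS):
            img = Image.new("L", (size, size), color=0)   # black background
            draw = ImageDraw.Draw(img)
            draw.text((3, 3), letter, fill=255, font=font)  # white glyph
            images.append(np.asarray(img, dtype=np.float32) / 255.0)
            labels.append(label)
    X = np.stack(images).reshape(-1, 1, size, size)  # (N, 1, 30, 30), as the network expects
    y = np.asarray(labels, dtype=np.int32)
    np.savez(out_file, X=X, y=y)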
~ python mnist.py mlp 500
# Train and evaluate an MLP model for 500 epochs
~ python mnist.py cnn 500
# Train and evaluate a CNN model for 500 epochs
~ python mnist.py custom_mlp:4,200,.2,.5 200
# Train and evaluate an MLP with 4 hidden layers, 200 neurons per layer,
# a dropout of .2 on the input layer and .5 on the hidden layers
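As an illustration of how the custom_mlp:DEPTH,WIDTH,DROP_IN,DROP_HID specification can be turned into build parameters, a small parsing helper might look like the following; the helper name is hypothetical and the real mnist.py may parse the argument differently.

def parse_custom_mlp(spec):
    """Turn e.g. 'custom_mlp:4,200,.2,.5' into (depth, width, drop_in, drop_hid)."""
    _, _, params = spec.partition(':')
    depth, width, drop_in, drop_hid = params.split(',')
    return int(depth), int(width), float(drop_in), float(drop_hid)

# parse_custom_mlp('custom_mlp:4,200,.2,.5') -> (4, 200, 0.2, 0.5)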