RNN - LSTM/GRU Text Generation
source of everything: Martin Gorner, https://goo.gl/zTDm7D
#######################
# LE KERAS CODE - WOW #
#######################
# Around 60-ish lines of code in the full example
# Around 1000000 headaches
from keras.models import Sequential
from keras.layers import LSTM, Dense, Activation

model = Sequential()
# Input: windows of 40 chars, one-hot encoded; chars = list of the unique lowercase chars
model.add(LSTM(128, input_shape=(40, len(chars))))
# Softmax output layer: one neuron per char, then pick the next char from the probabilities
model.add(Dense(len(chars)))
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='rmsprop')  # compile step, not on the slide
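
To actually generate text, feed the last 40 chars back in and sample the next one from the softmax output, over and over. A rough sketch, not from the slide; seed_text, char_indices, indices_char and the temperature value are assumptions:

import numpy as np

def sample(probs, temperature=1.0):
    # reweight the softmax output: low temperature -> near-greedy, high -> more random
    probs = np.log(probs + 1e-8) / temperature
    probs = np.exp(probs) / np.sum(np.exp(probs))
    return np.random.choice(len(probs), p=probs)

generated = seed_text[-40:]                      # start from the last 40 chars of some seed text
for _ in range(400):
    x = np.zeros((1, 40, len(chars)))
    for t, c in enumerate(generated[-40:]):
        x[0, t, char_indices[c]] = 1.0           # one-hot encode the current 40-char window
    probs = model.predict(x, verbose=0)[0]       # next-char probability distribution
    generated += indices_char[sample(probs, temperature=0.5)]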
TL;DR: Use gates to decide what state to keep, forget, and output.
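
What those gates actually compute, sketched as one GRU step in plain NumPy (a rough illustration only; the W/U/b weight dicts are placeholders, not Keras internals):

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, W, U, b):
    z = sigmoid(x @ W['z'] + h @ U['z'] + b['z'])         # update gate: how much old state to keep
    r = sigmoid(x @ W['r'] + h @ U['r'] + b['r'])         # reset gate: how much old state feeds the candidate
    n = np.tanh(x @ W['n'] + (r * h) @ U['n'] + b['n'])   # candidate new state
    return z * h + (1 - z) * n                            # blend old state and candidate (conventions vary)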