[Figure: overview of neural network architectures, (c) Fjodor van Veen, Asimov Institute. Legend: input cell, hidden cell, output cell, kernel, convolutional.]
[Figure: recurrent network unrolled over time (the entire net repeated and interconnected); it can map single values (many-to-one) or sequences (many-to-many).]
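To make the many-to-one vs. many-to-many distinction concrete, here is a minimal sketch using the tf.keras API (layer sizes and input shapes are arbitrary, not taken from the slides); the only switch between the two modes is `return_sequences`:

```python
import numpy as np
import tensorflow as tf

# Toy input: batch of 4 sequences, 8 time steps, 3 features per step (arbitrary shapes)
x = np.random.rand(4, 8, 3).astype("float32")

# Many-to-one: the RNN returns only its last hidden state -> one prediction per sequence
many_to_one = tf.keras.Sequential([
    tf.keras.layers.SimpleRNN(16, return_sequences=False, input_shape=(8, 3)),
    tf.keras.layers.Dense(1),
])
print(many_to_one(x).shape)   # (4, 1)

# Many-to-many: the RNN returns an output at every time step -> one prediction per step
many_to_many = tf.keras.Sequential([
    tf.keras.layers.SimpleRNN(16, return_sequences=True, input_shape=(8, 3)),
    tf.keras.layers.Dense(1),
])
print(many_to_many(x).shape)  # (4, 8, 1)
```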
Daytime? Weekday? Holiday? Weather? Events?
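These questions point to candidate input features. A minimal sketch of encoding such calendar and weather information as numeric model inputs with pandas (the DataFrame and its column names are hypothetical, not from the project code):

```python
import pandas as pd

# Hypothetical raw data: one row per hour (timestamps and values are made up)
df = pd.DataFrame({
    "timestamp": pd.date_range("2017-06-01", periods=6, freq="H"),
    "temperature": [14.2, 13.8, 13.5, 13.9, 15.1, 17.0],
    "is_holiday": [0, 0, 0, 0, 0, 0],
    "event_nearby": [0, 0, 1, 1, 0, 0],
})

# Derive calendar features from the timestamp
df["hour"] = df["timestamp"].dt.hour
df["weekday"] = df["timestamp"].dt.weekday

# One-hot encode the categorical calendar features, keep the numeric ones as-is
features = pd.get_dummies(
    df[["hour", "weekday", "is_holiday", "event_nearby", "temperature"]],
    columns=["hour", "weekday"],
)
print(features.head())
```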
[Figure: correlation heatmap; legend: positive correlation, negative correlation]
|     | s1  | s2  | s3  | s4  | s5  | s6  | s7  | s8  | ... |
|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|
| s1  | 1   | 1   | 0   | 0   | 1   | 1   | 0   | 0   | ... |
| s2  | 1   | 1   | 1   | 0   | 0   | 0   | 0   | 0   | ... |
| s3  | 0   | 1   | 1   | 0   | 0   | 0   | 0   | 0   | ... |
| s4  | 0   | 0   | 1   | 1   | 0   | 1   | 0   | 1   | ... |
| s5  | 1   | 0   | 0   | 0   | 1   | 0   | 1   | 0   | ... |
| s6  | 1   | 1   | 0   | 1   | 0   | 1   | 0   | 1   | ... |
| s7  | 0   | 0   | 0   | 0   | 1   | 0   | 1   | 0   | ... |
| s8  | 0   | 0   | 0   | 1   | 0   | 1   | 0   | 1   | ... |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
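The table can be read as a thresholded (0/1) correlation matrix between the series s1, s2, .... A minimal sketch of how such a matrix could be computed from per-series observations (the `demand` DataFrame and the threshold value are hypothetical):

```python
import numpy as np
import pandas as pd

# Hypothetical observations: one column per series (s1..s8), one row per time step
rng = np.random.default_rng(0)
demand = pd.DataFrame(rng.poisson(lam=5, size=(200, 8)),
                      columns=[f"s{i}" for i in range(1, 9)])

# Pearson correlation between the series, then binarize with a threshold
corr = demand.corr()
binary = (corr > 0.2).astype(int)   # 1 = positive correlation above threshold, 0 = otherwise
print(binary)
```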
@tf_attributeLock
def prediction(self):
    # Assumes a TF 1.x graph environment: import tensorflow as tf
    # Builds a two-layer feed-forward net: sigmoid hidden layer -> sigmoid output layer
    c.debugInfo(__name__, "Adding Prediction nodes to the graph")  # project logging helper
    with tf.name_scope('layer1'):
        # Hidden layer: weights drawn from a truncated normal, small constant bias
        weights = tf.Variable(tf.truncated_normal(
            (self.n_input, self.n_hidden), stddev=0.1
        ), name="lay1_weights")
        bias = tf.Variable(tf.constant(
            0.1, shape=[self.n_hidden]
        ), name="lay1_bias")
        out_layer1 = tf.nn.sigmoid(tf.matmul(
            self.data, weights
        ) + bias, name="lay1_output")
    with tf.name_scope('layer2'):
        # Output layer: maps hidden activations to the n_output target values
        weights = tf.Variable(tf.truncated_normal(
            (self.n_hidden, self.n_output), stddev=0.1
        ), name="lay2_weights")
        bias = tf.Variable(tf.constant(
            0.1, shape=[self.n_output]
        ), name="lay2_bias")
        out_layer2 = tf.nn.sigmoid(tf.matmul(
            out_layer1, weights
        ) + bias, name="lay2_output")
    return out_layer2
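The @tf_attributeLock decorator is referenced but not shown; presumably it caches the returned node so that repeated accesses to prediction do not add duplicate ops to the graph. A sketch of a lazy-property decorator in that spirit (an assumption about its behavior, not the actual implementation):

```python
import functools

def tf_attributeLock(func):
    """Cache the result of a graph-building method so its ops are added only once."""
    attribute = "_cache_" + func.__name__

    @property
    @functools.wraps(func)
    def wrapper(self):
        # Build the subgraph on first access, then always return the cached node
        if not hasattr(self, attribute):
            setattr(self, attribute, func(self))
        return getattr(self, attribute)
    return wrapper
```

With such a decorator, model.prediction behaves like a plain attribute: the two layers are built on first access and the cached output tensor is returned on every later access.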
Felix Kunde
Research Assistant, MAGDa
Beuth University of Applied Sciences
fkunde@beuth-hochschule.de