An Example for Backpropagation

By Amrutha

"Forward Pass"

The network maps a 3-dimensional input through two hidden layers of three sigmoid units each to a 2-unit softmax output:

h_1 = \mathrm{sigmoid}(a_1), \quad h_2 = \mathrm{sigmoid}(a_2), \quad \hat y = h_3 = \mathrm{softmax}(a_3)

Input and parameters:

x = [1.5, 2.5, 3]

W_1 = \begin{bmatrix} 0.05 & 0.05 & 0.05 \\ 0.05 & 0.05 & 0.05 \\ 0.05 & 0.05 & 0.05 \end{bmatrix}, \quad b_1 = [0.01, 0.02, 0.03]

W_2 = \begin{bmatrix} 0.025 & 0.025 & 0.025 \\ 0.025 & 0.025 & 0.025 \\ 0.025 & 0.025 & 0.025 \end{bmatrix}, \quad b_2 = [0.01, 0.02, 0.03]

W_3 = \begin{bmatrix} 1 & 1 \\ 1 & 1 \\ 1 & 1 \end{bmatrix}, \quad b_3 = [0.01, 0.02]

Layer-by-layer computation:

a_1 = x W_1 + b_1 = [1.5, 2.5, 3] \begin{bmatrix} 0.05 & 0.05 & 0.05 \\ 0.05 & 0.05 & 0.05 \\ 0.05 & 0.05 & 0.05 \end{bmatrix} + [0.01, 0.02, 0.03] = [0.36, 0.37, 0.38]

h_1 = \mathrm{sigmoid}(a_1) = [0.589, 0.591, 0.593]

a_2 = h_1 W_2 + b_2 = [0.589, 0.591, 0.593] \begin{bmatrix} 0.025 & 0.025 & 0.025 \\ 0.025 & 0.025 & 0.025 \\ 0.025 & 0.025 & 0.025 \end{bmatrix} + [0.01, 0.02, 0.03] = [0.054, 0.064, 0.074]

h_2 = \mathrm{sigmoid}(a_2) = [0.513, 0.516, 0.518]

a_3 = h_2 W_3 + b_3 = [0.513, 0.516, 0.518] \begin{bmatrix} 1 & 1 \\ 1 & 1 \\ 1 & 1 \end{bmatrix} + [0.01, 0.02] = [1.558, 1.568]

\hat y = h_3 = \mathrm{softmax}(a_3) = [0.497, 0.502]
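The arithmetic above can be reproduced with a short NumPy sketch (a minimal check written for this walkthrough, not code from the original slides; variable names mirror the notation above, and the printed values agree with the slide's numbers up to rounding in the third decimal place):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - np.max(z))  # shift by the max for numerical stability
    return e / e.sum()

x  = np.array([1.5, 2.5, 3.0])
W1 = np.full((3, 3), 0.05);  b1 = np.array([0.01, 0.02, 0.03])
W2 = np.full((3, 3), 0.025); b2 = np.array([0.01, 0.02, 0.03])
W3 = np.ones((3, 2));        b3 = np.array([0.01, 0.02])

a1 = x @ W1 + b1        # ≈ [0.36, 0.37, 0.38]
h1 = sigmoid(a1)        # ≈ [0.589, 0.591, 0.593]
a2 = h1 @ W2 + b2       # ≈ [0.054, 0.064, 0.074]
h2 = sigmoid(a2)        # ≈ [0.513, 0.516, 0.518]
a3 = h2 @ W3 + b3       # ≈ [1.558, 1.568]
y_hat = softmax(a3)     # ≈ [0.497, 0.502]

print("y_hat =", np.round(y_hat, 3))
```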

y=[1,0]y=[1, 0]

"Binary Cross Entropy Loss"

