
Artificial Neural Network Example

This illustrates how a neural network with two input, two hidden, and two output neurons implements the "forward pass" (a prediction) and the "backward pass" (so-called backpropagation). The input neurons (training data), i1 and i2, are on the left, and the output neurons (labels), o1 and o2, are on the right. The biases, b1 and b2, are attached to the hidden and output layers. The weights, w1 through w8, are attached to the segments as appropriate, and you can adjust them with the sliders.

The forward pass (and its associated error E) is calculated dynamically from the inputs, weights, biases and logistic activations. Roughly speaking, information on the left moves "forward" to the right. When information flows through the weights, the computation is affine: a weighted sum of the incoming values plus a bias. The other left-to-right motion is either a logistic activation or an error calculation between a calculated output and its target, as noted by the labels in the network.

You need to take action to implement a "backward pass": simply press "Backpropagate". When you do so, a number of partial derivatives of the error E with respect to the weights are calculated in the background. These partial derivatives are then used to "navigate" to a slightly better selection of weights (in the sense that they reduce the error) by way of so-called gradient descent. Although the details are a little tricky (a link to an explanation is below), the effect is to improve the calculated outputs and so reduce E, although probably just a little bit.

Pay very close attention to this little bit of error reduction by way of backpropagation. Reduction in E in response to backpropagation is the whole point of an artificial neural network: this reduction in the error is the network "learning" how to select better weights for predicting the outputs from the given inputs. At first the results aren't that impressive, but with iterative backpropagation the network can learn a fairly good set of weights.

In case it helps you think this through, the weight segments are color coded and thickened based on their values: green indicates a positive weight, red a negative one, and thicker means larger in absolute value.

All the gory details of the arithmetic behind the forward pass and the backward pass (and the partial derivatives!) can be found in this truly wonderful article: https://mattmazur.com/2015/03/17/a-step-by-step-backpropagation-example/ If you download the source file and dig into it, you will find that all partial derivatives implementing backpropagation are named with a subscript convention that matches the notation of the above article.

Enjoy! If you want to talk more, feel free to email me at greg.petrics [at] northernvermont.edu
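
If you would like to see the same arithmetic spelled out as code, here is a minimal Python sketch of a 2-2-2 network of this kind: a forward pass with logistic activations, the squared error, and one gradient-descent backpropagation step. The input values, targets, starting weights and learning rate below are illustrative assumptions (they are not read from the applet); the names i1, i2, w1 through w8, b1, b2 and E simply mirror the labels used above.

    import math

    def sigmoid(x):
        # logistic activation used throughout the network
        return 1.0 / (1.0 + math.exp(-x))

    # Illustrative values only (assumed for this sketch, not taken from the applet)
    i1, i2 = 0.05, 0.10          # inputs (training data)
    t1, t2 = 0.01, 0.99          # targets (labels)
    w = [0.15, 0.20, 0.25, 0.30, # w1..w4: input -> hidden
         0.40, 0.45, 0.50, 0.55] # w5..w8: hidden -> output
    b1, b2 = 0.35, 0.60          # biases on the hidden and output layers
    lr = 0.5                     # gradient-descent step size

    def forward(w):
        # affine combination through the weights, then the logistic activation
        out_h1 = sigmoid(w[0]*i1 + w[1]*i2 + b1)
        out_h2 = sigmoid(w[2]*i1 + w[3]*i2 + b1)
        out_o1 = sigmoid(w[4]*out_h1 + w[5]*out_h2 + b2)
        out_o2 = sigmoid(w[6]*out_h1 + w[7]*out_h2 + b2)
        # squared error between the calculated outputs and the targets
        E = 0.5*(t1 - out_o1)**2 + 0.5*(t2 - out_o2)**2
        return out_h1, out_h2, out_o1, out_o2, E

    def backpropagate(w):
        # one "backward pass": compute dE/dw_k for every weight, then step downhill
        out_h1, out_h2, out_o1, out_o2, E = forward(w)
        # output-layer deltas: dE/dnet_o = (out - target) * sigmoid'(net)
        d_o1 = (out_o1 - t1) * out_o1 * (1 - out_o1)
        d_o2 = (out_o2 - t2) * out_o2 * (1 - out_o2)
        # hidden-layer deltas fold in both output errors
        d_h1 = (d_o1*w[4] + d_o2*w[6]) * out_h1 * (1 - out_h1)
        d_h2 = (d_o1*w[5] + d_o2*w[7]) * out_h2 * (1 - out_h2)
        grads = [d_h1*i1, d_h1*i2, d_h2*i1, d_h2*i2,      # dE/dw1..dE/dw4
                 d_o1*out_h1, d_o1*out_h2,                # dE/dw5, dE/dw6
                 d_o2*out_h1, d_o2*out_h2]                # dE/dw7, dE/dw8
        return [wk - lr*gk for wk, gk in zip(w, grads)]   # gradient-descent update

    # iterative backpropagation: the error shrinks a little on every pass
    for step in range(10000):
        w = backpropagate(w)
        if step % 2000 == 0:
            print(step, forward(w)[-1])

Running the loop and watching the printed error shrink is the code analogue of pressing "Backpropagate" over and over in the applet.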