Moving forward with the single neuron in mind, we can begin to understand a Multilayer Network. An Artificial Neural Network (ANN for short) consists of many of these single neurons grouped into "layers", which categorize what type of task those neurons will perform. The following figure shows neurons working together in their respective layers: the Input layer, the Hidden layer, and the Output layer.
In the figure on the right, we can see how a single artificial neuron mimics the behavior of a neuron in the human brain. Each neuron receives data from others, and information is passed along. An ANN behaves in this same way! Much like in the human brain, it is many neurons working together that produces the "thinking power" of the network.
But how? I'm glad you asked, anonymous reader!
First, raw data is given to our Input layer, which passes it on to our Hidden layer. The Hidden layer multiplies each input by its corresponding weight and sums the results. The calculated charge 'X' is then fed into our activation function, and we get an output 'y'. That output 'y' is the value sent out from the Hidden layer and into the Output layer (which has weights of its own!). Finally, the resulting output is compared to the expected (or desired) value 'Yd', and the error is calculated as Yd - y (expected minus actual output).
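The steps above can be sketched in a few lines of Python. All the numbers here (inputs, weights, and the desired value Yd) are made up purely for illustration; the point is the flow: input layer, hidden layer, output layer, then error.

```python
import math

def sigmoid(x):
    # Soft-limiter activation: squashes any real number into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical raw data and weights, chosen only for illustration
inputs = [0.5, 0.9]                      # values handed to the Input layer
hidden_weights = [[0.4, 0.1],            # each row: weights into one hidden neuron
                  [0.2, 0.7]]
output_weights = [0.6, 0.3]              # the Output layer has weights of its own!

# Hidden layer: multiply inputs by the according weights, sum the charge X, activate
hidden_outputs = []
for row in hidden_weights:
    X = sum(w * i for w, i in zip(row, inputs))
    hidden_outputs.append(sigmoid(X))

# Output layer: same pattern, using the hidden outputs as its inputs
X_out = sum(w * h for w, h in zip(output_weights, hidden_outputs))
y = sigmoid(X_out)

# Compare against the desired value Yd to get the error
Yd = 1.0
error = Yd - y
print(round(y, 3), round(error, 3))
```

Running this prints the network's output 'y' and the error Yd - y; in a real training loop, that error would be used to adjust the weights.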
NOTE! The ANN here uses a different activation function, known as the Sigmoid function. The Sigmoid is used because, unlike the Step and Sign functions, it is considered a soft-limiter. The Sigmoid has a "range" of truthness; it is NOT binary.
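To see the soft-limiter idea concretely, here is a quick comparison of a hard Step function against the Sigmoid at a few sample points (the inputs -2, 0, and 2 are arbitrary, picked just to show the contrast):

```python
import math

def step(x):
    # Hard-limiter: output is strictly binary (0 or 1)
    return 1 if x >= 0 else 0

def sigmoid(x):
    # Soft-limiter: a smooth "range of truthness" between 0 and 1
    return 1.0 / (1.0 + math.exp(-x))

for x in (-2.0, 0.0, 2.0):
    print(x, step(x), round(sigmoid(x), 3))
```

The Step function jumps straight from 0 to 1, while the Sigmoid glides through intermediate values like 0.119 and 0.881, which is exactly the graded response that makes it useful for training.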