How does the error back propagation algorithm work?

Can someone clearly explain to me how the error back propagation algorithm works?

Author: 0xdb, 2017-11-05

1 answer

BackProp

  1. Forward propagation (FP) is performed. Standard procedure: for each neuron in a layer, take the weighted sum of its inputs, add the bias term, and apply the activation function. Repeat this layer by layer.
  2. Backward propagation (BP): after step (1), errors are computed for all layers except the input layer, starting from the output layer and moving back along the chain toward the beginning. These errors are computed with formulas that involve the derivatives of the corresponding activation functions.
  3. The resulting errors are used to compute the gradients, which are then used to update the weights (see the sketch after this list).
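A minimal sketch of these three steps, assuming a toy network with one hidden layer, sigmoid activations, mean-squared-error loss, and plain gradient descent (all of these choices, and the array shapes, are illustrative assumptions, not taken from the answer):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

rng = np.random.default_rng(0)

# Toy data: 4 samples, 3 features, 1 target each (hypothetical numbers).
X = rng.normal(size=(4, 3))
y = rng.normal(size=(4, 1))

# Weights and biases for input->hidden (3->5) and hidden->output (5->1).
W1, b1 = rng.normal(size=(3, 5)), np.zeros((1, 5))
W2, b2 = rng.normal(size=(5, 1)), np.zeros((1, 1))
lr = 0.1

for epoch in range(1000):
    # 1. Forward pass: weighted sum of inputs + bias, then activation,
    #    layer by layer.
    z1 = X @ W1 + b1
    a1 = sigmoid(z1)
    z2 = a1 @ W2 + b2
    a2 = sigmoid(z2)

    # 2. Backward pass: compute the output-layer error first, then propagate
    #    it back along the chain, using the derivative of each layer's
    #    activation function.
    delta2 = (a2 - y) * sigmoid_prime(z2)          # output layer error
    delta1 = (delta2 @ W2.T) * sigmoid_prime(z1)   # hidden layer error

    # 3. Turn the errors into gradients and update the weights.
    W2 -= lr * (a1.T @ delta2)
    b2 -= lr * delta2.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ delta1)
    b1 -= lr * delta1.sum(axis=0, keepdims=True)
```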
Author: sevnight, 2018-08-03 10:54:36