The backpropagation algorithm


Backpropagation is one of several ways in which an artificial neural network (ANN) can be trained. It is a supervised training scheme, which means it learns from labeled training data. In simple terms, backpropagation is “learning from mistakes”: the supervisor corrects the ANN whenever it makes an error.

Initially, all the edge weights are randomly assigned. For every input in the training dataset, the ANN is activated and its output is observed. This output is compared with the desired output, which we already know, and the error is “propagated” back through the previous layers, where the weights are “adjusted” accordingly. This process is repeated until the output error falls below a predetermined threshold.

What’s special about backpropagation is the way the computations are carried out: each layer sees the error propagated backward into it as a ‘black box’ (a pre-computed value), which makes the computation in each layer a ‘local’ one and therefore simplifies the whole process. This also makes the algorithm computationally efficient to implement, in a way that conceptually resembles dynamic programming: values computed once are reused rather than recomputed.
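To make the ‘local computation’ idea concrete, here is a minimal sketch in Python with NumPy. The layer sizes, the sigmoid activation, and all the concrete numbers are illustrative assumptions, not part of the original text; the point is that the hidden layer needs only the pre-computed error signal handed back by the layer above, plus its own activations.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical sizes: 3 inputs -> 4 hidden units -> 2 outputs.
rng = np.random.default_rng(0)
W_hidden = rng.uniform(-1, 1, (3, 4))   # input -> hidden weights
W_output = rng.uniform(-1, 1, (4, 2))   # hidden -> output weights

x = np.array([0.5, -0.2, 0.1])
hidden = sigmoid(x @ W_hidden)          # forward pass: hidden activations

# Suppose the output layer has already computed its error signal.
# From the hidden layer's point of view this is a 'black box' value:
delta_output = np.array([0.1, -0.3])

# Local computation: the hidden layer combines the incoming delta with
# its own activations (sigmoid derivative h * (1 - h)) to form its own
# error signal, without needing anything else from the rest of the network.
delta_hidden = (delta_output @ W_output.T) * hidden * (1 - hidden)
```

Each layer repeats this same local step, which is why the backward pass costs roughly as much as the forward pass.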

The following pseudocode describes the backpropagation algorithm.

Assign all network inputs and outputs
Initialize all weights with small random numbers, typically between -1 and 1

do

    for every pattern in the training set

        Present the pattern to the network

//      Propagate the input forward through the network:
        for each layer in the network
            for every node in the layer
                1. Calculate the weighted sum of the inputs to the node
                2. Add the threshold (bias) to the sum
                3. Calculate the activation of the node

//      Propagate the errors backward through the network:
        for every node in the output layer
            calculate the error signal

        for all hidden layers
            for every node in the layer
                1. Calculate the node's error signal
                2. Update each of the node's weights

//      Calculate the global error:
        Calculate the Error Function

while ((the number of iterations is less than the specified maximum) AND
       (the Error Function is greater than the specified threshold))
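The pseudocode above can be sketched in Python with NumPy. Everything concrete here is an illustrative assumption: a 2-4-1 network, sigmoid activations, a squared-error function, a learning rate of 0.5, and the XOR truth table as the training set.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(42)

# Assign all network inputs and outputs (toy XOR dataset, an assumption).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# Initialize all weights with small random numbers between -1 and 1.
W1 = rng.uniform(-1, 1, (2, 4))   # input -> hidden weights
b1 = rng.uniform(-1, 1, 4)        # hidden thresholds (biases)
W2 = rng.uniform(-1, 1, (4, 1))   # hidden -> output weights
b2 = rng.uniform(-1, 1, 1)        # output threshold (bias)

eta = 0.5                         # learning rate (illustrative)
max_iterations = 20000
error_threshold = 0.01

for iteration in range(max_iterations):
    # Propagate the input forward through the network:
    # weighted sum + threshold, then the activation, layer by layer.
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)

    # Propagate the errors backward through the network:
    # output-layer error signal, then hidden-layer error signal.
    delta_out = (y - T) * y * (1 - y)
    delta_hid = (delta_out @ W2.T) * h * (1 - h)

    # Update each node's weights (gradient-descent step).
    W2 -= eta * h.T @ delta_out
    b2 -= eta * delta_out.sum(axis=0)
    W1 -= eta * X.T @ delta_hid
    b1 -= eta * delta_hid.sum(axis=0)

    # Calculate the global error (here: mean squared error).
    error = 0.5 * np.mean((y - T) ** 2)
    if error < error_threshold:
        break
```

The loop stops when either the error drops below the threshold or the iteration budget runs out, matching the `do … while` condition in the pseudocode.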



