Help me understand "backpropagation" in neural networks




I have recently started learning neural networks and find them useful for solving complex problems that are difficult to tackle with traditional methods. They can be applied in artificial intelligence, machine learning, parallel processing, and other fields.

Recently, I read about a neural network concept called "backpropagation", and I am having trouble understanding it. Please help me understand it better with good examples.




Generally, supervised learning algorithms work by iterative error correction: at each iteration, the parameters of the algorithm are adjusted to reduce the error from the previous iteration. In the case of neural networks, the backpropagation algorithm performs this error-correction function. It modifies the weights of the network based on the error measured on the training data. For an example and a more rigorous explanation, you could refer
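To make the idea concrete, here is a minimal sketch in pure Python (no libraries) of that weight-adjustment loop: a tiny 2-2-1 network with sigmoid activations trained on XOR. The network size, learning rate, and epoch count are illustrative assumptions, not something prescribed by the algorithm itself.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Training data: the XOR function
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

# Randomly initialized weights: 2 inputs -> 2 hidden units -> 1 output
w_h = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b_h = [random.uniform(-1, 1) for _ in range(2)]
w_o = [random.uniform(-1, 1) for _ in range(2)]
b_o = random.uniform(-1, 1)

lr = 0.5          # learning rate (an illustrative choice)
losses = []
for epoch in range(5000):
    total = 0.0
    for x, t in data:
        # Forward pass: compute hidden activations and the output
        h = [sigmoid(w_h[j][0]*x[0] + w_h[j][1]*x[1] + b_h[j]) for j in range(2)]
        y = sigmoid(w_o[0]*h[0] + w_o[1]*h[1] + b_o)
        total += 0.5 * (t - y) ** 2

        # Backward pass: propagate the error back through the network
        # using the chain rule (sigmoid derivative is s * (1 - s))
        delta_o = (y - t) * y * (1 - y)
        delta_h = [delta_o * w_o[j] * h[j] * (1 - h[j]) for j in range(2)]

        # Weight update: move each weight against its error gradient
        for j in range(2):
            w_o[j] -= lr * delta_o * h[j]
            w_h[j][0] -= lr * delta_h[j] * x[0]
            w_h[j][1] -= lr * delta_h[j] * x[1]
            b_h[j] -= lr * delta_h[j]
        b_o -= lr * delta_o
    losses.append(total)

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Each iteration does exactly what the paragraph above describes: the forward pass produces a prediction, the error is measured against the training target, and the backward pass distributes that error to every weight so it can be nudged in the direction that reduces the error.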