This directory contains the assignments for Session 6 (Backpropagation and Advanced Architectures) of the ERA-V2 course.
File name:
- S6_BackProgation_AmitPalkar
- Brief explanation of the Backpropagation algorithm
- The backpropagation algorithm (trained by steepest / gradient descent) works by iteratively adapting the weights of the neural network so as to minimize the total squared error. Each weight is adjusted using the partial derivative (∂) of the network error, which is a function of all the weights in the network, with respect to that weight. A minimal sketch of one such update step is shown after this list.
- The learning rate (step size) determines how quickly the algorithm converges: a larger step size speeds up convergence, but a step size that is too large can overshoot the minimum and prevent the algorithm from converging at all.
- To keep the algorithm from getting stuck in a local minimum, a small jitter (noise) is sometimes added to the updates so that it can escape; see the noise sketch below.
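A minimal sketch of one backpropagation (steepest-descent) update, not the assignment notebook itself: a single sigmoid neuron with squared error, where the scalar input `x`, target `t`, initial weight `w`, and learning rate `lr` are all hypothetical values chosen for illustration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop_step(w, x, t, lr):
    """One steepest-descent update: w <- w - lr * dE/dw."""
    y = sigmoid(w * x)                 # forward pass
    error = 0.5 * (y - t) ** 2         # squared error E
    dE_dy = y - t                      # chain rule, term by term
    dy_dz = y * (1.0 - y)              # derivative of the sigmoid
    dz_dw = x
    dE_dw = dE_dy * dy_dz * dz_dw      # partial derivative of E w.r.t. w
    return w - lr * dE_dw, error       # step against the gradient

# A larger lr reduces the error in fewer steps, but too large a value
# makes the updates overshoot and the error stops decreasing.
w = 0.5
for step in range(20):
    w, E = backprop_step(w, x=1.0, t=0.9, lr=0.5)
print(f"final weight = {w:.4f}, final error = {E:.6f}")
```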
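And a sketch of the jitter idea: adding small zero-mean noise to the update so the weight can escape a shallow local minimum. The `noise_scale` hyperparameter is an assumption made here for illustration, not something specified in the assignment.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_update(w, grad, lr, noise_scale=0.01):
    """Steepest-descent step plus a small random perturbation (jitter)."""
    jitter = rng.normal(0.0, noise_scale)   # zero-mean Gaussian noise
    return w - lr * grad + jitter
```

In practice the noise scale is kept small (and often decayed over time) so that it helps the weights hop out of shallow minima without disturbing convergence near a good solution.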