
Back Propagation

  • Writer: Admin
  • May 28, 2021

Backpropagation is a technique used when training a neural network to update the model parameters, i.e. the model weights and biases, so that the model performs as expected.
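To make "update" concrete: once backpropagation has produced the gradient of the loss with respect to a parameter, that parameter is typically nudged against the gradient. Below is a minimal sketch of one such gradient-descent step; the weight values, gradient values, and learning rate are all illustrative assumptions, and the gradient itself would come from the steps listed next.

```python
import numpy as np

# Hypothetical current weights and their loss gradient (assumed values,
# just to illustrate the update rule; backprop computes dL_dw in practice).
w     = np.array([0.5, -0.3])
dL_dw = np.array([0.2, 0.1])
lr    = 0.01                 # assumed learning rate

# Gradient-descent update: move each weight against its gradient.
w = w - lr * dL_dw
print(w)                     # [ 0.498 -0.301]
```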


Steps followed in Backpropagation:


  1. Calculate the loss value for the model using a loss function such as the L2 (squared-error) loss or the KL divergence.

  2. After calculating the loss value, take the derivative of the loss with respect to the model parameters, i.e. the weights and biases.

  3. A basic neural network contains an input layer, one or more hidden layers, and an output layer, so the loss depends on the earlier layers' parameters only indirectly; the derivatives flowing through those intermediate layers must also be calculated, which the chain rule does layer by layer.

  4. Multiplying these intermediate derivatives together gives the total derivative of the loss with respect to each model parameter, which is then used to update that parameter (see the NumPy sketch after this list).
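
The four steps above fit in a few lines of NumPy. The sketch below runs one forward and one backward pass through a one-hidden-layer network: it computes the L2 loss of step 1, then applies the chain rule of steps 2 to 4 to obtain the total derivative of the loss with respect to every weight and bias. The layer sizes, the sigmoid activation, and the random values are illustrative assumptions, not part of backpropagation itself.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Tiny network: 3 inputs -> 4 hidden units -> 1 output (all assumed sizes).
x      = rng.normal(size=(3, 1))   # input
y_true = np.array([[1.0]])         # target
W1, b1 = rng.normal(size=(4, 3)), np.zeros((4, 1))
W2, b2 = rng.normal(size=(1, 4)), np.zeros((1, 1))

# ---- Forward pass ----
z1 = W1 @ x + b1        # hidden pre-activation
h  = sigmoid(z1)        # hidden activation
z2 = W2 @ h + b2        # output pre-activation
y  = sigmoid(z2)        # prediction

# Step 1: L2 (squared-error) loss.
loss = 0.5 * np.sum((y - y_true) ** 2)

# Step 2: derivative of the loss w.r.t. the output.
dL_dy = y - y_true

# Steps 3-4: chain rule back through each layer.
dL_dz2 = dL_dy * y * (1 - y)       # through the output sigmoid
dL_dW2 = dL_dz2 @ h.T              # gradient for the output weights
dL_db2 = dL_dz2                    # gradient for the output bias

dL_dh  = W2.T @ dL_dz2             # pass the gradient to the hidden layer
dL_dz1 = dL_dh * h * (1 - h)       # through the hidden sigmoid
dL_dW1 = dL_dz1 @ x.T              # gradient for the hidden weights
dL_db1 = dL_dz1                    # gradient for the hidden bias

print(loss, dL_dW1.shape, dL_dW2.shape)   # total derivatives, ready for an update
```

Each gradient ends up with the same shape as its parameter, so the update rule sketched earlier applies unchanged to W1, b1, W2, and b2.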





 
 
 
