Backforward-Propagation

Calculate weight gradients and update weights in a forward fashion, so that when updating layer X, none of the layers before it will change further; the update therefore cannot become obsolete.

This solution aims to completely resolve the Internal Covariate Shift problem of deep artificial neural networks.
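
The page itself shows no implementation, so the sketch below is only one plausible PyTorch reading of the description: re-run the backward pass before each layer's update, so the gradient for layer X already reflects the updates applied to the layers before it, and update layers strictly in input-to-output order. The function name `backforward_step`, the plain SGD update, and the loop over an `nn.Sequential` are all illustrative assumptions, not the repository's actual API.

```python
# A minimal sketch of one possible reading of the scheme, in PyTorch.
# The function name, the plain SGD rule, and the per-layer re-evaluation
# are illustrative assumptions, not this repository's actual code.
import torch
import torch.nn as nn

def backforward_step(model: nn.Sequential, x, y, loss_fn, lr=0.1):
    """One training step that updates layers in forward (input-to-output) order.

    The forward/backward pass is re-run before each layer's update, so the
    gradient for layer i already reflects the updates applied to layers
    0..i-1; once layer i is updated, no layer before it changes again
    within this step, so its update is not made obsolete.
    """
    for layer in model:
        params = [p for p in layer.parameters() if p.requires_grad]
        if not params:
            continue  # parameter-free layers (e.g. activations) need no update
        model.zero_grad()
        loss = loss_fn(model(x), y)  # fresh pass over the partly-updated model
        loss.backward()
        with torch.no_grad():
            for p in params:
                if p.grad is not None:
                    p.sub_(lr * p.grad)  # plain SGD, applied to this layer only

# Hypothetical usage on a small MLP with random data:
model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
backforward_step(model, x, y, nn.CrossEntropyLoss())
```

Note that this naive reading costs one forward and backward pass per parameterized layer per step, rather than one per batch; a practical implementation would presumably need to be more economical.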
