
Bias-Variance-Decomposition-for-KL-Divergence

This repository includes detailed proofs of the "Bias-Variance Decomposition for KL Divergence". Hopefully, it will help readers better understand Heskes's paper and the recent ICLR paper "Rethinking Soft Labels for Knowledge Distillation: A Bias-Variance Tradeoff Perspective". It also serves as a brief answer to Exercise 7.3 in EE2211.

Please feel free to correct me if my understanding of any aspect is wrong.

(The detailed proofs are attached as images in this repository.)
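For quick reference, here is a minimal sketch of the statement the proofs establish, in the form given by Heskes. The notation is my own (t for the target distribution, y_D for the estimator fitted on dataset D, and \bar{y} for the normalized geometric mean over datasets) and may differ from the notation used in the images:

% Heskes-style bias-variance decomposition of the expected KL divergence,
% where \bar{y}(x) \propto \exp( \mathbb{E}_D[ \log y_D(x) ] ) is the
% normalized geometric mean of the estimators.
\mathbb{E}_D\big[ \mathrm{KL}( t \,\|\, y_D ) \big]
  = \underbrace{\mathrm{KL}( t \,\|\, \bar{y} )}_{\text{bias}}
  + \underbrace{\mathbb{E}_D\big[ \mathrm{KL}( \bar{y} \,\|\, y_D ) \big]}_{\text{variance}}

The identity can also be checked numerically. The short script below is my own sanity check (not part of the proofs in this repository): it samples random categorical estimators, forms their normalized geometric mean, and compares the two sides of the decomposition.

import numpy as np

rng = np.random.default_rng(0)

def kl(p, q):
    # KL(p || q) for two categorical distributions.
    return np.sum(p * np.log(p / q))

K, N = 5, 1000                      # number of categories, number of sampled "datasets"
t = rng.dirichlet(np.ones(K))       # target distribution t
Y = rng.dirichlet(np.ones(K), N)    # estimators y_D, one row per dataset D

# Normalized geometric mean of the estimators (Heskes's "mean" prediction).
y_bar = np.exp(np.log(Y).mean(axis=0))
y_bar /= y_bar.sum()

lhs = np.mean([kl(t, y) for y in Y])                     # expected error
rhs = kl(t, y_bar) + np.mean([kl(y_bar, y) for y in Y])  # bias + variance
print(lhs, rhs)                     # the two values agree up to floating-point error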

References:

  1. A brief answer on math.stackexchange.com
  2. Paper: Information-Theoretic Variable Selection and Network Inference from Microarray Data
  3. Paper: Bias/variance decompositions for likelihood-based estimators
  4. Book: Notes for EE2211 Introduction to Machine Learning

Cite:

If you find this repo useful, please cite:

@misc{proof4biasvariance,
  author =       {Shuan},
  title =        {Bias-Variance-Decomposition-for-KL-Divergence},
  howpublished = {\url{https://github.com/HolmesShuan/Bias-Variance-Decomposition-for-KL-Divergence}},
  year =         {2021}
}
