
Advanced Bayesian Learning - PhD course, 8 credits

Course information

The typical participant is a PhD student in Statistics or related fields (Mathematical Statistics, Engineering Science, Quantitative Finance, Computer Science, ...). The participants are expected to have taken a basic course in Bayesian methods, for example Bayesian Learning at Stockholm University, and to have some experience with programming.

Examination and Grades: The course is graded Pass or Fail. Examination is through individual reports on distributed problems for each topic. Many of the problems will require computer implementations of Bayesian learning algorithms.

Course organization: The course is organized in four topics, each covered in four lecture hours. Course participants will spend most of their study time solving the problem sets for each topic on their own computers, without supervision.

All lectures are given online using Zoom.

Welcome!

Mattias Villani
Professor of Statistics, Stockholm University


Topic 1 - Gaussian process regression and classification

Reading: Gaussian Processes for Machine Learning - Chapter 1, Sections 2.1-2.5, 3.1-3.4, 3.7 and 4.1-4.3.
Code: GPML for Matlab | GPy for Python | gausspr in R | GaussianProcesses.jl in Julia | GPyTorch for GPs in PyTorch (a minimal GP regression sketch in Julia appears at the end of this topic)
Other material: Visualize GP kernels

Lecture 1 - May 3, hours 10-12
slides
Lecture 2 - May 3, hours 13-15
slides

Lab Topic 1
Problems | Lidar data
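
For orientation before the lab, here is a minimal sketch of GP regression with a squared-exponential kernel, written in base Julia (LinearAlgebra only) and following the standard Cholesky-based predictive equations from Rasmussen and Williams, Chapter 2. The hyperparameter values and the toy sine data are illustrative assumptions, not the lab's Lidar setup.

```julia
using LinearAlgebra, Random

# Squared-exponential kernel with length scale ℓ and signal standard deviation σf
se_kernel(x, z; ℓ = 1.0, σf = 1.0) = σf^2 * exp(-0.5 * (x - z)^2 / ℓ^2)

# GP regression: posterior mean and variance on a test grid xstar,
# assuming Gaussian noise with standard deviation σn
function gp_regression(x, y, xstar; ℓ = 1.0, σf = 1.0, σn = 0.1)
    K  = [se_kernel(xi, xj; ℓ = ℓ, σf = σf) for xi in x, xj in x]
    Ks = [se_kernel(xi, xs; ℓ = ℓ, σf = σf) for xi in x, xs in xstar]
    L  = cholesky(Symmetric(K + σn^2 * I)).L
    α  = L' \ (L \ y)                         # solves (K + σn² I) α = y
    μ  = Ks' * α                              # posterior mean at xstar
    V  = L \ Ks
    s2 = σf^2 .- vec(sum(V .^ 2, dims = 1))   # posterior variance at xstar
    return μ, s2
end

# Toy usage on noisy sine data (made-up illustration)
Random.seed!(1)
x = collect(range(0, 2π, length = 30))
y = sin.(x) .+ 0.1 .* randn(length(x))
xstar = collect(range(0, 2π, length = 200))
μ, s2 = gp_regression(x, y, xstar)
```

In practice the hyperparameters (ℓ, σf, σn) would typically be estimated, for example by maximizing the log marginal likelihood, which the libraries listed above handle for you.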


Topic 2 - Mixture models and Bayesian nonparametrics

Reading: Bayesian Data Analysis - Chapter 23 | The Neal (2000) article on MCMC for Dirichlet Process Mixtures

Widgets: Dirichlet distribution | Bayes for multinomial data | mixture of normals | mixture of Poissons (a small Gibbs sampler sketch for a finite mixture of normals appears at the end of this topic)

Lecture 3 - May 17, hours 10-12
slides
Lecture 4 - May 17, hours 13-15
slides | derivation of the marginal Gibbs sampler

Lab Topic 2
Problems | Galaxy data
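
Before moving to the Dirichlet process machinery, a finite mixture of normals with a standard Gibbs sampler is a useful warm-up. The sketch below assumes the Distributions.jl package, a known common component standard deviation σ, and conjugate priors; the hyperparameter values and the toy data are illustrative assumptions, not the Galaxy-data lab specification.

```julia
using Distributions, Random

# Gibbs sampler for a K-component mixture of normals with known component
# standard deviation σ, priors μ_k ~ N(m0, τ0²) and weights w ~ Dirichlet(α, ..., α)
function gibbs_mixture(y; K = 2, iters = 2000, σ = 1.0, m0 = 0.0, τ0 = 10.0, α = 1.0)
    n = length(y)
    μ = y[rand(1:n, K)]                  # crude initialization from the data
    w = fill(1 / K, K)
    z = rand(1:K, n)
    draws = zeros(iters, K)
    for it in 1:iters
        # 1. Allocation step: sample z_i | μ, w
        for i in 1:n
            p = [w[k] * pdf(Normal(μ[k], σ), y[i]) for k in 1:K]
            z[i] = rand(Categorical(p ./ sum(p)))
        end
        # 2. Component means: conjugate normal update given the allocations
        for k in 1:K
            yk = y[z .== k]
            nk = length(yk)
            τn2 = 1 / (nk / σ^2 + 1 / τ0^2)
            mn  = τn2 * (sum(yk) / σ^2 + m0 / τ0^2)
            μ[k] = rand(Normal(mn, sqrt(τn2)))
        end
        # 3. Weights: Dirichlet update from the component counts
        w = rand(Dirichlet(α .+ [sum(z .== k) for k in 1:K]))
        draws[it, :] = sort(μ)           # sort to reduce label switching in summaries
    end
    return draws
end

# Toy usage: two well-separated clusters (made-up illustration)
Random.seed!(2)
y = vcat(randn(150) .- 2.0, randn(100) .+ 2.0)
draws = gibbs_mixture(y; K = 2)
```

The same allocate-then-update structure carries over to the Dirichlet process mixture samplers in Neal (2000), where the finite Dirichlet prior on the weights is replaced by the Chinese restaurant process conditionals.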


Topic 3 - Variational inference

Reading: Blei et al. (JASA) | Tran's VI Notes
Other material: Kullback-Leibler widget | My recent talk with some VI content | Natural gradient notes | autograd in Python | ForwardDiff in Julia (a minimal CAVI sketch appears at the end of this topic)

Lecture 5 - May 31, hours 10-12
slides
Lecture 6 - May 31, hours 13-15
slides

Lab Topic 3
Problems | Time series data
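
As a concrete reference point for coordinate ascent variational inference (CAVI), here is a sketch for the conjugate model y_i ~ N(μ, 1/τ) with priors μ | τ ~ N(μ0, 1/(λ0 τ)) and τ ~ Gamma(a0, b0), using the mean-field factorization q(μ, τ) = q(μ) q(τ). The model and hyperparameter choices are a textbook-style illustration, not the time series model in the lab.

```julia
# Coordinate ascent variational inference (CAVI) for y_i ~ N(μ, 1/τ) with
# priors μ | τ ~ N(μ0, 1/(λ0 τ)) and τ ~ Gamma(a0, b0) (shape/rate), using the
# mean-field factorization q(μ, τ) = q(μ) q(τ), where q(μ) = N(m, s²) and q(τ) = Gamma(a, b).
function cavi_normal(y; μ0 = 0.0, λ0 = 1.0, a0 = 1.0, b0 = 1.0, iters = 50)
    n = length(y)
    ybar = sum(y) / n
    a = a0 + (n + 1) / 2                  # shape of q(τ) is fixed by the model
    b = b0                                # initial rate of q(τ)
    m = (λ0 * μ0 + n * ybar) / (λ0 + n)   # mean of q(μ) is available in closed form
    s2 = 1.0
    for _ in 1:iters
        Eτ = a / b                        # E_q[τ] under the current q(τ)
        s2 = 1 / ((λ0 + n) * Eτ)          # update the variance of q(μ)
        Eμ, Eμ2 = m, m^2 + s2             # first two moments of q(μ)
        # Update the rate of q(τ): b = b0 + 0.5 E_q[ λ0 (μ - μ0)² + Σ_i (y_i - μ)² ]
        b = b0 + 0.5 * (λ0 * (Eμ2 - 2 * μ0 * Eμ + μ0^2) +
                        sum(y .^ 2) - 2 * Eμ * sum(y) + n * Eμ2)
    end
    return (m = m, s2 = s2, a = a, b = b)
end

# Toy usage: the variational posterior for μ should concentrate near 3 (made-up illustration)
res = cavi_normal(randn(500) .+ 3.0)
```

Each update sets one factor to the exponentiated expected log joint under the other factor, which is the same coordinate-ascent scheme used for the richer models discussed in the Blei et al. reading.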


Topic 4 - Bayesian regularization and variable selection

Reading: Sections 12.2-12.5 and 14.6 of the Bayesian Learning book | Handbook chapter on variable selection | Article on Bayesian regularization (a spike-and-slab Gibbs sketch appears at the end of this topic)

Lecture 7 - Sept 6, hours 10-12
slides
Lecture 8 - Sept 6, hours 13-15
slides

Lab Topic 4
Problems
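
To illustrate the mechanics of Bayesian variable selection, here is a Gibbs sampler sketch for linear regression with a spike-and-slab prior (a point mass at zero mixed with a N(0, τ²) slab) and known noise variance; the inclusion step integrates out β_j analytically. The prior, hyperparameter values, and toy data are illustrative assumptions, not the lab's setup.

```julia
using LinearAlgebra, Random, Statistics

# Gibbs sampler for spike-and-slab linear regression with known noise variance σ²:
# β_j = 0 with probability 1 - θ, and β_j ~ N(0, τ²) with probability θ.
function spike_slab_gibbs(y, X; σ2 = 1.0, τ2 = 1.0, θ = 0.5, iters = 2000)
    n, p = size(X)
    β = zeros(p)
    βdraws = zeros(iters, p)
    for it in 1:iters
        for j in 1:p
            r  = y - X * β + X[:, j] * β[j]              # residual with x_j excluded
            vj = 1 / (dot(X[:, j], X[:, j]) / σ2 + 1 / τ2)
            μj = vj * dot(X[:, j], r) / σ2
            # Log Bayes factor for including x_j, with β_j integrated out
            logbf = 0.5 * log(vj / τ2) + 0.5 * μj^2 / vj
            pincl = 1 / (1 + exp(log(1 - θ) - log(θ) - logbf))
            β[j] = rand() < pincl ? μj + sqrt(vj) * randn() : 0.0
        end
        βdraws[it, :] = β
    end
    return βdraws
end

# Toy usage: only the first two of five predictors matter (made-up illustration)
Random.seed!(3)
X = randn(200, 5)
y = X * [2.0, -1.0, 0.0, 0.0, 0.0] + randn(200)
βdraws = spike_slab_gibbs(y, X)
mean(βdraws[501:end, :] .!= 0, dims = 1)    # posterior inclusion probabilities after burn-in
```

Continuous shrinkage priors such as the horseshoe replace the discrete inclusion indicator with local and global scale parameters, but they lead to a related blocked Gibbs structure.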
