CSC412/2506 Winter 2023
Probabilistic Machine Learning
The language of probability allows us to coherently and automatically account for uncertainty. This course teaches you how to build, fit, and perform inference in probabilistic models. These models let us generate novel images and text, find meaningful latent representations of data, take advantage of large unlabeled datasets, and even perform analogical reasoning automatically. You will learn the basic building blocks of these models and the computational tools needed to use them.
What you will learn
- Standard statistical learning algorithms, when to use them, and their limitations.
- The main elements of probabilistic models (distributions, expectations, latent variables, neural networks) and how to combine them.
- Standard computational tools (Monte Carlo methods, stochastic optimization, regularization, automatic differentiation).
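As a taste of the first tool on this list, a Monte Carlo estimate of an expectation can be sketched in a few lines of Python. This is a toy illustration, not course material; the distribution and test function here are arbitrary choices:

```python
import random

# Monte Carlo sketch: estimate E[X^2] for X ~ N(0, 1), whose true value is 1.0,
# by averaging f(x) = x^2 over independent samples. The estimate's error
# shrinks at rate O(1/sqrt(n)) regardless of dimension.
random.seed(0)
n = 100_000
samples = [random.gauss(0.0, 1.0) for _ in range(n)]
estimate = sum(x * x for x in samples) / n
print(estimate)  # close to 1.0
```

The same recipe (sample, evaluate, average) underlies the more sophisticated samplers covered later in the course, such as MCMC.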
There are no required textbooks. The following are useful references:
- (PRML) Christopher M. Bishop (2006), Pattern Recognition and Machine Learning
- (DL) Ian Goodfellow, Yoshua Bengio, and Aaron Courville (2016), Deep Learning
- (MLPP) Kevin P. Murphy (2012), Machine Learning: A Probabilistic Perspective
- (PMLAT) Kevin P. Murphy (2023), Probabilistic Machine Learning: Advanced Topics
- (ESL) Trevor Hastie, Robert Tibshirani, and Jerome Friedman (2009), The Elements of Statistical Learning
- (ISL) Gareth James, Daniela Witten, Trevor Hastie, and Robert Tibshirani (2017), An Introduction to Statistical Learning
- (ITIL) David MacKay (2003), Information Theory, Inference, and Learning Algorithms
Assessments
- Assignment 1 (13.3%) - Released Jan 23rd, Due Feb 5th 11:59PM
- Assignment 2 (13.3%) - Released Feb 6th, Due Feb 19th 11:59PM
- Midterm (20%) - During the Monday and Wednesday lectures, Feb 27th and Mar 1st
- Assignment 3 (13.4%) - Released Mar 9th, Due Mar 27th 11:59PM
- Final (40%) - April 24th 9:00AM
Week 1 - Jan 9th - Course Overview and Graphical Model Notation
- Class Intro
- Topics covered
- Exponential Family
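As a reminder of the standard notation (not course-specific; symbol names are the conventional ones), a distribution in the exponential family can be written as

```latex
p(x \mid \eta) = h(x)\, \exp\!\big( \eta^\top T(x) - A(\eta) \big)
```

where $\eta$ are the natural parameters, $T(x)$ the sufficient statistics, $h(x)$ the base measure, and $A(\eta)$ the log-partition function that normalizes the density. Many familiar distributions (Gaussian, Bernoulli, Poisson) fit this form.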
Week 2 - Graphical Model Notation, Decision Theory and Parametrizing Probabilistic Models
Week 3 - Latent variables and Exact Inference
Week 4 - Message Passing + Sampling
Week 5 - MCMC
Week 6 - Variational Inference
Week 7 - Feb 21st & 22nd - No classes - Reading week
Week 8 - Midterm Week
Week 9 - Stochastic Variational Inference and Variational Autoencoders
Week 10 - Embeddings
Week 11 - Kernel Methods, Attention
Week 12 - Gaussian Processes
Week 13 - Diffusion, Final Exam Review