Motivation:
Taking this course builds a strong foundation in Bayesian statistics, which matters for deep learning research because of its emphasis on uncertainty quantification and principled decision-making. Bayesian methods let researchers develop more robust and interpretable models: they incorporate prior knowledge, quantify what a model does not know, and often improve generalization when data are limited. The four-week plan below moves from the fundamentals of Bayesian inference through Bayesian modeling and machine learning to modern Bayesian deep learning.
Table of Contents:
Week 1: Introduction to Bayesian Statistics
Day 1: Difference between frequentist and Bayesian statistics
- Reading: "Bayesian Data Analysis" by Gelman et al., Chapter 1
- Online Resource: Frequentist vs. Bayesian Statistics Explained
Day 2: Bayes' theorem and conditional probability
- Reading: "Bayesian Data Analysis" by Gelman et al., Chapter 2
- Online Resource: Bayesian Statistics Explained in Simple English
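A minimal sketch to try alongside this reading, applying Bayes' theorem to the classic diagnostic-test example (all numbers are illustrative):

```python
# Bayes' theorem: P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
# Illustrative numbers: 1% prevalence, 95% sensitivity, 90% specificity.
prior = 0.01            # P(disease)
sensitivity = 0.95      # P(positive | disease)
specificity = 0.90      # P(negative | no disease)

# Law of total probability: marginal probability of a positive test.
p_positive = sensitivity * prior + (1 - specificity) * (1 - prior)

# Posterior probability of disease given a positive test.
posterior = sensitivity * prior / p_positive
print(f"P(disease | positive) = {posterior:.3f}")
```

Despite the 95% sensitivity, the posterior is only about 8.8%, because the low prior prevalence dominates. This is exactly the kind of reasoning Chapter 2 formalizes.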
Day 3: Bayesian inference and Bayesian updating
- Reading: "Bayesian Data Analysis" by Gelman et al., Chapter 3
- Online Resource: Bayesian Updating with Discrete Priors
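A small sketch of Bayesian updating over a discrete set of hypotheses, in the spirit of the resource above (the hypothesis grid and data are illustrative):

```python
# Bayesian updating over a discrete set of hypotheses for a coin's bias.
hypotheses = [0.3, 0.5, 0.7]      # candidate values of P(heads)
prior = [1 / 3, 1 / 3, 1 / 3]     # uniform prior over hypotheses

def update(belief, likelihoods):
    """One step of Bayes' rule: posterior is proportional to prior * likelihood."""
    unnorm = [p * l for p, l in zip(belief, likelihoods)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

# Observe heads three times; the likelihood of heads under hypothesis theta is theta.
posterior = prior
for _ in range(3):
    posterior = update(posterior, hypotheses)

print(posterior)  # probability mass shifts toward theta = 0.7
```

Each observation reuses the previous posterior as the new prior, which is the essence of sequential Bayesian updating.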
Day 4: Prior and posterior distributions
- Reading: "Bayesian Data Analysis" by Gelman et al., Chapter 4
- Online Resource: Picking Priors: A Guide for the Perplexed
Day 5: Conjugate priors and hierarchical models
- Reading: "Bayesian Data Analysis" by Gelman et al., Chapter 5
- Online Resource: Conjugate Bayesian Analysis of the Gaussian Distribution
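A quick sketch of conjugacy using the Beta-Binomial pair (prior pseudo-counts and data are illustrative):

```python
# Conjugate prior demo: a Beta prior on a coin's heads-probability stays Beta
# after observing Binomial data -- only the parameters change:
# Beta(a, b) prior + (h heads, t tails) -> Beta(a + h, b + t) posterior.
a, b = 2.0, 2.0          # prior pseudo-counts
heads, tails = 7, 3      # observed data

a_post, b_post = a + heads, b + tails
posterior_mean = a_post / (a_post + b_post)
print(f"Posterior: Beta({a_post}, {b_post}), mean = {posterior_mean:.3f}")
```

Because the posterior has the same functional form as the prior, the update is a one-line parameter change with no numerical integration, which is why conjugate families are the standard warm-up before MCMC.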
Days 6-7: Review and practice problems
- Reading: "Bayesian Data Analysis" by Gelman et al., review Chapters 1-5 and associated exercises
- Online Resource: Bayesian Inference - Examples and Exercises
Week 2: Bayesian Modeling
Day 1: Model selection and Bayesian model averaging
- Reading: "Bayesian Data Analysis" by Gelman et al., Chapter 6
- Online Resource: Bayesian Model Averaging: A Tutorial
Day 2: Markov Chain Monte Carlo (MCMC) methods
- Reading: "Bayesian Data Analysis" by Gelman et al., Chapter 11
- Online Resource: MCMC: Markov Chain Monte Carlo
Day 3: Gibbs sampling and Metropolis-Hastings algorithm
- Reading: "Bayesian Data Analysis" by Gelman et al., Chapter 12
- Online Resource: Gibbs Sampling for Bayesian Inference
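A minimal random-walk Metropolis-Hastings sketch targeting a standard normal, useful for checking your understanding of the accept/reject step (proposal scale and iteration counts are illustrative, not tuned):

```python
import random
import math

# Random-walk Metropolis-Hastings targeting N(0, 1).
def log_target(x):
    return -0.5 * x * x   # log density of N(0, 1) up to a constant

random.seed(0)
x = 0.0
samples = []
for _ in range(20000):
    proposal = x + random.gauss(0.0, 1.0)       # symmetric proposal
    log_accept = log_target(proposal) - log_target(x)
    if math.log(random.random()) < log_accept:  # accept with prob min(1, ratio)
        x = proposal
    samples.append(x)

burned = samples[5000:]   # discard burn-in
mean = sum(burned) / len(burned)
var = sum((s - mean) ** 2 for s in burned) / len(burned)
print(f"mean = {mean:.3f}, var = {var:.3f}")   # should approach 0 and 1
```

Because the proposal is symmetric, the Hastings correction cancels and only the target-density ratio appears in the acceptance probability; Gibbs sampling is the special case where each proposal is a draw from a full conditional and is always accepted.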
Day 4: Hamiltonian Monte Carlo (HMC) and No-U-Turn Sampler (NUTS)
- Reading: "Bayesian Data Analysis" by Gelman et al., Chapter 14
- Online Resource: A Conceptual Introduction to Hamiltonian Monte Carlo
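A bare-bones 1-D HMC sketch with the leapfrog integrator, again targeting a standard normal (step size and trajectory length are illustrative; NUTS exists precisely to avoid tuning them by hand):

```python
import random
import math

random.seed(0)

def U(x):
    return 0.5 * x * x        # potential energy: negative log N(0, 1) density

def grad_U(x):
    return x

def hmc_step(x, eps=0.2, L=20):
    """One HMC transition: sample momentum, simulate dynamics, accept/reject."""
    p = random.gauss(0.0, 1.0)
    x_new, p_new = x, p
    # Leapfrog integration of Hamiltonian dynamics.
    p_new -= 0.5 * eps * grad_U(x_new)
    for _ in range(L - 1):
        x_new += eps * p_new
        p_new -= eps * grad_U(x_new)
    x_new += eps * p_new
    p_new -= 0.5 * eps * grad_U(x_new)
    # Metropolis correction on the change in total energy.
    h0 = U(x) + 0.5 * p * p
    h1 = U(x_new) + 0.5 * p_new * p_new
    return x_new if math.log(random.random()) < h0 - h1 else x

x = 0.0
samples = []
for _ in range(5000):
    x = hmc_step(x)
    samples.append(x)

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

The gradient-guided trajectories are why HMC mixes far better than a random walk in high dimensions; the leapfrog integrator is used because it is reversible and volume-preserving, which keeps the Metropolis correction valid.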
Day 5: Approximate Bayesian Computation (ABC)
- Reading: "Bayesian Data Analysis" by Gelman et al., Chapter 15
- Online Resource: Approximate Bayesian Computation: A Nonparametric Perspective
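A rejection-ABC sketch for a case where the exact posterior is known, so you can check the approximation (the coin example and tolerance are illustrative):

```python
import random

# Rejection ABC: infer a coin's bias without ever evaluating a likelihood.
# Draw a parameter from the prior, simulate data, and keep the draw only if
# the simulated summary statistic is close enough to the observed one.
random.seed(1)
observed_heads = 7     # observed: 7 heads in 10 flips
n_flips = 10
tolerance = 0          # require an exact match on the summary statistic

accepted = []
for _ in range(20000):
    theta = random.random()                                   # draw from Uniform(0, 1) prior
    simulated = sum(random.random() < theta for _ in range(n_flips))
    if abs(simulated - observed_heads) <= tolerance:
        accepted.append(theta)

approx_posterior_mean = sum(accepted) / len(accepted)
# The exact posterior here is Beta(8, 4), whose mean is 8/12.
print(f"ABC posterior mean = {approx_posterior_mean:.3f}")
```

In real ABC applications the likelihood is intractable and the tolerance must be positive, trading bias for acceptance rate; this toy case just makes the approximation verifiable.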
Days 6-7: Review and practice problems
- Reading: "Bayesian Data Analysis" by Gelman et al., review Chapters 6, 11, 12, 14, and 15, and associated exercises
- Online Resource: MCMC and Bayesian Modeling
Week 3: Bayesian Methods in Machine Learning
Day 1: Bayesian linear regression and Bayesian logistic regression
- Reading: "Pattern Recognition and Machine Learning" by Christopher Bishop, Chapter 3
- Online Resource: Bayesian Linear Regression Models with PyMC3
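Bayesian linear regression has a closed-form posterior, which makes it a good first implementation exercise. A sketch of the standard equations from Bishop's Chapter 3 (the precisions, data-generating weights, and noise level are illustrative):

```python
import numpy as np

# Closed-form Bayesian linear regression:
#   prior       w ~ N(0, alpha^-1 I)
#   likelihood  y ~ N(Xw, beta^-1 I)
#   posterior   w ~ N(m, S), with S^-1 = alpha I + beta X^T X and m = beta S X^T y.
rng = np.random.default_rng(0)
alpha, beta = 1.0, 25.0          # prior precision and noise precision

true_w = np.array([0.5, -1.0])   # intercept and slope used to generate data
x = rng.uniform(-1, 1, size=50)
X = np.column_stack([np.ones_like(x), x])   # design matrix with a bias column
y = X @ true_w + rng.normal(0, 0.2, size=50)

S_inv = alpha * np.eye(2) + beta * X.T @ X
S = np.linalg.inv(S_inv)
m = beta * S @ X.T @ y           # posterior mean of the weights
print("posterior mean:", m)     # close to true_w; S gives the uncertainty
```

Unlike ordinary least squares, the output is a full distribution over the weights: `S` quantifies how uncertain each coefficient still is, and `alpha` acts as a principled ridge penalty.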
Day 2: Gaussian processes
- Reading: "Pattern Recognition and Machine Learning" by Christopher Bishop, Chapter 6
- Online Resource: Gaussian Processes for Dummies
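A compact GP-regression sketch with the squared-exponential (RBF) kernel, using the standard posterior equations (training points, length scale, and noise level are illustrative):

```python
import numpy as np

# Gaussian process regression with an RBF kernel.
def rbf(a, b, length_scale=1.0):
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

X_train = np.array([-2.0, -1.0, 0.0, 1.5])
y_train = np.sin(X_train)              # toy targets
X_test = np.array([0.0, 0.5])          # one training input, one new input
noise = 1e-4

K = rbf(X_train, X_train) + noise * np.eye(len(X_train))
K_s = rbf(X_train, X_test)

# Standard GP posterior mean and covariance at the test points.
mean = K_s.T @ np.linalg.solve(K, y_train)
var = rbf(X_test, X_test) - K_s.T @ np.linalg.solve(K, K_s)
print("mean:", mean)   # near-interpolation at x = 0, a smooth estimate at x = 0.5
```

Note that with near-zero noise the posterior mean passes (almost) exactly through the training points, while the posterior variance grows with distance from the data, which is the GP's built-in uncertainty estimate.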
Day 3: Bayesian neural networks
- Reading: "Pattern Recognition and Machine Learning" by Christopher Bishop, Chapter 5
- Online Resource: Bayesian Neural Networks in PyMC3
Day 4: Bayesian model selection and Occam's razor
- Reading: "Pattern Recognition and Machine Learning" by Christopher Bishop, Chapter 3.5
- Online Resource: Bayesian Model Selection
Day 5: Bayesian optimization
- Reading: "Practical Bayesian Optimization of Machine Learning Algorithms" by Jasper Snoek, Hugo Larochelle, and Ryan P. Adams
- Online Resource: Bayesian Optimization Explained
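A toy Bayesian-optimization loop: a GP surrogate plus the expected-improvement acquisition, minimizing a simple 1-D function on a grid. This is only a sketch; the objective, length scale, grid, and iteration count are all illustrative, and real use would rely on a library such as those discussed in the Snoek et al. paper:

```python
import numpy as np
from math import erf, sqrt, pi

def f(x):
    return (x - 0.3) ** 2          # toy objective to minimize

def rbf(a, b, ls=0.2):
    a, b = np.asarray(a, float), np.asarray(b, float)
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

grid = np.linspace(0.0, 1.0, 101)
X = [0.0, 1.0]                     # initial design points
for _ in range(8):
    Xa, ya = np.array(X), f(np.array(X))
    K = rbf(Xa, Xa) + 1e-6 * np.eye(len(Xa))
    Ks = rbf(Xa, grid)
    mu = Ks.T @ np.linalg.solve(K, ya)     # GP posterior mean on the grid
    var = np.clip(1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0), 1e-12, None)
    sd = np.sqrt(var)
    best = ya.min()
    z = (best - mu) / sd
    Phi = np.array([0.5 * (1 + erf(t / sqrt(2))) for t in z])  # standard normal CDF
    phi = np.exp(-0.5 * z ** 2) / sqrt(2 * pi)                 # standard normal PDF
    ei = (best - mu) * Phi + sd * phi      # expected improvement (minimization form)
    X.append(float(grid[np.argmax(ei)]))   # evaluate where EI is largest

best_x = min(X, key=f)
print(f"best x found: {best_x:.2f}")       # near the true minimizer 0.3
```

The key idea to take from the loop: each iteration spends one (expensive) objective evaluation where the surrogate's trade-off between low predicted value and high uncertainty is most favorable.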
Days 6-7: Review and practice problems
- Review chapters and resources from Week 3
- Practice implementing the concepts in Python using libraries like PyMC3 and GPy
Week 4: Bayesian Methods in Deep Learning
Day 1: Introduction to variational inference
- Reading: "Pattern Recognition and Machine Learning" by Christopher Bishop, Chapter 10
- Online Resource: Variational Inference: A Review for Statisticians
Day 2: Mean-field variational inference
- Reading: "Pattern Recognition and Machine Learning" by Christopher Bishop, Chapter 10.1
- Online Resource: Mean Field Variational Inference
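A sketch of the classic mean-field example from Bishop's Section 10.1: coordinate-ascent variational inference (CAVI) for a Gaussian with unknown mean and precision, factorizing q(mu, tau) = q(mu) q(tau). The priors and data are illustrative:

```python
import numpy as np

# CAVI for a Gaussian with unknown mean mu and precision tau:
#   q(mu) is Gaussian N(mu_N, 1/lam_N); q(tau) is Gamma(a_N, b_N).
# Each factor is updated holding the other's expectations fixed.
rng = np.random.default_rng(0)
x = rng.normal(5.0, 1.0, size=200)
N, xbar = len(x), x.mean()

mu0, lam0, a0, b0 = 0.0, 1e-3, 1e-3, 1e-3   # weak priors

E_tau = 1.0                      # initial guess for E_q[tau]
for _ in range(50):
    # Update q(mu) given the current expectation of tau.
    mu_N = (lam0 * mu0 + N * xbar) / (lam0 + N)
    lam_N = (lam0 + N) * E_tau
    # Update q(tau) using expectations under the current q(mu).
    a_N = a0 + (N + 1) / 2
    E_sq = np.sum((x - mu_N) ** 2) + N / lam_N
    b_N = b0 + 0.5 * (E_sq + lam0 * ((mu_N - mu0) ** 2 + 1 / lam_N))
    E_tau = a_N / b_N

print(f"q(mu) mean = {mu_N:.3f}, E[tau] = {E_tau:.3f}")
```

With weak priors the fixed point lands near the sample mean and the reciprocal of the sample variance, and each update is guaranteed not to decrease the evidence lower bound (ELBO).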
Day 3: Variational autoencoders (VAEs)
- Reading: "Auto-Encoding Variational Bayes" by Diederik P. Kingma and Max Welling
- Online Resource: Variational Autoencoders Explained
- Code Example: Variational Autoencoder in Keras
Day 4: Bayesian deep learning and uncertainty quantification
- Reading: "Bayesian Deep Learning" by Yarin Gal
- Online Resource: What My Deep Model Doesn't Know...
- Code Example: Bayesian Neural Networks with MC Dropout
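Alongside the code example above, here is a framework-free sketch of the MC dropout idea from Gal's work: keep dropout active at test time and average many stochastic forward passes, using the spread as an uncertainty estimate. The tiny network uses random fixed weights purely for illustration; in practice the weights would be trained:

```python
import numpy as np

# MC dropout: dropout stays on at prediction time; repeated stochastic forward
# passes give a predictive mean and an uncertainty estimate (the spread).
rng = np.random.default_rng(0)
W1 = rng.normal(0, 1, size=(1, 50))      # untrained weights, for illustration only
W2 = rng.normal(0, 0.3, size=(50, 1))
p_drop = 0.5

def forward(x, stochastic=True):
    h = np.maximum(0, x @ W1)                     # ReLU hidden layer
    if stochastic:
        mask = rng.random(h.shape) > p_drop       # dropout mask, kept at test time
        h = h * mask / (1 - p_drop)               # inverted-dropout scaling
    return h @ W2

x = np.array([[0.5]])
preds = np.array([forward(x)[0, 0] for _ in range(500)])
pred_mean, pred_std = preds.mean(), preds.std()
print(f"prediction = {pred_mean:.3f} +/- {pred_std:.3f}")
```

The inverted-dropout scaling keeps the stochastic passes unbiased relative to the deterministic network, so `pred_mean` approximates the usual prediction while `pred_std` adds the uncertainty signal that a single deterministic pass cannot provide.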
Day 5: Advanced topics and applications of Bayesian deep learning
- Reading: "Deep and Hierarchical Implicit Models" by Dustin Tran, Rajesh Ranganath, and David Blei
- Online Resource: Bayesian Deep Learning Part II: Bridging PyMC3 and Lasagne to build a Hierarchical Neural Network
- Code Example: Bayesian CNN with PyMC3 and Lasagne
Days 6-7: Review and practice problems
- Review chapters and resources from Week 4
- Practice implementing the concepts in Python using libraries like TensorFlow, PyMC3, and Keras