Lecture notes and videos

  1. Introduction to Statistical Computing and Probability and Statistics
  2. Sum and Product Rules, Conditional Probability, Independence, PDF and CDF, Bernoulli, Categorical and Multinomial Distributions
  3. Binomial, Categorical, Multinomial, Poisson, Student's T, Laplace, Gamma, Beta and Pareto Distributions
  4. Variance/Covariance, Correlation/Independence, Transformation of Densities, Multivariate Gaussian, Dirichlet and Student's T
  5. Central Limit Theorem, Markov and Chebyshev Inequalities, Law of Large Numbers, and Introduction to MC Methods
  6. Introduction to Information Theory
  7. Introduction to Bayesian Statistics
  8. Bayesian Inference for the Mean and Precision for the Univariate and Multivariate Gaussian
  9. Exponential Family of Distributions
  10. Generalized Linear Models and the Exponential Family, Conditional Gaussian Systems, Information Form of the Gaussian
  11. Hierarchical Prior Models
  12. Introduction to Bayesian Linear Regression
  13. Bayesian Model Selection
  14. Bayesian Linear Regression (Continued)
  15. Implementation of Bayesian Regression and Variable Selection
  16. Implementation of Bayesian Regression and Variable Selection (Continued)
  17. The Evidence Approximation, Variable and (Regression) Model Selection
  18. Introduction to Monte Carlo Methods, Sampling from Discrete and Continuous Distributions
  19. Inverse Sampling, Transformation Methods, Composition Methods, Accept-Reject Methods, Stratified/Systematic Sampling
  20. Accept-Reject Methods, Stratified/Systematic Sampling and Introduction to Importance Sampling
  21. Importance Sampling
  22. Gibbs Sampling
  23. Markov Chain Monte Carlo and Metropolis-Hastings Algorithm (see the sketch after this list)
  24. Introduction to State Space Models and Sequential Importance Sampling
  25. Sequential Importance Sampling (Continued)
  26. Sequential Importance Sampling with Resampling
  27. Sequential Importance Sampling with Resampling (Continued)
  28. Sequential Monte Carlo (Continued) and Conditional Linear Gaussian Models
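
As a taste of the sampling algorithms in the second half of this list (lectures 18-23), the following is a minimal sketch of a random-walk Metropolis-Hastings sampler in Python with NumPy. The standard Gaussian target, proposal step size, and chain length are illustrative placeholders, not course material.

    import numpy as np

    def log_target(x):
        # Placeholder: log-density of a standard Gaussian target (up to a constant).
        return -0.5 * x**2

    def metropolis_hastings(n_samples=10000, step=1.0, x0=0.0, seed=0):
        # Random-walk Metropolis-Hastings with a symmetric Gaussian proposal.
        rng = np.random.default_rng(seed)
        x = x0
        samples = np.empty(n_samples)
        for i in range(n_samples):
            x_prop = x + step * rng.standard_normal()        # propose a move
            log_alpha = log_target(x_prop) - log_target(x)   # acceptance log-ratio
            if np.log(rng.uniform()) < log_alpha:
                x = x_prop                                   # accept the proposal
            samples[i] = x                                   # else keep the current state
        return samples

    samples = metropolis_hastings()
    print(samples.mean(), samples.std())  # should be close to 0 and 1 for this target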

Homework


Course info and references

Credit: 4 Units

Lectures: Tuesdays & Thursdays 12:30 -- 1:45 pm, DeBartolo Hall 126.

Makeup Lectures / Recitation: Fridays 11:30 am -- 12:20 pm, DeBartolo Hall 126.

Professor: Nicholas Zabaras, 311 I Cushing Hall, nzabaras@gmail.com

Teaching Assistants: Nicholas Geneva, ngeneva@nd.edu; Govinda Anantha-Padmanabha, ganantha@nd.edu; Navid Shervani-Tabar, nshervan@nd.edu

Office hours: (NZ) Mondays and Fridays, 1:00 -- 2:00 pm (also by appointment), 311 I Cushing; (TAs) Mondays 5:00 -- 7:00 pm, 125 DeBartolo Hall.

Course description: The course covers selected topics in Bayesian scientific computing relevant to high-dimensional, data-driven engineering and scientific applications. An overview of Bayesian computational statistics methods will be provided, including Monte Carlo methods, exploration of posterior distributions, model selection and validation, MCMC and sequential MC methods, and inference in probabilistic graphical models. Bayesian techniques for building surrogate models of expensive computer codes will be introduced, including regression methods for uncertainty quantification, Gaussian process modeling, and others. The course will demonstrate these techniques with a variety of scientific and engineering applications, including, among others, inverse problems, dynamical system identification, tracking and control, uncertainty quantification of complex multiscale systems, physical modeling in random media, and optimization/design in the presence of uncertainties. Students will be encouraged to integrate the course tools with their own research topics.
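
As a minimal illustration of the regression-based surrogate modeling theme above, here is a sketch of conjugate Bayesian linear regression, computing the Gaussian posterior over the weights of a linear model. It assumes Python with NumPy; the prior precision alpha, noise precision beta, and the toy data are placeholder assumptions, not values from the course.

    import numpy as np

    # Toy data (placeholder): noisy observations of y = 2x + 1.
    rng = np.random.default_rng(0)
    x = rng.uniform(-1.0, 1.0, size=20)
    y = 2.0 * x + 1.0 + 0.1 * rng.standard_normal(20)

    Phi = np.column_stack([np.ones_like(x), x])  # design matrix with a bias feature
    alpha, beta = 1.0, 100.0                     # assumed prior and noise precisions

    # Gaussian posterior over weights: N(m, S) with
    #   S^{-1} = alpha * I + beta * Phi^T Phi,   m = beta * S Phi^T y
    S_inv = alpha * np.eye(2) + beta * Phi.T @ Phi
    S = np.linalg.inv(S_inv)
    m = beta * S @ Phi.T @ y
    print("posterior mean weights:", m)  # should be close to [1, 2]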

Intended audience: Graduate Students in Mathematics/Statistics, Computer Science, Engineering, Physical/Chemical/Biological/Life Sciences.

References of General Interest: The course lectures will become available on the course web site. For in-depth study, a list of articles and book chapters from the current literature will also be provided to enhance the material of the lectures. There is no required text for this course. Some important books that can be used for general background reading in the subject areas of the course include the following:

Homework: Will be assigned approximately every three to four lectures. Most of the homework will require implementation and application of algorithms discussed in class. We anticipate five to seven homework sets. All homework solutions and affiliated computer programs should be submitted on the course Sakai page by midnight of the due date. Late homework will be accepted up to three days past the original deadline, with a penalty of 10% per day. All attachments should arrive in a named zipped directory (e.g. HW1_LastName_FirstName.zip). We would prefer the homework in typed format; include in your submission all original files (e.g. LaTeX sources) and a README file for compiling and testing your software.

Term project: A project is required in mathematical or computational aspects of the course. Students are encouraged to investigate aspects of Bayesian computing relevant to their own research. A short written report (in the format of NIPS papers) is required, as well as a presentation. Project presentations will be given at the end of the semester as part of a one- or two-day symposium.

Grading: Homework 60% and Project 40%.

Prerequisites: Linear Algebra, Probability Theory, Introduction to Statistics, and Programming (any language). The course will require significant effort, especially from those not familiar with computational statistics. It is intended for those who value the role of Bayesian inference and machine learning in their research.


Syllabus

  1. Review of probability and statistics
  2. Introduction to Bayesian Statistics
  3. Introduction to Monte Carlo Methods
  4. Markov Chain Monte Carlo Methods
  5. Sequential Monte Carlo Methods and applications
  6. Uncertainty Quantification Methods
  7. Uncertainty Quantification using Graph Theoretic Approaches