
Spring 2017

STAT 220: Bayesian Data Analysis


Instructor: Jun Liu; Office: 715 Science Center; Phone: 495-1600
Office Hours: Tu 2:30-3:30 & Th 12:00-1:00 or by appointment
Meetings: Tu 1:00-2:30, Th 1:00-2:30, SC 705.
Midterm Exam: 3/21/2017. Final Project: due 5/4/2017
Course Website: https://canvas.harvard.edu/courses/21804

COURSE DESCRIPTION: Begins with basic models, whose Bayesian answers are often similar to
classical ones, followed by hierarchical, mixture, and hidden Markov models. Discusses Bayesian
philosophy, difficulties of statistical inference, frequentist-Bayesian comparisons, model
checking/selection, sensitivity analysis, and variable selection. Introduces research topics in
Markov chain Monte Carlo, bioinformatics, Bayesian learning, signal processing, etc.
PREREQUISITE & REQUIREMENTS: STAT 110 and STAT 111, or equivalent. Students are expected
to participate in class learning (10%) and to complete homework assignments (30%), a midterm
exam (30%), and a final project (30%). The following topics are to be covered (subject to change).

1. Basics of Bayesian inference: (a) setting up a probability model; (b) using probability
theory and Bayes' rule; (c) simple examples: binomial, normal, and exponential families;
(d) basics of statistical inference: challenges and key ideas; (e) non-informative priors; (f)
exchangeability and modeling; (g) asymptotic normality (informal introduction).
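The conjugate binomial example in (c) can be sketched in a few lines. This is an illustrative snippet (the function and variable names are my own, not course materials): a Beta prior updated by binomial data yields a Beta posterior.

```python
def beta_binomial_posterior(successes, failures, a=1.0, b=1.0):
    """Conjugate update: a Beta(a, b) prior with binomial data
    yields a Beta(a + successes, b + failures) posterior."""
    a_post = a + successes
    b_post = b + failures
    posterior_mean = a_post / (a_post + b_post)
    return a_post, b_post, posterior_mean

# 7 successes in 10 trials under a uniform Beta(1, 1) prior:
a_post, b_post, mean = beta_binomial_posterior(7, 3)
# Posterior is Beta(8, 4); posterior mean is 8/12.
```

The same update applied sequentially, one observation at a time, gives the identical posterior, which illustrates the coherence of Bayes' rule.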
2. Multi-parameter models, normal with unknown mean and variance, the multivariate normal
distribution, multinomial models, Dirichlet process models, and examples. Computation and
simulation from arbitrary posterior distributions in two parameters.
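Simulation from a two-parameter posterior can be sketched for the normal model with unknown mean and variance. Under the standard noninformative prior p(mu, sigma2) proportional to 1/sigma2, the marginal posterior of sigma2 is a scaled inverse-chi-square and mu given sigma2 is normal, so exact joint draws are easy. A minimal standard-library sketch (names are illustrative):

```python
import random
import statistics

def sample_normal_posterior(y, n_draws=5000, seed=0):
    """Joint posterior draws for a normal model, prior p(mu, sigma2) ~ 1/sigma2:
    sigma2 | y ~ scaled Inv-chi2(n - 1, s^2), then mu | sigma2, y ~ N(ybar, sigma2/n)."""
    rng = random.Random(seed)
    n = len(y)
    ybar = statistics.fmean(y)
    s2 = statistics.variance(y)  # sample variance, divisor n - 1
    draws = []
    for _ in range(n_draws):
        # Scaled inverse-chi-square draw via a chi-square_{n-1} variate.
        chi2 = sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(n - 1))
        sigma2 = (n - 1) * s2 / chi2
        mu = rng.gauss(ybar, (sigma2 / n) ** 0.5)
        draws.append((mu, sigma2))
    return draws
```

Plotting the (mu, sigma2) pairs shows the joint posterior directly, which is the point of item 2: with two parameters, simulation replaces analytic marginalization.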
3. Inference from large samples and comparison to standard non-Bayesian methods.
4. Hierarchical models, estimating several means, estimating population parameters from data,
meta-analysis, and some examples. Emphasis on statistical thinking for complicated problems.
5. Computation I (non-iterative methods): integration, conjugate analysis, normal and Laplace
approximations, Monte Carlo, importance sampling, rejection sampling, sequential imputation.
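Of the non-iterative methods in item 5, importance sampling is easy to sketch: draw from a tractable proposal (here the prior) and weight each draw by the likelihood. The toy target below is the Beta(8, 4) posterior from a binomial model with 7 successes in 10 trials; all names are illustrative, not course code.

```python
import math
import random

def importance_sampling_mean(loglik, draw_proposal, n=20000, seed=1):
    """Self-normalized importance sampling estimate of E[theta | y],
    using the prior as the proposal and the likelihood as the weight."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n):
        theta = draw_proposal(rng)
        w = math.exp(loglik(theta))
        num += w * theta
        den += w
    return num / den

# Binomial log-likelihood (up to a constant): 7 successes, 3 failures.
loglik = lambda p: 7 * math.log(p) + 3 * math.log(1 - p)
est = importance_sampling_mean(loglik, lambda rng: rng.random())
# The exact posterior mean is 8/12.
```

The estimator is consistent whenever the proposal covers the posterior's support; its efficiency, measured by the effective sample size of the weights, degrades as prior and posterior diverge, which motivates the iterative samplers of item 7.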
6. Missing data problems and data augmentation methodology. Latent variable models; Student-t
models for robust inference and sensitivity analysis; Bayesian nonparametrics.
7. Computation II (iterative methods): the EM algorithm, the Newton-Raphson method,
Metropolis-Hastings algorithms, Gibbs sampling, and other MCMC methods.
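The Metropolis-Hastings idea in item 7 can be illustrated with a random-walk sampler on the same toy binomial posterior (Beta(8, 4) under a uniform prior). This is a minimal sketch with illustrative names, not the course's implementation:

```python
import math
import random

def random_walk_metropolis(logpost, init, n_iter=20000, step=0.2, seed=2):
    """Random-walk Metropolis: propose theta' = theta + N(0, step^2),
    accept with probability min(1, exp(logpost(theta') - logpost(theta)))."""
    rng = random.Random(seed)
    theta, lp = init, logpost(init)
    chain = []
    for _ in range(n_iter):
        prop = theta + rng.gauss(0.0, step)
        lp_prop = logpost(prop)
        if math.log(rng.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta)
    return chain

def logpost(p):
    # Unnormalized log posterior: 7 successes, 3 failures, uniform prior.
    if not 0.0 < p < 1.0:
        return -math.inf
    return 7 * math.log(p) + 3 * math.log(1 - p)

chain = random_walk_metropolis(logpost, init=0.5)
posterior_mean = sum(chain[5000:]) / len(chain[5000:])
# After discarding burn-in, this should be close to the exact value 8/12.
```

Only the unnormalized posterior is needed, since the normalizing constant cancels in the acceptance ratio; proposals outside (0, 1) get log posterior minus infinity and are always rejected.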
8. Model checking, sensitivity analysis, prior and posterior predictive checking; Bayesian model
selection for linear and logistic regression models.
9. Topics on Bayesian modeling and computation: hidden Markov models; Bayesian networks;
causal inference; variable selection; bioinformatics; finance; etc.
REFERENCES

Gelman, A., Carlin, J.B., Stern, H.S. and Rubin, D.B. (2014). Bayesian Data Analysis (3rd ed.),
Chapman & Hall: London. (Textbook)

Liu, J.S. (2001). Monte Carlo Strategies in Scientific Computing, Springer-Verlag: New York.

Cox, D.R. (2006). Principles of Statistical Inference. Cambridge Univ. Press, Cambridge.
