Modern Monte Carlo methods II: Markov chain Monte Carlo

Tom Griffiths
University of California, Berkeley (UC Berkeley)

Performing the computations involved in Bayesian inference can be extremely demanding, particularly with large hypothesis spaces. I will talk about one of the main tools that have been developed in statistics for working with complex probabilistic models -- Markov chain Monte Carlo (MCMC). The basic idea behind MCMC is to define a Markov chain that has the distribution from which we seek to sample as its stationary distribution. I will outline the two main methods used to do this -- Metropolis-Hastings and Gibbs sampling -- and illustrate their application with an example.
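
The abstract itself contains no code; the following is a minimal sketch, in Python, of the random-walk Metropolis-Hastings idea described above: propose a move, then accept it with probability proportional to the ratio of target densities, so that the chain's stationary distribution is the target. The function name, the Gaussian proposal, and the standard-normal example target are illustrative assumptions, not material from the talk.

```python
import numpy as np

def metropolis_hastings(log_target, n_samples, x0, proposal_scale=1.0, rng=None):
    """Random-walk Metropolis-Hastings for a 1-D target.

    log_target: unnormalized log density of the distribution we want to sample.
    (Illustrative sketch; names and defaults are assumptions.)
    """
    rng = np.random.default_rng() if rng is None else rng
    samples = np.empty(n_samples)
    x = x0
    log_p = log_target(x)
    for i in range(n_samples):
        # Symmetric Gaussian proposal, so the Hastings correction cancels.
        x_new = x + proposal_scale * rng.normal()
        log_p_new = log_target(x_new)
        # Accept with probability min(1, p(x_new) / p(x)).
        if np.log(rng.uniform()) < log_p_new - log_p:
            x, log_p = x_new, log_p_new
        samples[i] = x
    return samples

# Example: sample from a standard normal given only its unnormalized log density.
draws = metropolis_hastings(lambda x: -0.5 * x**2, n_samples=5000, x0=0.0)
print(draws.mean(), draws.std())
```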


