# A Gentle Introduction to Markov Chain Monte Carlo (MCMC)

Applying probabilistic models to data usually involves integrating a complex, multi-dimensional probability distribution. For example, calculating the expectation/mean of a model distribution involves such an integration. Often these integrals cannot be calculated analytically, either because of the high dimensionality of the distribution or because no closed-form expression for the integral is available. Markov Chain Monte Carlo (MCMC) is a method that allows one to approximate complex integrals using stochastic sampling routines. As MCMC's name indicates, the method is composed of two components, the **Markov chain** and **Monte Carlo integration**.

**Monte Carlo integration** is a powerful technique that exploits stochastic sampling of the distribution in question in order to approximate the difficult integration. However, in order to use Monte Carlo integration it is necessary to be able to sample from the probability distribution in question, which may be difficult or impossible to do directly. This is where the second component of MCMC, the **Markov chain**, comes in. A Markov chain is a sequential model that transitions from one state to another in a probabilistic fashion, where the next state that the chain takes is conditioned on the previous state. Markov chains are useful in that, if they are constructed properly and allowed to run for a long time, the states the chain visits sample from a target probability distribution. Therefore we can construct Markov chains to sample from the distribution whose integral we would like to approximate, then use Monte Carlo integration to perform the approximation.
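To make the first component concrete, here is a minimal sketch of Monte Carlo integration (in Python rather than the Matlab used in the posts), for a distribution we *can* sample directly. We approximate the expectation E[x²] under a standard normal, whose true value is 1, by averaging over random draws:

```python
import numpy as np

# Monte Carlo integration: approximate E[x^2] under a standard normal.
# Because we can sample this distribution directly, no Markov chain is
# needed yet; the estimate is just the sample average of x^2.
rng = np.random.default_rng(0)
samples = rng.standard_normal(100_000)
estimate = np.mean(samples**2)
print(f"Monte Carlo estimate of E[x^2]: {estimate:.3f}")  # close to 1.0
```

When direct sampling is impossible, the samples above would instead come from a suitably constructed Markov chain, which is precisely the combination that gives MCMC its name.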

Here I introduce a series of posts where I describe the basic concepts underlying MCMC, starting off by describing Monte Carlo integration, then giving a brief introduction to Markov chains and how they can be constructed to sample from a target probability distribution. Given these foundational principles, we can then discuss MCMC techniques such as the Metropolis and Metropolis-Hastings algorithms, the Gibbs sampler, and the Hybrid Monte Carlo algorithm.
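As a preview of where the series is headed, the sketch below shows the simplest of these techniques, the Metropolis algorithm, targeting a standard normal with a symmetric random-walk proposal. This is an illustrative Python sketch (the posts themselves use Matlab), not the exact implementation from any post:

```python
import numpy as np

def metropolis(log_target, x0, n_samples, step=1.0, seed=0):
    """Symmetric random-walk Metropolis sampler (a minimal sketch)."""
    rng = np.random.default_rng(seed)
    x = x0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        # Propose a move from a symmetric Gaussian centered at x
        proposal = x + step * rng.standard_normal()
        # Accept with probability min(1, p(proposal)/p(x)),
        # computed in log space for numerical stability
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
        samples[i] = x
    return samples

# Target: standard normal, via its log density up to an additive constant
chain = metropolis(lambda x: -0.5 * x**2, x0=0.0, n_samples=50_000)
print(f"chain mean: {chain.mean():.2f}, chain variance: {chain.var():.2f}")
```

Although the chain only ever evaluates the target density up to a constant, its states end up distributed according to the target, so averages over the chain approximate the desired integrals.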

As always, each post has a somewhat formal/mathematical introduction, along with an example and simple Matlab implementations of the associated algorithms.

Posted on November 19, 2012, in Algorithms, MCMC, Sampling Methods, Statistics and tagged Gibbs Sampler, Hamiltonian Monte Carlo, Hybrid Monte Carlo, integral approximation, integration, Markov Chain, Markov Chain Monte Carlo, MCMC, Metropolis sampler, Metropolis-Hastings Sampler, Monte Carlo Integration.
