**The Bayesian computational revolutions**

**Gareth Roberts**

The 20th Century was certainly a highly successful period for the formulation of statistical theory and methodology. Moreover, classical statistical methods were by and large taken up much more widely than their Bayesian counterparts. This was undoubtedly caused by the simple fact that classical statistical methods were generally easier to apply. To a large extent, this all changed in the last decade of the century with the emergence of the first wave of Bayesian computational statistics techniques: notably the Gibbs sampler and, more generally, Markov chain Monte Carlo methods, followed slightly later by Sequential Monte Carlo methods. There followed a period of 15 years of unprecedented growth in Bayesian statistics, to the point where Bayesian statistics is now firmly established in virtually all areas of science.
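To give a flavour of the MCMC methods mentioned above, here is a minimal random-walk Metropolis sampler. This is an illustrative sketch, not taken from the talk: the target density (a standard normal, standing in for a posterior) and all parameter values are assumptions chosen for the example.

```python
import math
import random

def metropolis(log_target, x0, n_iter, step=1.0, seed=0):
    """Minimal random-walk Metropolis sampler (one-dimensional).

    log_target: log of an unnormalised target density.
    Returns the full chain of sampled states.
    """
    rng = random.Random(seed)
    x = x0
    lp = log_target(x)
    samples = []
    for _ in range(n_iter):
        # Propose a Gaussian random-walk move
        prop = x + rng.gauss(0.0, step)
        lp_prop = log_target(prop)
        # Accept with probability min(1, target(prop) / target(x))
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Illustrative target: a standard normal density (log scale, up to a constant)
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_iter=20000)
burned = draws[5000:]  # discard burn-in
mean = sum(burned) / len(burned)
var = sum((d - mean) ** 2 for d in burned) / len(burned)
```

After discarding burn-in, the empirical mean and variance of the chain approximate those of the target (here 0 and 1), which is the basic mechanism that made posterior computation routine.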

Now, 23 years on from the Gelfand and Smith paper which kicked off this computational revolution, with the advent of massive data sets and even more complex models, the most challenging scientific problems are again beyond the reach of existing Bayesian methodologies. However, new methodologies are emerging to try to address these challenges. Are we on the verge of a second revolution?