Adaptive independence samplers

Citation data:

Statistics and Computing, ISSN: 0960-3174, Vol. 18, Issue 4, Pages 409–420

Publication Year: 2008
Usage: 124 (111 downloads, 13 abstract views)
Captures: 28 (28 readers)
Citations: 19 (19 citation indexes)
Repository URL:
Authors: Jonathan M. Keith; Dirk P. Kroese; George Y. Sofronov
Publisher: Springer Nature
Keywords: Mathematics; Decision Sciences; Computer Science; adaptive independence samplers; Physical Sciences and Mathematics
Article description:
Markov chain Monte Carlo (MCMC) is an important computational technique for generating samples from non-standard probability distributions. A major challenge in the design of practical MCMC samplers is to achieve efficient convergence and mixing properties. One way to accelerate convergence and mixing is to adapt the proposal distribution in light of previously sampled points, thus increasing the probability of acceptance. In this paper, we propose two new adaptive MCMC algorithms based on the Independent Metropolis-Hastings algorithm. In the first, we adjust the proposal to minimize an estimate of the cross-entropy between the target and proposal distributions, using the experience of pre-runs. This approach provides a general technique for deriving natural adaptive formulae. The second approach uses multiple parallel chains, and involves updating chains individually, then updating a proposal density by fitting a Bayesian model to the population. An important feature of this approach is that adapting the proposal does not change the limiting distributions of the chains. Consequently, the adaptive phase of the sampler can be continued indefinitely. We include results of numerical experiments indicating that the new algorithms compete well with traditional Metropolis-Hastings algorithms. We also demonstrate the method for a realistic problem arising in Comparative Genomics. © 2008 Springer Science+Business Media, LLC.
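The first adaptive scheme described in the abstract can be illustrated with a minimal sketch: an independence Metropolis–Hastings sampler whose Gaussian proposal is refitted between pre-runs. For a Gaussian proposal family, minimizing the estimated cross-entropy to the target reduces to moment matching on the sampled points. The target density, parameter values, and function names below are hypothetical choices for illustration, not taken from the paper.

```python
import numpy as np

def log_target(x):
    # Unnormalized log-density of a N(3, 2^2) target (hypothetical example)
    return -0.5 * ((x - 3.0) / 2.0) ** 2

def independence_mh(n, mu, sigma, x0, rng):
    """Independence Metropolis-Hastings with a N(mu, sigma^2) proposal."""
    x = x0
    out = np.empty(n)
    log_q = lambda y: -0.5 * ((y - mu) / sigma) ** 2 - np.log(sigma)
    for i in range(n):
        y = rng.normal(mu, sigma)
        # Acceptance ratio for an independence sampler:
        # pi(y) q(x) / (pi(x) q(y))
        log_a = log_target(y) + log_q(x) - log_target(x) - log_q(y)
        if np.log(rng.uniform()) < log_a:
            x = y
        out[i] = x
    return out

rng = np.random.default_rng(42)
# Mis-centered but overdispersed initial proposal: for an independence
# sampler the proposal should dominate the target's tails
mu, sigma = 0.0, 3.0
for _ in range(5):  # adaptive pre-runs
    draws = independence_mh(2000, mu, sigma, mu, rng)
    # Cross-entropy minimization over the Gaussian family reduces to
    # moment matching on the points sampled so far
    mu, sigma = draws.mean(), max(draws.std(), 1e-3)

final = independence_mh(20000, mu, sigma, mu, rng)
```

After a few adaptation rounds the proposal roughly matches the target's location and spread, so acceptance rates rise and the final chain mixes well. This sketch uses fixed-length pre-runs followed by a frozen proposal; the paper's second scheme instead adapts continuously across parallel chains without disturbing the limiting distribution.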