A review of multiple try MCMC algorithms for signal processing

Citation data:

Digital Signal Processing, ISSN: 1051-2004, Vol. 75, Pages: 134-152

Publication Year:
2018
Repository URL:
http://arxiv.org/abs/1801.09065
DOI:
10.1016/j.dsp.2018.01.004
Author(s):
Martino, Luca
Publisher(s):
Elsevier BV
Tags:
Computer Science; Engineering; Statistics - Computation; Statistics - Machine Learning
Description:
Many applications in signal processing require the estimation of some parameters of interest given a set of observed data. More specifically, Bayesian inference requires the computation of a posteriori estimators, which are often expressed as complicated multi-dimensional integrals. Unfortunately, analytical expressions for these estimators cannot be found in most real-world applications, and Monte Carlo methods are the only feasible approach. A very powerful class of Monte Carlo techniques is formed by the Markov Chain Monte Carlo (MCMC) algorithms. They generate a Markov chain such that its stationary distribution coincides with the target posterior density. In this work, we perform a thorough review of MCMC methods that use multiple candidates to select the next state of the chain at each iteration. Compared with the classical Metropolis–Hastings method, the use of multiple-try techniques fosters the exploration of the sample space. We present different Multiple Try Metropolis schemes, Ensemble MCMC methods, Particle Metropolis–Hastings algorithms, and the Delayed Rejection Metropolis technique. We highlight the limitations, benefits, connections, and differences among these methods, and compare them through numerical simulations.
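
To make the multiple-candidate idea described above concrete, the sketch below implements one common variant of a Multiple Try Metropolis (MTM) sampler with a symmetric Gaussian random-walk proposal, in which the candidate weights reduce to the (unnormalized) target density. It is a minimal illustration under assumed settings: the bimodal target, the number of tries, the proposal scale, and all other parameter values are illustrative choices, not taken from the paper.

```python
import numpy as np

def target(x):
    """Unnormalized bimodal target density (illustrative assumption)."""
    return 0.5 * np.exp(-0.5 * (x + 3.0) ** 2) + 0.5 * np.exp(-0.5 * (x - 3.0) ** 2)

def mtm_sampler(n_iter=5000, n_tries=5, sigma=2.0, x0=0.0, seed=0):
    """Multiple Try Metropolis with a symmetric Gaussian random-walk proposal."""
    rng = np.random.default_rng(seed)
    x = x0
    chain = np.empty(n_iter)
    for t in range(n_iter):
        # 1) Draw N candidates from the symmetric proposal q(.|x).
        cand = x + sigma * rng.standard_normal(n_tries)
        w_cand = target(cand)                     # weights w_j = pi(y_j) for symmetric q
        # 2) Select one candidate with probability proportional to its weight.
        y = cand[rng.choice(n_tries, p=w_cand / w_cand.sum())]
        # 3) Draw N-1 reference points from q(.|y) and include the current state.
        refs = y + sigma * rng.standard_normal(n_tries - 1)
        w_ref = np.append(target(refs), target(x))
        # 4) Generalized acceptance probability of the MTM scheme.
        alpha = min(1.0, w_cand.sum() / w_ref.sum())
        if rng.uniform() < alpha:
            x = y
        chain[t] = x
    return chain

if __name__ == "__main__":
    samples = mtm_sampler()
    print("posterior mean estimate:", samples.mean())
```

Drawing several candidates per iteration, as in this sketch, is what lets the chain jump more easily between the two modes of the target compared with a single-proposal Metropolis–Hastings step with the same proposal scale.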