EURASIP Journal on Advances in Signal Processing
Adaptive independent sticky MCMC algorithms
Roberto Casarin [1], David Luengo [2], Luca Martino [3], Fabrizio Leisen [4]
[1] Department of Economics, University Ca’ Foscari of Venice; [2] Department of Signal Theory and Communications, Universidad Politécnica de Madrid; [3] Image Processing Lab., University of Valencia; [4] School of Mathematics, Statistics and Actuarial Sciences, University of Kent
Keywords: Bayesian inference; Monte Carlo methods; Adaptive Markov chain Monte Carlo (MCMC); Adaptive rejection Metropolis sampling (ARMS); Gibbs sampling; Metropolis-within-Gibbs
DOI: 10.1186/s13634-017-0524-6
Source: DOAJ
【 Abstract 】
Monte Carlo methods have become essential tools for solving complex Bayesian inference problems in different fields, such as computational statistics, machine learning, and statistical signal processing. In this work, we introduce a novel class of adaptive Monte Carlo methods, called adaptive independent sticky Markov chain Monte Carlo (MCMC) algorithms, to sample efficiently from any bounded target probability density function (pdf). The new class of algorithms employs adaptive non-parametric proposal densities, which become closer and closer to the target as the number of iterations increases. The proposal pdf is built using interpolation procedures based on a set of support points constructed iteratively from previously drawn samples. The algorithm’s efficiency is ensured by a test that supervises the evolution of the set of support points. This extra stage controls both the computational cost and the convergence of the proposal density to the target. Each part of the novel family of algorithms is discussed, and several examples of specific methods are provided. Although the novel algorithms are presented for univariate target densities, we show how they can be easily extended to the multivariate context by embedding them within a Gibbs-type sampler or the hit-and-run algorithm. Ergodicity of the proposed schemes is ensured and discussed. An overview of related works in the literature is also provided, emphasizing that several well-known existing methods (such as the adaptive rejection Metropolis sampling (ARMS) scheme) are encompassed by the new class of algorithms proposed here. Eight numerical examples (including the inference of the hyper-parameters of Gaussian processes, widely used in machine learning for signal processing applications) illustrate the efficiency of sticky schemes, both as stand-alone methods to sample from complicated one-dimensional pdfs and within Gibbs samplers in order to draw from multi-dimensional target distributions.
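To make the mechanism described in the abstract concrete, the following is a minimal Python sketch of an adaptive independent sticky Metropolis-type sampler, not the paper's exact algorithm: the proposal is a piecewise-linear interpolation of a bounded 1D target over a growing set of support points, and a simplified update test (based on the relative proposal/target mismatch at the proposed point) stands in for the paper's update control. The bimodal Gaussian-mixture target, the initial grid, and the specific mismatch rule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    # Bounded, bimodal 1D target (unnormalized): mixture of two Gaussians
    return 0.5 * np.exp(-0.5 * (x + 2.0) ** 2) + 0.5 * np.exp(-0.5 * (x - 2.0) ** 2)

class PiecewiseLinearProposal:
    """Non-parametric proposal: linear interpolation of the target over a
    set of support points, normalized into a pdf on [min(x), max(x)]."""

    def __init__(self, points, f):
        self.f = f
        self.rebuild(np.asarray(points, dtype=float))

    def rebuild(self, points):
        self.x = np.sort(points)
        self.y = self.f(self.x)
        areas = np.diff(self.x) * (self.y[:-1] + self.y[1:]) / 2.0  # trapezoids
        self.Z = areas.sum()
        self.cdf = np.concatenate(([0.0], np.cumsum(areas))) / self.Z

    def unnorm(self, x):
        return np.interp(x, self.x, self.y)   # unnormalized interpolant

    def pdf(self, x):
        return self.unnorm(x) / self.Z

    def sample(self):
        # Inverse-CDF sampling: pick a segment, then invert the linear density
        u = rng.random()
        i = min(max(np.searchsorted(self.cdf, u) - 1, 0), len(self.x) - 2)
        x0, y0 = self.x[i], self.y[i]
        rem = (u - self.cdf[i]) * self.Z  # mass still to cover inside segment i
        s = (self.y[i + 1] - y0) / (self.x[i + 1] - x0)  # density slope
        if abs(s) < 1e-12:
            return x0 + rem / y0
        # Solve y0*t + s*t^2/2 = rem for the offset t within the segment
        return x0 + (-y0 + np.sqrt(y0 ** 2 + 2.0 * s * rem)) / s

def sticky_imh(f, support, n_iter=5000):
    """Independent Metropolis-Hastings with a sticky adaptive proposal."""
    q = PiecewiseLinearProposal(support, f)
    x = q.sample()
    chain = []
    for _ in range(n_iter):
        xp = q.sample()
        alpha = min(1.0, (f(xp) * q.pdf(x)) / (f(x) * q.pdf(xp)))
        if rng.random() < alpha:
            x = xp
        chain.append(x)
        # Simplified update test: add xp to the support set with a probability
        # that grows with the proposal/target mismatch at xp, so the proposal
        # "sticks" ever closer to the target where it currently fits poorly.
        w, p = q.unnorm(xp), f(xp)
        eta = 1.0 - min(w, p) / max(w, p)
        if rng.random() < eta:
            q.rebuild(np.append(q.x, xp))
    return np.array(chain), q

chain, q = sticky_imh(target, np.linspace(-6.0, 6.0, 8))
```

Note that at existing support points the interpolant matches the target exactly, so the addition probability `eta` vanishes there; points are only inserted where the proposal still deviates from the target, which is the essence of the update-test control described above.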
License: Unknown