Journal article details
EURASIP Journal on Advances in Signal Processing
Adaptive independent sticky MCMC algorithms
Luca Martino
Keywords: Bayesian inference; Monte Carlo methods; Adaptive Markov chain Monte Carlo (MCMC); Adaptive rejection Metropolis sampling (ARMS); Gibbs sampling; Metropolis-within-Gibbs; Hit-and-run algorithm
DOI: 10.1186/s13634-017-0524-6
Subject classification: Computer Science (General)
Source: SpringerOpen
【 Abstract 】

Monte Carlo methods have become essential tools for solving complex Bayesian inference problems in fields such as computational statistics, machine learning, and statistical signal processing. In this work, we introduce a novel class of adaptive Monte Carlo methods, called adaptive independent sticky Markov chain Monte Carlo (MCMC) algorithms, to sample efficiently from any bounded target probability density function (pdf). The new class of algorithms employs adaptive non-parametric proposal densities that become closer and closer to the target as the number of iterations increases. The proposal pdf is built using interpolation procedures based on a set of support points, constructed iteratively from previously drawn samples. The algorithm's efficiency is ensured by a test that supervises the evolution of the set of support points; this extra stage controls both the computational cost and the convergence of the proposal density to the target. Each component of the novel family of algorithms is discussed, and several examples of specific methods are provided. Although the novel algorithms are presented for univariate target densities, we show how they can easily be extended to the multivariate context by embedding them within a Gibbs-type sampler or the hit-and-run algorithm. Ergodicity is ensured and discussed. An overview of related work in the literature is also provided, emphasizing that several well-known existing methods (such as the adaptive rejection Metropolis sampling (ARMS) scheme) are encompassed by the new class of algorithms proposed here. Eight numerical examples (including inference of the hyperparameters of Gaussian processes, widely used in machine learning for signal processing applications) illustrate the efficiency of sticky schemes, both as stand-alone methods for sampling from complicated one-dimensional pdfs and within Gibbs samplers for drawing from multidimensional target distributions.
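The mechanism the abstract describes — an independent Metropolis-Hastings sampler whose proposal is interpolated from a growing set of support points, with a test deciding when to add a rejected-or-proposed point to that set — can be illustrated with a minimal sketch. Everything below is an illustrative assumption, not the paper's exact scheme: the piecewise-constant (rather than the paper's interpolation-based) proposal, the 1.02 update threshold, and the function name `sticky_imh` are all invented for this example.

```python
import numpy as np

def sticky_imh(log_target, bounds, n_iter, init_points=8, seed=0):
    """Simplified sketch of an adaptive independent "sticky" MH sampler.

    The proposal is piecewise constant between support points (using the
    larger endpoint value on each interval) and "sticks" to the target:
    whenever the proposal badly underestimates the target at a proposed
    point, that point is added to the support set, refining the proposal.
    """
    rng = np.random.default_rng(seed)
    a, b = bounds
    pts = list(np.linspace(a, b, init_points))  # initial support points

    def build_proposal():
        edges = np.array(sorted(pts))
        f = np.exp(np.array([log_target(e) for e in edges]))
        heights = np.maximum(f[:-1], f[1:])      # per-interval height
        weights = heights * np.diff(edges)       # unnormalized interval mass
        return edges, heights, weights / weights.sum()

    def q_unnorm(x, edges, heights):
        # Unnormalized proposal density at x (normalization cancels in MH).
        k = np.searchsorted(edges, x, side="right") - 1
        return heights[min(max(k, 0), len(heights) - 1)]

    x = 0.5 * (a + b)
    chain = []
    for _ in range(n_iter):
        edges, heights, probs = build_proposal()
        # Draw a proposal: pick an interval, then sample uniformly in it.
        k = rng.choice(len(probs), p=probs)
        z = rng.uniform(edges[k], edges[k + 1])
        # Independence MH acceptance ratio: pi(z) q(x) / (pi(x) q(z)).
        alpha = min(1.0, np.exp(log_target(z) - log_target(x))
                         * q_unnorm(x, edges, heights) / heights[k])
        if rng.uniform() < alpha:
            x = z
        # Update test: where the proposal underestimates the target,
        # add the point so the proposal converges toward the target.
        if np.exp(log_target(z)) > 1.02 * heights[k]:
            pts.append(z)
        chain.append(x)
    return np.array(chain), len(pts)
```

For example, running `sticky_imh(lambda x: -0.5 * x * x, (-5.0, 5.0), 5000)` targets a standard normal on [-5, 5]; the support set grows mostly near the mode, where the initial coarse proposal fits worst, which mirrors the abstract's point that the extra test concentrates refinement where it improves the proposal-target match.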

【 License 】

Unknown   

【 Preview 】
Attachments
File: RO201902198657246ZK.pdf (1701 KB, PDF)