Journal Article Details
Entropy
Approximated Information Analysis in Bayesian Inference
Jung In Seo 1,  Yongku Kim 2
[1] Department of Statistics, Yeungnam University, Gyeongsan 712-749, Korea
[2] Department of Statistics, Kyungpook National University, Daegu 702-701, Korea
Keywords: Bayesian sensitivity;    Gibbs sampler;    Kullback–Leibler divergence;    Laplace approximation
DOI: 10.3390/e17031441
Source: MDPI
【 Abstract 】

In models with nuisance parameters, Bayesian procedures based on Markov chain Monte Carlo (MCMC) methods have been developed to approximate the posterior distribution of the parameter of interest. Because these procedures involve computationally burdensome MCMC runs, the quality of the approximation and the convergence of the chains are important issues. In this paper, we explore Gibbs sensitivity by using an alternative to the full conditional distribution of the nuisance parameter. The approximate sensitivity of the posterior distribution of interest is studied in terms of an information measure, including the Kullback–Leibler divergence. As an illustration, we then apply these results to simple spatial model settings.
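The following minimal sketch may make the mechanics concrete; it is an illustration under assumed settings (a normal model with a flat prior on the mean and a Gamma prior on the precision), not code from the paper. It runs a two-block Gibbs sampler twice, once drawing the nuisance precision from its exact Gamma full conditional and once from a moment-matched normal stand-in that plays the role of an analytic approximation, and then scores the resulting shift in the marginal posterior of the mean with a Kullback–Leibler divergence between normal fits to the two sample sets.

```python
# Hypothetical sketch of the idea in the abstract: Gibbs sampling with an
# exact vs. approximated full conditional for a nuisance parameter, with
# the sensitivity of the posterior of interest measured by KL divergence.
# Model (assumed for illustration): y_i ~ N(mu, 1/tau), flat prior on mu,
# Gamma(a, b) prior on the precision tau (the nuisance parameter).
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(loc=2.0, scale=1.5, size=50)        # synthetic data
n, ybar = y.size, y.mean()
a, b = 1.0, 1.0                                     # Gamma(a, b) prior on tau

def gibbs(n_iter=5000, approx=False):
    """Two-block Gibbs sampler; returns post-burn-in draws of mu."""
    tau = 1.0
    mus = np.empty(n_iter)
    for t in range(n_iter):
        # Full conditional of mu under a flat prior: N(ybar, 1/(n*tau)).
        mu = rng.normal(ybar, 1.0 / np.sqrt(n * tau))
        shape = a + n / 2.0
        rate = b + 0.5 * np.sum((y - mu) ** 2)
        if approx:
            # Stand-in for the exact conditional: a moment-matched normal
            # (mean shape/rate, sd sqrt(shape)/rate), truncated away from
            # zero so the precision stays valid.
            tau = max(rng.normal(shape / rate, np.sqrt(shape) / rate), 1e-6)
        else:
            # Exact full conditional: Gamma(shape, rate).
            tau = rng.gamma(shape, 1.0 / rate)
        mus[t] = mu
    return mus[n_iter // 10:]                       # drop 10% as burn-in

def gauss_kl(x, z):
    """KL(N_x || N_z) between normal distributions fitted to two samples."""
    m1, s1 = x.mean(), x.std()
    m2, s2 = z.mean(), z.std()
    return np.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5

mu_exact = gibbs(approx=False)
mu_approx = gibbs(approx=True)
print("KL(exact || approx) for mu:", gauss_kl(mu_exact, mu_approx))
```

A closed-form Gaussian KL is used here only because both marginals are near-normal in this toy setting; the information measures discussed in the paper apply more generally.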

【 License 】

CC BY   
© 2015 by the authors; licensee MDPI, Basel, Switzerland

【 Preview 】
Attachment List
Files                      Size     Format   View
RO202003190014903ZK.pdf    230KB    PDF      download
Document Metrics
Downloads: 12    Views: 11