Journal Article Details
Entropy
Asymptotically Constant-Risk Predictive Densities When the Distributions of Data and Target Variables Are Different
Keisuke Yano 1
[1] Department of Mathematical Informatics, Graduate School of Information Science and Technology, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan
Keywords: Bayesian prediction; Fisher information; Kullback–Leibler divergence; minimax; predictive metric; subminimax estimator
DOI: 10.3390/e16063026
Source: MDPI
【 Abstract 】

We investigate the asymptotic construction of constant-risk Bayesian predictive densities under the Kullback–Leibler risk when the distributions of the data and the target variables are different and share a common unknown parameter. It is known that the Kullback–Leibler risk is asymptotically equal to the trace of the product of two matrices: the inverse of the Fisher information matrix for the data and the Fisher information matrix for the target variables. We assume that this trace has a unique maximum point with respect to the parameter. We construct asymptotically constant-risk Bayesian predictive densities using a prior that depends on the sample size. Further, we apply the theory to the subminimax estimator problem and to prediction based on a binary regression model.
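
As a rough sketch of the trace expression described in the abstract (the normalizing constant 1/(2N) and the order of the remainder are assumptions made here for illustration, not quoted from the paper), the leading term of the Kullback–Leibler risk of a Bayesian predictive density based on N observations can be written as

\[
  R_N(\theta) \;\approx\; \frac{1}{2N}\,\operatorname{tr}\!\left\{ I_x(\theta)^{-1}\,\tilde{I}_y(\theta) \right\} \;+\; o\!\left(N^{-1}\right),
\]

where \(I_x(\theta)\) denotes the Fisher information matrix of the data distribution and \(\tilde{I}_y(\theta)\) that of the target distribution. A constant-risk construction then seeks a sample-size-dependent prior for which this leading term does not vary with \(\theta\), under the paper's assumption that the trace has a unique maximum point in \(\theta\).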

【 License 】

CC BY
© 2014 by the authors; licensee MDPI, Basel, Switzerland.

【 Preview 】
Attachments
File                        Size    Format
RO202003190025580ZK.pdf     292 KB  PDF
Article metrics
Downloads: 11    Views: 25