Entropy
Asymptotically Constant-Risk Predictive Densities When the Distributions of Data and Target Variables Are Different
Keisuke Yano [1]
[1] Department of Mathematical Informatics, Graduate School of Information Science and Technology, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan
Keywords: Bayesian prediction; Fisher information; Kullback–Leibler divergence; minimax; predictive metric; subminimax estimator
DOI: 10.3390/e16063026
Source: MDPI
【 Abstract 】
We investigate the asymptotic construction of constant-risk Bayesian predictive densities under the Kullback–Leibler risk when the distributions of the data and the target variables are different but share a common unknown parameter. It is known that the Kullback–Leibler risk is asymptotically equal to the trace of the product of two matrices: the inverse of the Fisher information matrix for the data and the Fisher information matrix for the target variables. Since this trace generally varies with the parameter, the risk under a fixed prior is not constant; we assume that the trace has a unique maximum point with respect to the parameter and construct asymptotically constant-risk Bayesian predictive densities using a prior that depends on the sample size. Further, we apply the theory to the subminimax estimator problem and to prediction based on a binary regression model.
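For concreteness, the asymptotic identity referenced in the abstract can be sketched in generic notation (a sketch, not the paper's exact statement): write $N$ for the sample size, $I_X(\theta)$ and $I_Y(\theta)$ for the Fisher information matrices of the data and target models, and $\hat q_\pi(\cdot \mid x^N)$ for the Bayesian predictive density under a prior $\pi$; the $1/(2N)$ scaling below is the usual one from higher-order asymptotics and is an assumption here, not a quotation from the paper:

$$
\mathrm{E}_\theta\!\left[ D_{\mathrm{KL}}\!\left( q(\cdot \mid \theta) \,\middle\|\, \hat q_\pi(\cdot \mid x^N) \right) \right]
= \frac{1}{2N}\,\operatorname{tr}\!\left\{ I_X(\theta)^{-1} I_Y(\theta) \right\} + o\!\left(N^{-1}\right).
$$

Because the leading term depends on $\theta$, the risk under a fixed prior is not flat; the construction described in the abstract tunes the prior with $N$ so that the risk settles at the level given by the trace's unique maximum.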
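To make the leading-order quantity concrete, the following minimal Python sketch evaluates $\operatorname{tr}\{I_X(\beta)^{-1} I_Y(\beta)\}$ for a logistic regression model in which the data and the target variables are observed at different covariate values. This illustrates the trace term only, not the paper's construction; the function names (logistic_fisher_info, risk_trace) and the example designs X_data and X_target are hypothetical.

```python
import numpy as np

def logistic_fisher_info(X, beta):
    # Fisher information of logistic regression: X^T W X,
    # where W = diag(p * (1 - p)) and p = sigmoid(X @ beta).
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    w = p * (1.0 - p)
    return X.T @ (w[:, None] * X)

def risk_trace(X_data, X_target, beta):
    # tr{ I_X(beta)^{-1} I_Y(beta) }: the parameter-dependent factor in the
    # leading term of the Kullback-Leibler risk described in the abstract.
    I_X = logistic_fisher_info(X_data, beta)
    I_Y = logistic_fisher_info(X_target, beta)
    return np.trace(np.linalg.solve(I_X, I_Y))

# Data and target variables observed at different covariate values:
rng = np.random.default_rng(0)
X_data = np.column_stack([np.ones(50), rng.normal(0.0, 1.0, size=50)])
X_target = np.column_stack([np.ones(20), rng.normal(2.0, 1.0, size=20)])

# The trace varies with beta; the paper's assumption is that it has a
# unique maximum point with respect to the parameter.
for b1 in (0.0, 0.5, 1.0):
    print(b1, risk_trace(X_data, X_target, np.array([0.0, b1])))
```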
【 License 】
CC BY
© 2014 by the authors; licensee MDPI, Basel, Switzerland
【 Preview 】
Files | Size | Format | View
---|---|---|---
RO202003190025580ZK.pdf | 292KB | PDF | download