Entropy
Duality of Maximum Entropy and Minimum Divergence
Shinto Eguchi 1, Osamu Komori 2
[1] The Institute of Statistical Mathematics and The Graduate University for Advanced Studies, Tachikawa, Tokyo 190-8562, Japan
Keywords:
DOI: 10.3390/e16073552
Source: MDPI
【 Abstract 】
We discuss a special class of generalized divergence measures defined by generator functions. Any divergence measure in the class separates into the difference between a cross entropy and a diagonal entropy. The diagonal entropy measure in the class is associated with a model of maximum entropy distributions; the divergence measure leads to statistical estimation via its minimization over an arbitrarily given statistical model. The dualistic relationship between the maximum entropy model and the minimum divergence estimation is explored in the framework of information geometry. The model of maximum entropy distributions is characterized as totally geodesic with respect to the linear connection associated with the divergence. A natural extension of the classical theory of the maximum likelihood method under the maximum entropy model, in terms of the Boltzmann-Gibbs-Shannon entropy, is given. We discuss the duality in detail for Tsallis entropy as a typical example.
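The cross/diagonal decomposition stated in the abstract can be illustrated with a generator-function construction of the U-divergence type familiar from the authors' related work; the LaTeX sketch below is an assumed illustration of that form, not a quotation from the paper (the generator $U$, its link $\xi=(U')^{-1}$, and the symbols $C_U$, $H_U$, $D_U$ are notation introduced here).

```latex
% Minimal sketch (assumed U-divergence form; U, xi, C_U, H_U, D_U are
% illustrative notation, not quoted from the paper).
\documentclass{article}
\usepackage{amsmath}
\begin{document}

Let $U$ be a strictly convex, increasing generator with link
$\xi = (U')^{-1}$. For densities $p, q$ define the cross entropy and the
diagonal entropy
\begin{align}
  C_U(p, q) &= \int \bigl\{\, U(\xi(q(x))) - p(x)\,\xi(q(x)) \,\bigr\}\,\mathrm{d}x, \\
  H_U(p)    &= C_U(p, p).
\end{align}
The divergence is the difference of the two,
\begin{equation}
  D_U(p, q) = C_U(p, q) - H_U(p) \ge 0,
\end{equation}
which is nonnegative by convexity of $U$, with equality iff $p = q$.
For the exponential generator $U(t) = \exp(t)$ (so $\xi = \log$), $D_U$
reduces to the Kullback--Leibler divergence and $H_U$ to the
Boltzmann--Gibbs--Shannon entropy up to an additive constant, which is
the classical case the abstract refers to.

\end{document}
```

Under this sketch, minimizing $D_U(p, q_\theta)$ in $\theta$ for a fixed data density $p$ is equivalent to minimizing the cross entropy $C_U(p, q_\theta)$, since the diagonal term $H_U(p)$ does not depend on $\theta$; this is the sense in which the divergence leads to statistical estimation via minimization over a given model.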
【 License 】
CC BY
© 2014 by the authors; licensee MDPI, Basel, Switzerland
【 Preview 】
Files | Size | Format
---|---|---
RO202003190024257ZK.pdf | 286 KB | PDF