Journal Article Details
Entropy
Objective Bayesianism and the Maximum Entropy Principle
Jürgen Landes1 
[1] Department of Philosophy, School of European Culture and Languages, University of Kent, Canterbury CT2 7NF, UK
Keywords: objective Bayesianism; g-entropy; generalised entropy; Bayesian conditionalisation; scoring rule; maximum entropy; maxent; minimax
DOI: 10.3390/e15093528
Source: mdpi
Abstract

Objective Bayesian epistemology invokes three norms: the strengths of our beliefs should be probabilities; they should be calibrated to our evidence of physical probabilities; and they should otherwise equivocate sufficiently between the basic propositions that we can express. The three norms are sometimes explicated by appealing to the maximum entropy principle, which says that a belief function should be a probability function, from all those that are calibrated to evidence, that has maximum entropy. However, the three norms of objective Bayesianism are usually justified in different ways. In this paper, we show that the three norms can all be subsumed under a single justification in terms of minimising worst-case expected loss. This, in turn, is equivalent to maximising a generalised notion of entropy. We suggest that requiring language invariance, in addition to minimising worst-case expected loss, motivates maximisation of standard entropy as opposed to maximisation of other instances of generalised entropy. Our argument also provides a qualified justification for updating degrees of belief by Bayesian conditionalisation. However, conditional probabilities play a less central part in the objective Bayesian account than they do under the subjective view of Bayesianism, leading to a reduced role for Bayes’ Theorem.
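The maximum entropy principle described in the abstract can be illustrated with a small, self-contained sketch (a hypothetical example, not taken from the paper): among all probability functions over the outcomes of a six-sided die that are calibrated to an evidential constraint on the expected value, the maxent solution is a Gibbs distribution, and with no informative constraint it equivocates fully, i.e. it is uniform. The bisection search on the Lagrange multiplier is one standard way to compute it.

```python
import math

def maxent_die(target_mean, faces=6, tol=1e-10):
    """Maximum-entropy distribution over die faces 1..faces subject to
    the calibration constraint E[X] = target_mean.

    The solution has the Gibbs form p_k proportional to exp(beta * k);
    we locate beta by bisection on the resulting mean.
    """
    ks = range(1, faces + 1)

    def mean_for(beta):
        w = [math.exp(beta * k) for k in ks]
        z = sum(w)
        return sum(k * wk for k, wk in zip(ks, w)) / z

    lo, hi = -50.0, 50.0  # mean_for is increasing in beta, so bisection works
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    beta = (lo + hi) / 2
    w = [math.exp(beta * k) for k in ks]
    z = sum(w)
    return [wk / z for wk in w]

# With the uninformative constraint E[X] = 3.5, maxent equivocates:
p = maxent_die(3.5)
print([round(pk, 3) for pk in p])  # each probability is about 1/6
```

A constraint such as E[X] = 4.5 instead yields a distribution skewed toward higher faces, showing how calibration to evidence and equivocation interact: the belief function departs from uniformity only as far as the evidence requires.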

License

CC BY   
© 2013 by the authors; licensee MDPI, Basel, Switzerland.
