Journal Article Details
Entropy
Intersection Information Based on Common Randomness
Virgil Griffith [2], Edwin K. P. Chong [3], Ryan G. James [1], Christopher J. Ellison [4]
[1] Department of Computer Science, University of Colorado, Boulder, CO 80309, USA
[2] Computation and Neural Systems, Caltech, Pasadena, CA 91125, USA
[3] Dept. of Electrical & Computer Engineering, Colorado State University, Fort Collins, CO 80523, USA
[4] Center for Complexity and Collective Computation, Wisconsin Institute for Discovery, University of Wisconsin-Madison, Madison, WI 53715, USA
Keywords: intersection information; partial information decomposition; lattice; Gács-Körner; synergy; redundant information
DOI: 10.3390/e16041985
Source: MDPI
【 Abstract 】

The introduction of the partial information decomposition generated a flurry of proposals for defining an intersection information that quantifies how much of “the same information” two or more random variables specify about a target random variable. As of yet, none is wholly satisfactory. A palatable measure of intersection information would provide a principled way to quantify slippery concepts, such as synergy. Here, we introduce an intersection information measure based on the Gács-Körner common random variable that is the first to satisfy the coveted target monotonicity property. Our measure is imperfect, too, and we suggest directions for improvement.
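The Gács-Körner common random variable named in the abstract has a concrete combinatorial characterization: it labels the connected components of the bipartite graph whose edges are the support of the joint distribution p(x, y), and its entropy is the Gács-Körner common information. As an illustrative sketch (not the paper's full intersection information measure, which extends this construction to multiple sources and a target), the function below computes that entropy for a finite joint distribution given as a dictionary; the function name and input format are assumptions for this example.

```python
from collections import defaultdict
from math import log2

def gacs_korner_common_information(pxy):
    """Entropy of the Gács-Körner common random variable of X and Y.

    pxy: dict mapping (x, y) -> probability. The common random variable
    labels the connected components of the bipartite graph whose edges
    are the pairs (x, y) with p(x, y) > 0.
    """
    # Union-find over the symbols of X and Y (tagged so they stay distinct).
    parent = {}

    def find(a):
        parent.setdefault(a, a)
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    for (x, y), p in pxy.items():
        if p > 0:
            parent[find(('x', x))] = find(('y', y))  # union

    # Probability mass of each connected component.
    q = defaultdict(float)
    for (x, y), p in pxy.items():
        if p > 0:
            q[find(('x', x))] += p

    return -sum(p * log2(p) for p in q.values() if p > 0)

# Example: X = (A, C) and Y = (B, C) share one fair bit C, while A and B
# are independent fair bits. The common random variable is C, so the
# Gács-Körner common information is 1 bit.
pxy = {}
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            pxy[((a, c), (b, c))] = 1 / 8
print(gacs_korner_common_information(pxy))  # → 1.0
```

Note that if the shared coordinate C were only *correlated* across X and Y rather than literally identical, the support graph would become connected and the common information would drop to zero; this brittleness is part of what makes defining a satisfactory intersection information difficult.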

【 License 】

CC BY
© 2014 by the authors; licensee MDPI, Basel, Switzerland.

【 Preview 】
Attachment List
File                      Size   Format  View
RO202003190027268ZK.pdf   516KB  PDF     download
Document Metrics
Downloads: 12    Views: 23