Journal Article Details
Entropy
Intersection Information Based on Common Randomness
Christopher J. Ellison [1], James P. Crutchfield [2], Virgil Griffith [3], Ryan G. James [4], Edwin K. P. Chong [5]
[1] Center for Complexity and Collective Computation, Wisconsin Institute for Discovery, University of Wisconsin-Madison, Madison, WI 53715, USA
[2] Complexity Sciences Center and Physics Dept, University of California Davis, Davis, CA 95616, USA
[3] Computation and Neural Systems, Caltech, Pasadena, CA 91125, USA
[4] Department of Computer Science, University of Colorado, Boulder, CO 80309, USA
[5] Dept. of Electrical & Computer Engineering, Colorado State University, Fort Collins, CO 80523, USA
Keywords: intersection information; partial information decomposition; lattice; Gács–Körner; synergy; redundant information
DOI: 10.3390/e16041985
Source: DOAJ
【 Abstract 】

The introduction of the partial information decomposition generated a flurry of proposals for defining an intersection information that quantifies how much of “the same information” two or more random variables specify about a target random variable. As of yet, none is wholly satisfactory. A palatable measure of intersection information would provide a principled way to quantify slippery concepts, such as synergy. Here, we introduce an intersection information measure based on the Gács–Körner common random variable that is the first to satisfy the coveted target monotonicity property. Our measure is imperfect, too, and we suggest directions for improvement.
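For finite alphabets, the Gács–Körner common random variable underlying the abstract's measure can be illustrated concretely: it labels the connected components of the bipartite graph that links x1 to x2 whenever p(x1, x2) > 0, and the Gács–Körner common information is the entropy of that label. The sketch below is only an illustration of this standard construction, not the paper's implementation; the function name and pmf format are chosen here for convenience.

```python
from collections import defaultdict
from math import log2

def gk_common_information(pmf):
    """Gács–Körner common information of a joint pmf {(x1, x2): p}.

    The common random variable is the finest variable computable from
    X1 alone and from X2 alone: label the connected components of the
    bipartite graph joining x1 -- x2 whenever p(x1, x2) > 0, then take
    the entropy of the component label.
    """
    parent = {}  # union-find over symbols, tagged by variable to avoid clashes

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    def union(a, b):
        parent.setdefault(a, a)
        parent.setdefault(b, b)
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    # Link x1 and x2 whenever they co-occur with positive probability.
    for (x1, x2), p in pmf.items():
        if p > 0:
            union(("X1", x1), ("X2", x2))

    # The component of (x1, x2) is the value of the common variable;
    # accumulate its distribution and return its Shannon entropy.
    comp_prob = defaultdict(float)
    for (x1, x2), p in pmf.items():
        if p > 0:
            comp_prob[find(("X1", x1))] += p
    return -sum(p * log2(p) for p in comp_prob.values())
```

For a copied fair bit, X1 = X2, the common variable is the bit itself and the measure is 1 bit; for two independent fair bits, every pair co-occurs, the graph collapses to one component, and the measure is 0.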

【 License 】

Unknown   
