†" /> 期刊论文

Journal Article Details
Entropy
On Shannon’s Formula and Hartley’s Rule: Beyond the Mathematical Coincidence
Olivier Rioul1 
[1] Télécom ParisTech, Institut Mines-Télécom, CNRS LTCI, 46 Rue Barrault, 75013, Paris, France
Keywords: Shannon’s formula; Hartley’s rule; additive noise channel; differential entropy; channel capacity; signal-to-noise ratio; pulse-amplitude modulation (PAM); additive white Gaussian noise (AWGN) channel; uniform noise channel; characteristic function; uniform B-spline function; uniform sum distribution; central limit theorem
DOI: 10.3390/e16094892
Source: MDPI
【 Abstract 】

In the information theory community, the following “historical” statements are generally well accepted: (1) Hartley did put forth his rule twenty years before Shannon; (2) Shannon’s formula, as a fundamental tradeoff between transmission rate, bandwidth, and signal-to-noise ratio, came as an unexpected result in 1948; (3) Hartley’s rule is inexact, while Shannon’s formula is characteristic of the additive white Gaussian noise channel; (4) Hartley’s rule is an imprecise relation that is not an appropriate formula for the capacity of a communication channel. We show that all four of these statements are somewhat wrong. A careful calculation shows that “Hartley’s rule” in fact coincides with Shannon’s formula. We explain this mathematical coincidence by deriving the necessary and sufficient conditions on an additive noise channel for its capacity to be given by Shannon’s formula, and we construct a sequence of such channels that links the uniform (Hartley) and Gaussian (Shannon) channels.
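
As a quick check of the “mathematical coincidence” mentioned in the abstract, here is a minimal LaTeX sketch under the usual assumptions (which are not spelled out on this page): an M-level pulse-amplitude-modulated input of peak amplitude A with levels spaced 2Δ apart, and additive noise uniformly distributed on [−Δ, +Δ], so that the levels remain distinguishable without error. The symbols A, Δ, M, P and N are introduced here for illustration only.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
\begin{align*}
  % Hartley's rule: M = 1 + A/\Delta levels spaced 2\Delta apart remain
  % distinguishable when the noise amplitude is bounded by \Delta.
  C' &= \log_2 M = \log_2\!\left(1 + \frac{A}{\Delta}\right). \\
  % Average powers in this setup (X uniform over the M levels,
  % Z uniform on [-\Delta, +\Delta]):
  P &= \frac{\Delta^2 (M^2 - 1)}{3}, \qquad
  N = \frac{\Delta^2}{3}, \qquad
  \frac{P}{N} = M^2 - 1 = \left(1 + \frac{A}{\Delta}\right)^{2} - 1. \\
  % Substituting this signal-to-noise ratio into Shannon's formula
  % recovers Hartley's rule:
  C &= \tfrac{1}{2}\log_2\!\left(1 + \frac{P}{N}\right)
     = \log_2\!\left(1 + \frac{A}{\Delta}\right) = C'.
\end{align*}
\end{document}

The paper itself goes further than this numerical check: it characterizes exactly which additive noise channels have a capacity given by Shannon’s formula and connects the uniform and Gaussian cases by a sequence of intermediate channels.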

【 License 】

CC BY   
© 2014 by the authors; licensee MDPI, Basel, Switzerland

【 Preview 】
Attachment List
File: RO202003190021694ZK.pdf (415 KB, PDF)