Journal Article Details
Entropy
Minimum Mutual Information and Non-Gaussianity Through the Maximum Entropy Method: Theory and Properties
Carlos A. L. Pires
Keywords: mutual information;    non-Gaussianity;    maximum entropy distributions;    non-Gaussian noise
DOI: 10.3390/e14061103
Source: MDPI
【 Abstract 】

The application of the Maximum Entropy (ME) principle leads to a minimum of the Mutual Information (MI), I(X,Y), between the random variables X and Y that is compatible with prescribed joint expectations and given ME marginal distributions. A sequence of sets of joint constraints yields a hierarchy of lower MI bounds that increasingly approach the true MI. In particular, with standard Gaussian marginal distributions, this leads to a decomposition of the MI into two positive terms: the Gaussian MI (Ig), which depends on the Gaussian correlation, i.e., the correlation between 'Gaussianized' variables, and a non-Gaussian MI (Ing), which coincides with the joint negentropy and depends on nonlinear correlations. Joint moments of a prescribed total order p are bounded within a compact set defined by Schwarz-like inequalities; Ing grows from zero on the 'Gaussian manifold', where moments match those of a Gaussian distribution, towards infinity at the set's boundary, where a deterministic relationship between the variables holds. Sources of joint non-Gaussianity are systematized by estimating Ing between the input and output of a nonlinear synthetic channel contaminated by multiplicative and non-Gaussian additive noise, over the full range of signal-to-noise ratio (SNR) variances. We have studied the effect of varying the SNR on Ig and Ing under several signal/noise scenarios.
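To make the decomposition concrete, the Python sketch below estimates the Gaussian term Ig from samples: each variable is 'Gaussianized' by mapping its empirical ranks through the standard normal quantile function, and Ig = -1/2 ln(1 - c_g^2) is computed from the correlation c_g of the Gaussianized pair. This is a minimal illustration, not the paper's code; the function names (gaussianize, gaussian_mi) and the example channel y = x^2 + noise are assumptions chosen to mimic a nonlinear channel of the kind the abstract describes.

import numpy as np
from scipy.stats import norm, rankdata

def gaussianize(x):
    # Map a sample to standard-normal scores through its empirical ranks.
    u = rankdata(x) / (len(x) + 1)   # uniform scores strictly inside (0, 1)
    return norm.ppf(u)               # probit transform to N(0, 1) marginals

def gaussian_mi(x, y):
    # Gaussian MI term: Ig = -0.5 * ln(1 - c_g**2), with c_g the
    # correlation between the Gaussianized variables.
    c_g = np.corrcoef(gaussianize(x), gaussianize(y))[0, 1]
    return -0.5 * np.log(1.0 - c_g**2)

# Hypothetical nonlinear channel: the output depends on the input only
# through a quadratic nonlinearity plus additive Gaussian noise.
rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
y = x**2 + 0.5 * rng.normal(size=10_000)
print(f"Ig = {gaussian_mi(x, y):.4f} nats")

In this example the Gaussianized correlation is close to zero by symmetry, so Ig is near zero even though X and Y are strongly dependent; the dependence is carried almost entirely by the non-Gaussian term Ing, which the paper identifies with the joint negentropy.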

【 License 】

CC BY   
© 2012 by the authors; licensee MDPI, Basel, Switzerland.

【 Preview 】
Attachment list
File: RO202003190043482ZK.pdf (663 KB, PDF)
Article metrics
Downloads: 11    Views: 11