Journal Article Details
Entropy
Nonparametric Estimation of Information-Based Measures of Statistical Dispersion
Lubomir Kostal [1]
[1] Institute of Physiology, Academy of Sciences of the Czech Republic, Videnska 1083, 142 20 Prague, Czech Republic
Keywords: statistical dispersion; entropy; Fisher information; nonparametric density estimation
DOI: 10.3390/e14071221
Source: MDPI
【 Abstract 】

We address the problem of non-parametric estimation of the recently proposed measures of statistical dispersion of positive continuous random variables. The measures are based on the concepts of differential entropy and Fisher information and describe the “spread” or “variability” of the random variable from a different point of view than the ubiquitously used concept of standard deviation. The maximum penalized likelihood estimation of the probability density function proposed by Good and Gaskins is applied and a complete methodology of how to estimate the dispersion measures with a single algorithm is presented. We illustrate the approach on three standard statistical models describing neuronal activity.
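To make the quantities in the abstract concrete, the following is a minimal sketch of estimating differential entropy and Fisher information from samples of a positive random variable. It uses a plain Gaussian kernel density estimate (`scipy.stats.gaussian_kde`) as a simple stand-in; the paper itself uses the Good–Gaskins maximum penalized likelihood estimator, which is not sketched here. The exponential test distribution and all numerical choices (grid, bandwidth defaults) are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.stats import gaussian_kde
from scipy.integrate import trapezoid

# Illustrative data: a positive random variable (Exp(1) has
# differential entropy h = 1 nat, so the estimate should be near 1).
rng = np.random.default_rng(0)
samples = rng.exponential(scale=1.0, size=5000)

# Simple KDE stand-in for the paper's penalized likelihood estimator.
kde = gaussian_kde(samples)
x = np.linspace(1e-3, samples.max() * 1.2, 2000)
f = kde(x)

# Differential entropy: h = -integral of f(x) log f(x) dx.
h = -trapezoid(f * np.log(f + 1e-300), x)

# Fisher information (shift form): J = integral of f'(x)^2 / f(x) dx,
# with the density derivative taken numerically on the grid.
fprime = np.gradient(f, x)
J = trapezoid(fprime**2 / (f + 1e-300), x)

print(f"entropy estimate h = {h:.3f} nats, Fisher information J = {J:.3f}")
```

Both integrals are the ingredients of the entropy- and Fisher-information-based dispersion measures discussed in the paper; boundary bias of the KDE near zero is one reason a penalized likelihood density estimate is preferable for positive random variables.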

【 License 】

CC BY   
© 2012 by the authors; licensee MDPI, Basel, Switzerland.

【 Preview 】
Attachments
File                     Size   Format  View
RO202003190043000ZK.pdf  288KB  PDF     download
Document Metrics
Downloads: 9    Views: 15