Journal Article Details
Entropy
Minimum Error Entropy Algorithms with Sparsity Penalty Constraints
Zongze Wu [1], Siyuan Peng [1], Wentao Ma [2], Jose C. Principe [2], Badong Chen [2]
[1] School of Electronic and Information Engineering, South China University of Technology, Guangzhou 510640, China; [2] School of Electronic and Information Engineering, Xi'an Jiaotong University, Xi'an 710049, China
Keywords: sparse estimation; minimum error entropy; correntropy induced metric; mean square convergence; impulsive noise
DOI: 10.3390/e17053419
Source: DOAJ
【 Abstract 】

Recently, sparse adaptive learning algorithms have been developed to exploit system sparsity and to mitigate various noise disturbances in many applications. In particular, in sparse channel estimation, a parameter vector with a sparse structure can be estimated well from noisy measurements by a sparse adaptive filter. Most previous works use a mean square error (MSE) based cost to develop sparse filters, which is reasonable under the assumption of Gaussian noise. However, the Gaussian assumption does not always hold in real-world environments. To address this issue, we incorporate an l1-norm or a reweighted l1-norm into the minimum error entropy (MEE) criterion to develop new sparse adaptive filters, which may perform much better than MSE based methods, especially in heavy-tailed non-Gaussian situations, since the error entropy captures higher-order statistics of the errors. In addition, a new approximator of the l0-norm, based on the correntropy induced metric (CIM), is also used as a sparsity penalty term (SPT). We analyze the mean square convergence of the proposed sparse adaptive filters: an energy conservation relation is derived, and a sufficient condition that ensures mean square convergence is obtained. Simulation results confirm the superior performance of the new algorithms.

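To make the idea concrete, the following is a minimal Python sketch of one stochastic update of an MEE-based sparse adaptive filter. It is not taken from the paper: the function name, step sizes, kernel widths, and the exact forms of the penalty gradients (a zero-attracting l1 term and an assumed CIM-style l0 approximation) are illustrative assumptions. The update ascends the quadratic information potential of a window of errors (equivalent to reducing the error entropy) and subtracts the gradient of a sparsity penalty.

```python
import numpy as np

def mee_sparse_update(w, X, d, eta=0.01, rho=1e-4, sigma=1.0,
                      penalty="l1", sigma_cim=0.05):
    """One illustrative stochastic update of an MEE-based sparse filter.

    w : (M,)   current weight estimate
    X : (L, M) window of the last L input vectors
    d : (L,)   corresponding desired outputs
    """
    e = d - X @ w                            # windowed errors e_i = d_i - w^T x_i
    de = e[:, None] - e[None, :]             # pairwise error differences e_i - e_j
    k = np.exp(-de ** 2 / (2 * sigma ** 2))  # Gaussian kernel on error pairs
    dX = X[:, None, :] - X[None, :, :]       # pairwise input differences x_i - x_j
    L = len(e)

    # Gradient of the quadratic information potential
    # V(e) = (1/L^2) * sum_{i,j} kappa_sigma(e_i - e_j) with respect to w;
    # ascending V corresponds to minimizing the error entropy.
    grad_V = ((k * de)[:, :, None] * dX).sum(axis=(0, 1)) / (L ** 2 * sigma ** 2)

    # Gradient of the sparsity penalty term (SPT).
    if penalty == "l1":
        grad_P = np.sign(w)                  # zero-attracting l1 term
    else:
        # Assumed CIM-based l0 approximation: pulls small weights toward
        # zero while leaving large weights almost untouched.
        grad_P = w * np.exp(-w ** 2 / (2 * sigma_cim ** 2)) / (len(w) * sigma_cim ** 2)

    return w + eta * grad_V - rho * grad_P
```

In practice this update would be applied at every step as the window slides over the incoming samples; a reweighted l1 penalty would replace np.sign(w) with sign(w) / (eps + |w|), which penalizes small coefficients more strongly than large ones.
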
【 License 】

Unknown   
