Journal Article Details
Entropy
Distributed Vector Quantization Based on Kullback-Leibler Divergence
Pengcheng Shen [2]  Chunguang Li [1]  Yiliang Luo [2]
[1] College of Information Science and Electronic Engineering, Zhejiang University, Hangzhou 310027, China
Keywords: distributed signal processing;    Kullback-Leibler divergence;    sensor network;    vector quantization;
DOI: 10.3390/e17127851
Source: MDPI
【 Abstract 】

The goal of vector quantization is to use a few reproduction vectors to represent original vectors/data while maintaining the necessary fidelity of the data. Distributed signal processing has received much attention in recent years, since in many applications data are collected/stored dispersedly in distributed nodes over networks, but centralizing all these data at one processing center is sometimes impractical. In this paper, we develop a distributed vector quantization (VQ) algorithm based on Kullback-Leibler (K-L) divergence. We start from the centralized case and propose to minimize the K-L divergence between the distribution of the global original data and the distribution of the global reproduction vectors, and then obtain an online iterative solution to this optimization problem based on Robbins-Monro stochastic approximation. Afterwards, we extend the solution to distributed cases by introducing diffusion cooperation among nodes. Numerical simulations show that the performance of the distributed K-L–based VQ algorithm is very close to that of the corresponding centralized algorithm. Moreover, both the centralized and distributed K-L–based VQ algorithms show more robustness to outliers than the (centralized) Linde-Buzo-Gray (LBG) algorithm and the (centralized) self-organizing map (SOM) algorithm.
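The abstract describes an online iterative VQ update derived from Robbins-Monro stochastic approximation. As a rough illustration of that style of update, the sketch below implements a generic competitive-learning rule: each incoming sample moves its nearest reproduction vector toward it with a decaying step size. The function name `online_vq` and the step-size schedule are assumptions for illustration; this is not the paper's exact K-L divergence update.

```python
import numpy as np

def online_vq(data, k, lr0=0.1, decay=0.01, seed=0):
    """Toy online vector quantization (competitive learning).

    Each sample pulls its nearest reproduction vector toward it with a
    Robbins-Monro-style decaying step size. Illustrative only, not the
    paper's K-L-divergence-based update.
    """
    rng = np.random.default_rng(seed)
    # Initialize the codebook from k randomly chosen data points.
    codebook = data[rng.choice(len(data), size=k, replace=False)].astype(float)
    for t, x in enumerate(data, start=1):
        # Nearest reproduction vector (Euclidean distance).
        j = np.argmin(np.linalg.norm(codebook - x, axis=1))
        # Decaying step size, as required for stochastic approximation.
        eta = lr0 / (1.0 + decay * t)
        # Move the winning vector toward the sample.
        codebook[j] += eta * (x - codebook[j])
    return codebook
```

On well-separated clustered data, the returned codebook settles near the cluster centers; the distributed variant in the paper would additionally diffuse such updates among neighboring nodes.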

【 License 】

CC BY   
© 2015 by the authors; licensee MDPI, Basel, Switzerland.

【 Preview 】
Attachment list
File Size Format
RO202003190002587ZK.pdf 1713 KB PDF
Article metrics
Downloads: 7  Views: 19