Technical Report Details
A Kullback-Leibler Divergence Based Kernel for SVM Classification in Multimedia
Moreno, Pedro J. ; Ho, Purdy P. ; Vasconcelos, Nuno
HP Development Company
Keywords: support vector machine; SVM; speaker identification; speaker verification; KL divergence; Kullback-Leibler divergence; probabilistic distance kernels; multimedia
Report ID: HPL-2004-4
Subject Classification: Computer Science (General)
United States | English
Source: HP Labs
Abstract

In recent years, significant efforts have been made to develop kernels that can be applied to sequence data such as DNA, text, speech, video, and images. The Fisher kernel and similar variants have been suggested as good ways to combine an underlying generative model in the feature space with discriminant classifiers such as SVMs. In this paper we suggest an alternative to the Fisher kernel for systematically finding kernel functions that naturally handle variable-length sequence data in multimedia domains. In particular, for domains such as speech and images, we explore kernel functions that take full advantage of well-known probabilistic models such as Gaussian mixtures and single full-covariance Gaussian models. We derive a kernel distance based on the Kullback-Leibler (KL) divergence between generative models. In effect, our approach combines the best of both generative and discriminative methods and replaces the standard SVM kernels. We perform experiments on speaker identification/verification and image classification tasks and show that these new kernels achieve the best performance in speaker verification and mostly outperform the Fisher-kernel-based SVMs and the generative classifiers in speaker identification and image classification.

Notes: This article was published in Advances in Neural Information Processing Systems 16, MIT Press. 9 pages.
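The abstract's central idea, a kernel built from the KL divergence between generative models fit to each sequence, can be illustrated for the single full-covariance Gaussian case, where the KL divergence has a closed form. The sketch below is illustrative, not the paper's implementation: the function names and the scale parameter `a` are assumptions, and the kernel is taken as the exponentiated symmetric KL divergence.

```python
import numpy as np

def kl_gaussian(mu0, cov0, mu1, cov1):
    """Closed-form KL divergence D(N0 || N1) between two
    multivariate Gaussians N0=(mu0, cov0) and N1=(mu1, cov1)."""
    d = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    trace_term = np.trace(cov1_inv @ cov0)          # tr(S1^-1 S0)
    quad_term = diff @ cov1_inv @ diff              # (m1-m0)^T S1^-1 (m1-m0)
    logdet_term = np.log(np.linalg.det(cov1) / np.linalg.det(cov0))
    return 0.5 * (trace_term + quad_term - d + logdet_term)

def kl_kernel(mu0, cov0, mu1, cov1, a=1.0):
    """Kernel value from the symmetrized KL divergence between the
    two Gaussian models; 'a' is an assumed scale hyperparameter."""
    d_sym = (kl_gaussian(mu0, cov0, mu1, cov1)
             + kl_gaussian(mu1, cov1, mu0, cov0))
    return np.exp(-a * d_sym)
```

In use, one Gaussian would be fit per variable-length sequence (e.g. per speaker's frames), so sequences of different lengths are compared through fixed-size model parameters. Identical models give a kernel value of 1, and the value decays toward 0 as the models diverge.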
