NEUROCOMPUTING | Volume 167
Metric and non-metric proximity transformations at linear costs
Article
Gisbrecht, Andrej [1]; Schleif, Frank-Michael [2]
[1] Univ Bielefeld, Ctr Excellence, Theoret Comp Sci, D-33615 Bielefeld, Germany
[2] Univ Birmingham, Sch Comp Sci, Birmingham B15 2TT, W Midlands, England
Keywords: Dissimilarity learning; Linear eigenvalue correction; Nystrom approximation; Double centering; Pseudo-Euclidean; Indefinite kernel
DOI: 10.1016/j.neucom.2015.04.017
Source: Elsevier
【 Abstract 】
Domain-specific (dis-)similarity or proximity measures, used e.g. in alignment algorithms for sequence data, are popular for analyzing complicated data objects and capturing domain-specific data properties. Without an underlying vector space, these data are given only as pairwise (dis-)similarities. The few available methods for such data focus largely on similarities and do not scale to large datasets. Kernel methods are very effective for metric similarity matrices, also at large scale, but costly transformations are necessary when starting from non-metric (dis-)similarities. We propose an integrative combination of Nystrom approximation, double centering (where required) and eigenvalue correction to obtain valid kernel matrices at costs linear in the number of samples. The proposed approach makes effective kernel methods accessible for such data. Experiments with several larger (dis-)similarity datasets show that the proposed method achieves much better runtime performance than the standard strategy while keeping competitive model accuracy. The main contribution is an efficient and accurate technique to convert (potentially non-metric) large-scale dissimilarity matrices into approximated positive semi-definite kernel matrices at linear cost. (C) 2015 Elsevier B.V. All rights reserved.
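The abstract outlines a three-step pipeline: Nystrom approximation of the dissimilarity matrix, double centering expressed through the Nystrom factors, and an eigenvalue correction (e.g. clipping or flipping negative eigenvalues) to obtain a positive semi-definite kernel. The Python sketch below illustrates that general idea only, not the authors' exact formulation; it assumes that just the n x m landmark block `D_nm` and the m x m landmark block `D_mm` are available, and the function name `approx_psd_kernel` and all variable names are illustrative.

```python
# Minimal sketch (assumed interface, not the paper's reference implementation):
# Nystrom factors -> double centering on the factors -> eigenvalue correction.
# The full n x n matrix is never formed, so the cost stays linear in n.
import numpy as np


def approx_psd_kernel(D_nm, D_mm, correction="clip"):
    """Return factors (F, lam) such that K ~= F @ diag(lam) @ F.T is PSD.

    D_nm : (n, m) dissimilarities between all n samples and m landmarks
    D_mm : (m, m) dissimilarities among the landmarks
    correction : 'clip' zeroes negative eigenvalues, 'flip' takes their absolute value
    """
    n, m = D_nm.shape

    # Nystrom approximation of the full dissimilarity matrix (kept in factor form):
    #   D ~= D_nm @ pinv(D_mm) @ D_nm.T
    W_pinv = np.linalg.pinv(D_mm)

    # Double centering S = -1/2 * J D J with J = I - (1/n) 1 1^T.
    # Applied to the Nystrom approximation it factorizes as
    #   S ~= -1/2 * (J D_nm) @ W_pinv @ (J D_nm).T
    C = D_nm - D_nm.mean(axis=0, keepdims=True)   # J @ D_nm: column-centered block

    # Eigenvalues of the low-rank S via a thin QR of C, O(n m^2):
    Q, R = np.linalg.qr(C)                          # C = Q @ R, Q is (n, m)
    M = -0.5 * R @ W_pinv @ R.T                     # small (m, m) core matrix
    M = 0.5 * (M + M.T)                             # enforce symmetry numerically
    lam, U = np.linalg.eigh(M)                      # S ~= (Q U) diag(lam) (Q U)^T

    # Eigenvalue correction so the approximated kernel is PSD.
    if correction == "clip":
        lam = np.maximum(lam, 0.0)
    elif correction == "flip":
        lam = np.abs(lam)

    return Q @ U, lam


# Usage example with landmark columns of a squared Euclidean distance matrix.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 5))
    landmarks = rng.choice(500, size=40, replace=False)
    D_nm = ((X[:, None, :] - X[None, landmarks, :]) ** 2).sum(-1)   # (500, 40)
    F, lam = approx_psd_kernel(D_nm, D_nm[landmarks])
    print("smallest corrected eigenvalue:", lam.min())  # >= 0 by construction
```

The thin QR keeps every step at most O(n m^2), i.e. linear in the number of samples n for fixed landmark count m, which is the scaling behaviour claimed in the abstract; the corrected kernel is returned in factor form so downstream kernel methods can use it without materializing an n x n matrix.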
【 License 】
Free
【 Preview 】
Files | Size | Format | View
---|---|---|---
10_1016_j_neucom_2015_04_017.pdf | 1128 KB | PDF | download