NEUROCOMPUTING | Volume: 179
Kernel learning over the manifold of symmetric positive definite matrices for dimensionality reduction in a BCI application
Article
Sadatnejad, Khadijeh [1]; Ghidary, Saeed Shiry [1]
[1] Amirkabir Univ Technol, Comp Engn & Informat Technol Dept, Tehran, Iran
Keywords: Brain computer interface; Nonlinear dimensionality reduction; Kernel learning; Riemannian geometry
DOI: 10.1016/j.neucom.2015.11.065
Source: Elsevier
【 Abstract 】
In this paper, we propose a kernel for nonlinear dimensionality reduction over the manifold of Symmetric Positive Definite (SPD) matrices in a Motor Imagery (MI)-based Brain Computer Interface (BCI) application. The proposed kernel, which is based on Riemannian geometry, aims to preserve the topology of data points in the feature space; topology preservation is the main challenge in nonlinear dimensionality reduction (NLDR). Our main idea is to decrease the non-Euclidean characteristics of the manifold by modifying the volume elements. We apply a conformal transform over a data-dependent isometric mapping to reduce the negative eigenfraction and thereby learn a data-dependent kernel over the Riemannian manifold. Multiple experiments were carried out using the proposed kernel for dimensionality reduction of the SPD matrices that describe the EEG signals of dataset IIa from BCI competition IV. The experiments show that this kernel adapts to the input data and leads to promising results in comparison with the most popular manifold learning methods and the Common Spatial Pattern (CSP) technique, a reference algorithm in BCI competitions. The proposed kernel is particularly strong in cases where the data points have a complex, nonlinearly separable distribution. (C) 2015 Elsevier B.V. All rights reserved.
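The abstract does not spell out the exact construction, but the pipeline it describes (Riemannian distances between SPD matrices, a data-dependent isometric mapping, a conformal correction that lowers the negative eigenfraction of the resulting kernel) can be illustrated with a minimal sketch. The sketch below assumes the affine-invariant Riemannian metric, an Isomap-style k-NN geodesic approximation, and a simple placeholder conformal rescaling (`conformal_rescale`, `alpha`); none of these specific choices are taken from the paper itself.

```python
# Minimal sketch of an Isomap-style kernel over SPD (covariance) matrices.
# Assumptions (not from the paper's text): affine-invariant Riemannian distance,
# a k-NN graph for the data-dependent isometric step, and an illustrative
# per-point conformal rescaling standing in for the paper's conformal transform.
import numpy as np
from scipy.linalg import eigh
from scipy.sparse.csgraph import shortest_path


def airm_distance(A, B):
    """Affine-invariant Riemannian distance between two SPD matrices."""
    # Generalized eigenvalues of (B, A) equal the eigenvalues of
    # A^{-1/2} B A^{-1/2}, so this matches ||log(A^{-1/2} B A^{-1/2})||_F.
    w = eigh(B, A, eigvals_only=True)
    return np.sqrt(np.sum(np.log(w) ** 2))


def graph_geodesics(dist, k=8):
    """Approximate manifold geodesics by shortest paths on a k-NN graph
    (the graph is assumed to be connected)."""
    n = dist.shape[0]
    graph = np.zeros((n, n))                   # zero entries = non-edges
    for i in range(n):
        idx = np.argsort(dist[i])[1:k + 1]     # k nearest neighbours, skip self
        graph[i, idx] = dist[i, idx]
    graph = np.maximum(graph, graph.T)         # symmetrize
    return shortest_path(graph, method="D", directed=False)


def kernel_from_distances(D):
    """Double-centered kernel K = -1/2 * H * D^2 * H (classical MDS/Isomap)."""
    n = D.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return -0.5 * H @ (D ** 2) @ H


def negative_eigenfraction(K):
    """Share of spectral mass carried by negative eigenvalues of K."""
    w = np.linalg.eigvalsh(K)
    return np.sum(np.abs(w[w < 0])) / np.sum(np.abs(w))


def conformal_rescale(D, alpha=0.5):
    """Illustrative conformal rescaling of distances by a per-point factor.
    This is only a placeholder for the conformal transform in the paper."""
    scale = D.mean(axis=1) ** alpha
    return D / np.sqrt(np.outer(scale, scale))


# Toy usage: random SPD matrices stand in for EEG trial covariance matrices.
rng = np.random.default_rng(0)
spd = [np.cov(rng.standard_normal((8, 200))) for _ in range(40)]
n = len(spd)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        D[i, j] = D[j, i] = airm_distance(spd[i], spd[j])

G = graph_geodesics(D, k=8)
K0 = kernel_from_distances(G)
K1 = kernel_from_distances(conformal_rescale(G))
print("NEF before conformal rescaling:", negative_eigenfraction(K0))
print("NEF after  conformal rescaling:", negative_eigenfraction(K1))
```

A lower negative eigenfraction means the double-centered matrix is closer to a valid positive semi-definite kernel, which is the property the conformal correction is meant to improve; the particular rescaling above is just one simple way to illustrate that effect.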
【 License 】
Free
【 Preview 】
| Files | Size | Format | View |
|---|---|---|---|
| 10_1016_j_neucom_2015_11_065.pdf | 949KB | PDF | download |