Journal article details
PATTERN RECOGNITION, Volume 77
Feature co-shrinking for co-clustering
Article
Tan, Qi [1]; Yang, Pei [2,3]; He, Jingrui [3]
[1] South China Normal Univ, Dept Comp Sci & Engn, Guangzhou 510630, Guangdong, Peoples R China
[2] South China Univ Technol, Dept Comp Sci & Engn, Guangzhou 510641, Guangdong, Peoples R China
[3] Arizona State Univ, Dept Comp Sci & Engn, Tempe, AZ 85281 USA
Keywords: Co-clustering; Non-negative matrix tri-factorization; Co-sparsity; Co-feature-selection
DOI: 10.1016/j.patcog.2017.12.005
Source: Elsevier
【 Abstract 】

Many real-world applications require multi-way feature selection rather than single-way feature selection. Multi-way feature selection is more challenging than single-way feature selection because of the inter-correlation among the multi-way features. To address this challenge, we propose a novel non-negative matrix tri-factorization model based on co-sparsity regularization to facilitate feature co-shrinking for co-clustering. The basic idea is to learn the inter-correlation among the multi-way features while shrinking the irrelevant ones by encouraging co-sparsity of the model parameters. The objective is to simultaneously minimize the loss function of the matrix tri-factorization and the co-sparsity regularization imposed on the model. Furthermore, we develop an efficient, convergence-guaranteed algorithm that solves the resulting non-smooth optimization problem in an iterative update fashion. Experimental results on various data sets demonstrate the effectiveness of the proposed approach. (C) 2017 Elsevier Ltd. All rights reserved.
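To make the general setup concrete, the sketch below shows a generic non-negative matrix tri-factorization X ≈ F S Gᵀ with a row-wise ℓ2,1 penalty on both F and G as a stand-in for the co-sparsity regularizer, solved by a simple projected gradient loop. This is only an illustration of the problem class described in the abstract: the function name tri_factorize, the choice of the ℓ2,1 norm, the learning rate, and the optimizer are assumptions, not the paper's actual objective or its convergence-guaranteed update rules.

# Minimal, hypothetical sketch: non-negative matrix tri-factorization
# X ~= F S G^T with a row-wise l2,1 ("co-sparsity"-style) penalty on F and G,
# optimized by projected gradient descent. Illustrative only; the paper's
# formulation and updates may differ.
import numpy as np

def tri_factorize(X, k_rows=5, k_cols=5, lam=0.1, lr=1e-3, n_iter=500, seed=0):
    rng = np.random.default_rng(seed)
    n, m = X.shape
    F = rng.random((n, k_rows))        # row-cluster indicator-like factor
    S = rng.random((k_rows, k_cols))   # block association matrix
    G = rng.random((m, k_cols))        # column-cluster indicator-like factor

    def l21_grad(M, eps=1e-8):
        # (sub)gradient of the l2,1 norm: each row divided by its l2 norm
        norms = np.sqrt((M ** 2).sum(axis=1, keepdims=True)) + eps
        return M / norms

    for _ in range(n_iter):
        R = F @ S @ G.T - X            # residual of the tri-factorization
        grad_F = R @ G @ S.T + lam * l21_grad(F)
        grad_S = F.T @ R @ G
        grad_G = R.T @ F @ S + lam * l21_grad(G)
        # projected gradient step: keep all factors non-negative
        F = np.maximum(F - lr * grad_F, 0.0)
        S = np.maximum(S - lr * grad_S, 0.0)
        G = np.maximum(G - lr * grad_G, 0.0)
    return F, S, G

# Usage: co-cluster a small non-negative matrix
X = np.abs(np.random.default_rng(1).random((60, 40)))
F, S, G = tri_factorize(X)
row_clusters = F.argmax(axis=1)   # row (sample) cluster assignments
col_clusters = G.argmax(axis=1)   # column (feature) cluster assignments

In this sketch, rows of G whose values are shrunk toward zero by the row-wise penalty correspond to features that contribute to no column cluster, which is the intuition behind feature co-shrinking; the paper pursues this jointly across both modes with its own co-sparsity regularizer and a dedicated non-smooth solver.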

【 License 】

Free   

【 Preview 】
Attachment list
Files Size Format
10_1016_j_patcog_2017_12_005.pdf 945 KB PDF