Journal Article Details
Sensors
Stochastic Recursive Gradient Support Pursuit and Its Sparse Representation Applications
Hongying Liu [1], Yuanyuan Liu [1], Shuang Wang [1], Bingkun Wei [1], Licheng Jiao [1], Fanhua Shang [1]
[1] Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education, School of Artificial Intelligence, Xidian University, Xi’an 710071, China;
Keywords: sparse learning; hard thresholding; stochastic optimization; variance reduction
DOI: 10.3390/s20174902
Source: DOAJ
【 Abstract 】

In recent years, a series of matching pursuit and hard thresholding algorithms have been proposed to solve the sparse representation problem with an ℓ0-norm constraint. In addition, some stochastic hard thresholding methods have been proposed, such as stochastic gradient hard thresholding (SG-HT) and stochastic variance reduced gradient hard thresholding (SVRGHT). However, each iteration of these algorithms requires one hard thresholding operation, which leads to high per-iteration complexity and slow convergence, especially for high-dimensional problems. To address this issue, we propose a new stochastic recursive gradient support pursuit (SRGSP) algorithm, in which only one hard thresholding operation is required in each outer iteration. Thus, SRGSP has a significantly lower computational complexity than existing methods such as SG-HT and SVRGHT. Moreover, we provide a convergence analysis of SRGSP, which shows that it attains a linear convergence rate. Our experimental results on large-scale synthetic and real-world datasets verify that SRGSP outperforms state-of-the-art related methods on various sparse representation problems. We also conduct extensive experiments on two real-world sparse representation applications, image denoising and face recognition, and the results confirm that SRGSP achieves better performance than other sparse representation learning optimization methods in terms of PSNR and recognition rate.
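To make the algorithmic idea in the abstract concrete, the following Python sketch illustrates one plausible reading of SRGSP for a least-squares sparse recovery objective: a SARAH-style recursive stochastic gradient inner loop, with a single hard-thresholding step per outer iteration. The function and parameter names (srgsp_sketch, eta, inner_iters) and the least-squares debias on the identified support are illustrative assumptions based on the abstract, not the authors' implementation.

```python
import numpy as np


def hard_threshold(x, k):
    """Keep the k largest-magnitude entries of x and zero the rest."""
    out = np.zeros_like(x)
    idx = np.argpartition(np.abs(x), -k)[-k:]
    out[idx] = x[idx]
    return out


def srgsp_sketch(A, b, k, outer_iters=20, inner_iters=100, eta=0.002, seed=0):
    """Sketch of an SRGSP-style loop for min_x ||Ax - b||^2 / (2n), s.t. ||x||_0 <= k.

    The inner loop uses a SARAH-style recursive stochastic gradient estimator;
    hard thresholding is applied only once per outer iteration, followed by a
    least-squares refit on the identified support (an assumed refinement step).
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(outer_iters):
        # Full gradient at the current iterate (outer snapshot).
        v = A.T @ (A @ x - b) / n
        w_prev = x.copy()
        w = x - eta * v
        for _ in range(inner_iters):
            i = rng.integers(n)
            # Recursive gradient update on a single random sample.
            v = A[i] * (A[i] @ w - b[i]) - A[i] * (A[i] @ w_prev - b[i]) + v
            w_prev = w
            w = w - eta * v
        # Single hard-thresholding (support identification) per outer iteration.
        x = hard_threshold(w, k)
        supp = np.flatnonzero(x)
        # Debias: solve least squares restricted to the selected support.
        x[:] = 0.0
        x[supp] = np.linalg.lstsq(A[:, supp], b, rcond=None)[0]
    return x


# Tiny synthetic usage example: recover a 5-sparse signal from noisy measurements.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((500, 100))
    x_true = np.zeros(100)
    x_true[:5] = rng.standard_normal(5)
    b = A @ x_true + 0.01 * rng.standard_normal(500)
    x_hat = srgsp_sketch(A, b, k=5)
    print("support recovered:", np.flatnonzero(x_hat))
```

In this sketch, the per-iteration cost inside the inner loop is a single stochastic gradient update; the hard-thresholding and support refit are deferred to the end of the outer iteration, which is the cost saving the abstract attributes to SRGSP relative to SG-HT and SVRGHT.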

【 License 】

Unknown   
