Journal Article Details
JOURNAL OF MULTIVARIATE ANALYSIS, Volume 136
Parametric and semiparametric reduced-rank regression with flexible sparsity
Article
Lian, Heng1,3  Feng, Sanying2  Zhao, Kaifeng1 
[1] Nanyang Technol Univ, Div Math Sci, Singapore 637371, Singapore
[2] Beijing Univ Technol, Coll Appl Sci, Beijing 100124, Peoples R China
[3] Univ New S Wales, Sch Math & Stat, Sydney, NSW 2052, Australia
Keywords: Additive models; Oracle inequality; Reduced-rank regression; Sparse group lasso
DOI: 10.1016/j.jmva.2015.01.013
Source: Elsevier
Abstract

We consider joint rank and variable selection in multivariate regression. Previously proposed approaches to joint rank and variable selection assume that the different responses are related to the same set of variables, which suggests a group penalty on the rows of the coefficient matrix. This assumption may not hold in practice, however, which motivates the usual lasso (ℓ1) penalty on the coefficient matrix. To solve the resulting optimization problem, we propose to use the gradient-proximal algorithm, a recent development in optimization. We also present theoretical results for the proposed estimator with the ℓ1 penalty. We then consider several extensions, including the adaptive lasso penalty, the sparse group penalty, and additive models. The proposed methodology thus offers a much more complete set of tools for high-dimensional multivariate regression. Finally, we present numerical illustrations based on simulated and real data sets. (C) 2015 Elsevier Inc. All rights reserved.
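For orientation only: the "gradient-proximal algorithm" mentioned in the abstract belongs to the family of proximal-gradient (forward-backward) methods. The sketch below, in Python with NumPy, illustrates that generic idea applied to the ℓ1-penalized multivariate least-squares part of the problem; it is not the authors' joint rank-and-sparsity algorithm, and the function names (proximal_gradient_lasso, soft_threshold), the step-size choice, and the simulated dimensions are illustrative assumptions.

    import numpy as np

    def soft_threshold(A, tau):
        """Entrywise soft-thresholding: the proximal operator of tau * ||.||_1."""
        return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

    def proximal_gradient_lasso(X, Y, lam, n_iter=500):
        """Minimize (1/(2n)) ||Y - X B||_F^2 + lam ||B||_1 over the coefficient
        matrix B by plain proximal-gradient (ISTA) iterations."""
        n, p = X.shape
        q = Y.shape[1]
        B = np.zeros((p, q))
        # Step size 1/L, where L = sigma_max(X)^2 / n is the Lipschitz
        # constant of the gradient of the smooth least-squares term.
        step = n / np.linalg.norm(X, 2) ** 2
        for _ in range(n_iter):
            grad = X.T @ (X @ B - Y) / n                      # gradient step
            B = soft_threshold(B - step * grad, step * lam)   # proximal step
        return B

    # Usage on simulated data (hypothetical dimensions).
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 20))
    B_true = np.zeros((20, 5)); B_true[:3, :2] = 1.0
    Y = X @ B_true + 0.1 * rng.standard_normal((100, 5))
    B_hat = proximal_gradient_lasso(X, Y, lam=0.05)

The design point this illustrates is that the nonsmooth ℓ1 penalty is handled entirely through its proximal operator (entrywise soft-thresholding), so each iteration costs only a gradient step on the smooth loss plus a thresholding pass.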

License

Free   

Preview
Attachment list
File                            Size    Format
10_1016_j_jmva_2015_01_013.pdf  480 KB  PDF