Dissertation Details
Topics on Reduced Rank Methods for Multivariate Regression.
Keywords: Reduced Rank Regression; Statistics and Numeric Data; Science; Statistics
Mukherjee, Ashin; Shedden, Kerby A.
University of Michigan
Others: https://deepblue.lib.umich.edu/bitstream/handle/2027.42/99837/ashinm_1.pdf?sequence=1&isAllowed=y
Switzerland | English
Source: The Illinois Digital Environment for Access to Learning and Scholarship
PDF
【 Abstract 】

Multivariate regression generalizes univariate regression to the case where we are interested in predicting q (> 1) responses that depend on the same set of features or predictors. Problems of this type are encountered in many quantitative fields, such as bioinformatics, chemometrics, economics and engineering. The main goal is to build more accurate and interpretable models that can exploit the dependence structure among the responses and achieve appropriate dimension reduction. Reduced rank regression has been an important tool to this end due to its simplicity, computational efficiency and superior predictive performance. In the first part of this thesis we propose a reduced rank ridge regression method, which is robust against collinearity among the predictors. It also allows us to extend the solution to the reproducing kernel Hilbert space (RKHS) setting. The second part studies the effective degrees of freedom for a general class of reduced rank estimators in the framework of Stein's unbiased risk estimation (SURE). A finite-sample exact unbiased estimator is derived that admits a closed-form solution. This can be used to compute popular model selection criteria such as BIC, Mallows' Cp or GCV, which provide a principled way of selecting the optimal rank. The proposed estimator differs significantly from the number of free parameters in the model, which is often taken as a heuristic estimate of the degrees of freedom of an estimation procedure. The final part deals with a non-parametric extension of reduced rank regression in the high-dimensional setting, where many of the predictors might be non-informative. We propose a two-step penalized regression approach based on spline approximations that encourages both variable selection and rank reduction. We prove rank selection consistency and also provide error bounds for the proposed method.
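As background for the methods summarized above, the classical (unpenalized) reduced rank regression estimator can be obtained by fitting ordinary least squares and then projecting the fitted values onto their leading singular directions. The sketch below is a minimal illustration of that textbook construction, not the thesis's own ridge or spline-based extensions; the function name and dimensions are hypothetical.

```python
import numpy as np

def reduced_rank_regression(X, Y, rank):
    """Minimal sketch of classical reduced rank regression.

    Fit the OLS coefficient matrix, then project it onto the top
    `rank` right singular vectors of the fitted response matrix,
    which gives the rank-constrained least squares solution.
    """
    # Ordinary least squares: all q responses share the same design X
    B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
    # SVD of the fitted values; keep the leading right singular vectors
    _, _, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
    V_r = Vt[:rank].T                # q x rank
    # Project the OLS coefficients onto the rank-r response subspace
    return B_ols @ V_r @ V_r.T      # p x q matrix of rank <= rank

# Toy example with a true rank-2 coefficient matrix (synthetic data)
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
B_true = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 4))
Y = X @ B_true + 0.1 * rng.standard_normal((100, 4))
B_hat = reduced_rank_regression(X, Y, rank=2)
```

The thesis's first contribution replaces the OLS step with a ridge-penalized fit, which keeps this construction stable when the columns of X are collinear.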

【 Preview 】
Attachment List
Files | Size | Format | View
Topics on Reduced Rank Methods for Multivariate Regression. | 896KB | PDF | download
Document metrics
Downloads: 15  Views: 12