Journal Article Details
Journal of Multivariate Analysis, Vol. 173
Rank reduction for high-dimensional generalized additive models
Article
Lin, Hongmei [1]; Lian, Heng [2]; Liang, Hua [3]
[1] Shanghai Univ Int Business & Econ, Sch Stat & Informat, Shanghai, Peoples R China
[2] City Univ Hong Kong, Dept Math, Kowloon Tong, Hong Kong, Peoples R China
[3] George Washington Univ, Dept Stat, Washington, DC 20052 USA
Keywords: Asymptotic normality; B-splines; Latent functions; Logistic regression
DOI: 10.1016/j.jmva.2019.05.005
Source: Elsevier
【 Abstract 】

When a regression problem contains multiple predictors, additive models avoid the difficulty of fitting multivariate functions while retaining some of the model's nonlinearity. When the dimension is high, the need to estimate a large number of functions, even though each is univariate, raises concerns about statistical efficiency. We propose a rank reduction approach that assumes all functions share a small common set of latent functions, which allows borrowing of information across a large number of functions. The idea is general and could be used in any model with a large number of functions to estimate, but here we restrict our attention to generalized additive models, especially logistic models, which can deal with discrete responses and are useful for classification. Numerical results are reported to illustrate the finite-sample performance of the estimator. We also establish an improved convergence rate for the rank reduction approach compared to the standard estimator, and extend it to sparse modeling to deal with an even larger number of predictors. (C) 2019 Elsevier Inc. All rights reserved.
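To make the latent-function idea concrete, here is a minimal numpy sketch of a rank-reduced additive logistic model. All dimensions, the polynomial basis (standing in for the paper's B-splines), and the plain gradient fit are illustrative assumptions, not the authors' estimator: each component function f_j is a linear combination of r shared latent functions, so the p-by-K basis-coefficient matrix C factors as C = U V.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, K, r = 400, 10, 6, 2  # samples, predictors, basis size, rank (illustrative)

def basis(x):
    """Polynomial basis evaluated at x (stand-in for a B-spline basis)."""
    return np.vander(x, K, increasing=True)  # shape (len(x), K)

# Simulate a rank-r truth: f_j(x) = basis(x) @ C_true[j], with C_true of rank r,
# i.e. every f_j is a mixture of the same r latent functions.
X = rng.uniform(-1.0, 1.0, size=(n, p))
C_true = rng.normal(size=(p, r)) @ rng.normal(size=(r, K))
B = np.stack([basis(X[:, j]) for j in range(p)])  # (p, n, K)
eta = np.einsum('jnk,jk->n', B, C_true)           # additive predictor sum_j f_j(x_ij)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-eta)))   # binary response

def logloss(eta_hat):
    return np.mean(np.log1p(np.exp(-(2 * y - 1) * eta_hat)))

# Rank-reduced fit: gradient descent on the factors (U, V) of C = U V.
# (The paper uses its own spline-based estimator; this is just a sketch.)
U = 0.1 * rng.normal(size=(p, r))
V = 0.1 * rng.normal(size=(r, K))
loss_start = logloss(np.einsum('jnk,jk->n', B, U @ V))
lr = 0.05
for _ in range(2000):
    eta_hat = np.einsum('jnk,jk->n', B, U @ V)
    resid = 1.0 / (1.0 + np.exp(-eta_hat)) - y    # d(logloss)/d(eta)
    G = np.einsum('jnk,n->jk', B, resid) / n      # gradient w.r.t. C
    U, V = U - lr * G @ V.T, V - lr * U.T @ G     # chain rule through C = U V
loss_end = logloss(np.einsum('jnk,jk->n', B, U @ V))
print(f"logistic loss: {loss_start:.3f} -> {loss_end:.3f}")
```

The payoff of the factorization is parsimony: the fit estimates p*r + r*K numbers instead of p*K, which is the source of the efficiency gain the abstract describes when p is large and r is small.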

【 License 】

Free   

【 Preview 】
Attachments
10_1016_j_jmva_2019_05_005.pdf (425 KB, PDF)