Journal Article Details
JOURNAL OF COMPUTATIONAL PHYSICS, Vol. 374
Gradient-based optimization for regression in the functional tensor-train format
Article
Gorodetsky, Alex A. [1]; Jakeman, John D. [2]
[1] Univ Michigan, 3053 FXB, 1320 Beal Ave, Ann Arbor, MI 48109 USA
[2] Sandia Natl Labs, Optimizat & Uncertainty Quantificat, Albuquerque, NM 87123 USA
Keywords: Tensors; Regression; Function approximation; Uncertainty quantification; Alternating least squares; Stochastic gradient descent
DOI: 10.1016/j.jcp.2018.08.010
Source: Elsevier
【 Abstract 】

Predictive analysis of complex computational models, such as uncertainty quantification (UQ), must often rely on using an existing database of simulation runs. In this paper we consider the task of performing low-multilinear-rank regression on such a database. Specifically, we develop and analyze an efficient gradient computation that enables gradient-based optimization procedures, including stochastic gradient descent and quasi-Newton methods, for learning the parameters of a functional tensor-train (FT). We compare our algorithms with 22 other nonparametric and parametric regression methods on 10 real-world data sets and show that for many physical systems, exploiting low-rank structure facilitates efficient construction of surrogate models. We use a number of synthetic functions to build insight into the behavior of our algorithms, including the rank-adaptation and group-sparsity regularization procedures that we developed to reduce overfitting. Finally, we conclude the paper by building a surrogate of a physical model of a propulsion plant on a naval vessel. (C) 2018 Elsevier Inc. All rights reserved.
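
To make the idea in the abstract concrete, the following is a minimal, self-contained Python/NumPy sketch of stochastic gradient descent for regression with a functional tensor-train whose cores are univariate polynomial expansions. It is an illustration of the general approach only, not the authors' implementation: the names (legendre_basis, ft_eval_and_grad, sgd_fit), the Legendre basis, the fixed ranks, and the learning-rate and epoch choices are assumptions made for this example, and the paper's rank-adaptation and group-sparsity regularization are omitted.

# Hypothetical illustration (not the paper's code): fit a functional tensor-train
# (FT) by stochastic gradient descent on squared error.  Core k holds coefficients
# of shape (r_k, n_basis, r_{k+1}) for a univariate Legendre expansion, r_0 = r_d = 1.
import numpy as np


def legendre_basis(x, n_basis):
    # First n_basis Legendre polynomials at scalar x in [-1, 1] (three-term recurrence).
    vals = np.empty(n_basis)
    vals[0] = 1.0
    if n_basis > 1:
        vals[1] = x
    for n in range(1, n_basis - 1):
        vals[n + 1] = ((2 * n + 1) * x * vals[n] - n * vals[n - 1]) / (n + 1)
    return vals


def ft_eval_and_grad(cores, x):
    # Evaluate f(x) = G_1(x_1) ... G_d(x_d) and the gradient of f w.r.t. every core:
    # d f / d cores[k][a, j, b] = left_k[a] * phi_j(x_k) * right_k[b], where left_k and
    # right_k are the partial products of the core matrices to the left / right of k.
    d = len(cores)
    phis = [legendre_basis(x[k], cores[k].shape[1]) for k in range(d)]
    mats = [np.einsum('ajb,j->ab', cores[k], phis[k]) for k in range(d)]

    left = [np.ones((1, 1))]                      # left[k]: product of mats[0..k-1]
    for k in range(d - 1):
        left.append(left[-1] @ mats[k])
    right = [np.ones((1, 1))] * d                 # right[k]: product of mats[k+1..d-1]
    for k in range(d - 2, -1, -1):
        right[k] = mats[k + 1] @ right[k + 1]

    value = (left[-1] @ mats[-1] @ right[-1])[0, 0]
    grads = [np.einsum('a,j,b->ajb', left[k][0], phis[k], right[k][:, 0])
             for k in range(d)]
    return value, grads


def sgd_fit(X, y, ranks, n_basis, lr=0.02, epochs=200, seed=0):
    # Plain SGD on 0.5 * (f(x_i) - y_i)^2; ranks has length d - 1.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    r = [1] + list(ranks) + [1]
    cores = [0.1 * rng.standard_normal((r[k], n_basis, r[k + 1])) for k in range(d)]
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            f, grads = ft_eval_and_grad(cores, X[i])
            resid = f - y[i]
            for k in range(d):
                cores[k] -= lr * resid * grads[k]
    return cores


# Toy usage on a low-rank target in d = 3 dimensions.
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(500, 3))
y = np.sin(X[:, 0]) * X[:, 1] + X[:, 2] ** 2
cores = sgd_fit(X, y, ranks=[2, 2], n_basis=5)
pred = np.array([ft_eval_and_grad(cores, x)[0] for x in X])
print("train RMSE:", np.sqrt(np.mean((pred - y) ** 2)))

Because the per-sample gradient reuses the left and right partial products from a single forward sweep, each SGD step costs roughly as much as one extra FT evaluation; more elaborate procedures such as rank adaptation or group-sparsity penalties would build on top of this basic loop.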

【 License 】

Free   

【 Preview 】
Attachment List
Files Size Format View
10_1016_j_jcp_2018_08_010.pdf 783KB PDF download