Journal article details
JOURNAL OF COMPUTATIONAL PHYSICS, Volume 384
Neural-net-induced Gaussian process regression for function approximation and PDE solution
Article
Pang, Guofei [1]; Yang, Liu [1]; Karniadakis, George E. M. [1]
[1] Brown Univ, Div Appl Math, Providence, RI 02912 USA
Keywords: NN-induced Gaussian process; Neural network; Machine learning; Partial differential equation; Uncertainty quantification
DOI: 10.1016/j.jcp.2019.01.045
Source: Elsevier
【 Abstract 】

Neural-net-induced Gaussian process (NNGP) regression inherits both the high expressivity of deep neural networks (deep NNs) and the uncertainty quantification property of Gaussian processes (GPs). We generalize the current NNGP to first include a larger number of hyperparameters and subsequently train the model by maximum likelihood estimation. Unlike previous works on NNGP that targeted classification, here we apply the generalized NNGP to function approximation and to solving partial differential equations (PDEs). Specifically, we develop an analytical iteration formula to compute the covariance function of the GP induced by a deep NN with an error-function nonlinearity. We compare the performance of the generalized NNGP for function approximation and PDE solution with that of GPs and fully-connected NNs. We observe that for smooth functions the generalized NNGP yields the same order of accuracy as the GP, while both the NNGP and the GP outperform the deep NN. For non-smooth functions, the generalized NNGP is superior to the GP and comparable or superior to the deep NN. (C) 2019 Elsevier Inc. All rights reserved.
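The iteration formula itself is not reproduced on this record page. The sketch below is a minimal illustration, not the authors' code, of how an NNGP covariance with an error-function nonlinearity is commonly built up layer by layer, using the standard closed-form Gaussian expectation of erf activations (Williams, 1997). The function name nngp_erf_kernel, the per-dimension scaling of the input kernel, and the hyperparameters sigma_w2 and sigma_b2 (weight and bias variances, which the paper proposes to tune by maximum likelihood) are illustrative assumptions.

```python
# Minimal sketch (assumed, not the paper's implementation) of an NNGP covariance
# for a deep fully-connected network with erf nonlinearity.
import numpy as np

def nngp_erf_kernel(X1, X2, depth=3, sigma_w2=1.0, sigma_b2=0.1):
    """Return the matrix K[i, j] = k(X1[i], X2[j]) of the GP induced by a
    deep erf network, computed by iterating the layer-wise recursion."""
    # Input-layer covariances: affine kernel sigma_b2 + sigma_w2 * <x, x'> / d
    # (dividing by the input dimension d is a common convention, assumed here).
    d = X1.shape[1]
    K12 = sigma_b2 + sigma_w2 * (X1 @ X2.T) / d
    K11 = sigma_b2 + sigma_w2 * np.sum(X1 * X1, axis=1) / d   # diagonal terms for X1
    K22 = sigma_b2 + sigma_w2 * np.sum(X2 * X2, axis=1) / d   # diagonal terms for X2

    for _ in range(depth):
        # Closed-form expectation of erf(u) erf(u') under a zero-mean bivariate
        # Gaussian with covariances (K11, K12, K22):
        #   E = (2/pi) * arcsin( 2*K12 / sqrt((1 + 2*K11) * (1 + 2*K22)) )
        denom = np.sqrt((1.0 + 2.0 * K11)[:, None] * (1.0 + 2.0 * K22)[None, :])
        E12 = (2.0 / np.pi) * np.arcsin(2.0 * K12 / denom)
        E11 = (2.0 / np.pi) * np.arcsin(2.0 * K11 / (1.0 + 2.0 * K11))
        E22 = (2.0 / np.pi) * np.arcsin(2.0 * K22 / (1.0 + 2.0 * K22))
        # Pre-activation covariances of the next layer
        K12 = sigma_b2 + sigma_w2 * E12
        K11 = sigma_b2 + sigma_w2 * E11
        K22 = sigma_b2 + sigma_w2 * E22
    return K12

# Example: kernel matrix for 5 random 1-D inputs; in GP regression this matrix
# would be used like any other covariance matrix (e.g. in the posterior mean).
X = np.random.rand(5, 1)
K = nngp_erf_kernel(X, X, depth=2)
```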

【 License 】

Free   

【 Preview 】
Attachment list
File: 10_1016_j_jcp_2019_01_045.pdf (2749 KB, PDF)