| JOURNAL OF COMPUTATIONAL PHYSICS | Volume: 441 |
| SelectNet: Self-paced learning for high-dimensional partial differential equations | |
| Article | |
| Gu, Yiqi1  Yang, Haizhao2  Zhou, Chao3  | |
| [1] Natl Univ Singapore, Dept Math, 10 Lower Kent Ridge Rd, Singapore 119076, Singapore | |
| [2] Purdue Univ, Dept Math, W Lafayette, IN 47907 USA | |
| [3] City Univ Hong Kong, Dept Math, Hong Kong, Peoples R China | |
| Keywords: High-dimensional PDEs; Deep neural networks; Self-paced learning; Selected sampling; Least squares method; Convergence | |
| DOI: 10.1016/j.jcp.2021.110444 | |
| Source: Elsevier | |
【 Abstract 】
The least squares method with deep neural networks as function parametrization has been applied to solve certain high-dimensional partial differential equations (PDEs) successfully; however, its convergence is slow and might not be guaranteed, even within a simple class of PDEs. To improve the convergence of the network-based least squares model, we introduce a novel self-paced learning framework, SelectNet, which quantifies the difficulty of training samples, treats all samples equally in the early stage of training, and slowly explores more challenging samples, e.g., samples with larger residual errors, mimicking the human cognitive process for more efficient learning. In particular, a selection network and the PDE solution network are trained simultaneously; the selection network adaptively weights the training samples of the solution network, achieving the goal of self-paced learning. Numerical examples indicate that the proposed SelectNet model outperforms existing models in convergence speed and robustness, especially for low-regularity solutions. (c) 2021 Elsevier Inc. All rights reserved.
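The min-max training loop below is only a minimal sketch of the adversarial weighting idea summarized in the abstract, not the authors' implementation: the solution network minimizes a weighted PDE residual, the selection network is pushed toward up-weighting hard samples, and a penalty keeps the average weight near one. The Poisson-type residual, network sizes, weight bounds, and penalty coefficient are assumptions made for illustration only.

```python
# Hedged sketch of the SelectNet-style adversarial weighting idea (not the authors' code).
# Toy PDE assumed for illustration: -Laplace(u) = 0 on the unit cube; boundary loss omitted.
import torch

dim = 10  # spatial dimension of the toy problem

solution_net = torch.nn.Sequential(            # u_theta: candidate PDE solution
    torch.nn.Linear(dim, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1))
select_net = torch.nn.Sequential(              # phi_s: per-sample selection weights
    torch.nn.Linear(dim, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1), torch.nn.Sigmoid())

opt_u = torch.optim.Adam(solution_net.parameters(), lr=1e-3)
opt_s = torch.optim.Adam(select_net.parameters(), lr=1e-3)

def pde_residual(x):
    """Residual of -Laplace(u) at collocation points x, via the Hessian trace."""
    x = x.requires_grad_(True)
    u = solution_net(x)
    grad_u = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    lap = torch.zeros_like(u)
    for i in range(dim):                        # accumulate diagonal second derivatives
        lap = lap + torch.autograd.grad(grad_u[:, i].sum(), x,
                                        create_graph=True)[0][:, i:i+1]
    return -lap

for step in range(1000):
    x = torch.rand(256, dim)                    # uniform collocation samples
    w = 0.5 + select_net(x)                     # weights in [0.5, 1.5] (assumed bounds)
    r2 = pde_residual(x).pow(2)
    # Solution network minimizes the weighted residual (weights held fixed).
    loss_u = (w.detach() * r2).mean()
    # Selection network maximizes the weighted residual; the penalty keeps mean weight near 1.
    loss_s = -(w * r2.detach()).mean() + 10.0 * (w.mean() - 1.0).pow(2)
    opt_u.zero_grad()
    loss_u.backward()
    opt_u.step()
    opt_s.zero_grad()
    loss_s.backward()
    opt_s.step()
```

Alternating the two optimizers in this way roughly mirrors the self-paced schedule described in the abstract: at initialization the weights are close to uniform, and as training proceeds the selector concentrates weight on samples with larger residuals.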
【 License 】
Free
【 Preview 】
| Files | Size | Format | View |
|---|---|---|---|
| 10_1016_j_jcp_2021_110444.pdf | 2231 KB | PDF | |