Journal article details
PHYSICA D: NONLINEAR PHENOMENA, Volume 405
Time-series learning of latent-space dynamics for reduced-order model closure
Article
Maulik, Romit1  Mohan, Arvind2,4  Lusch, Bethany1  Madireddy, Sandeep3  Balaprakash, Prasanna1,3  Livescu, Daniel4 
[1] Argonne Natl Lab, Argonne Leadership Comp Facil, Lemont, IL 60439 USA
[2] Los Alamos Natl Lab, Ctr Nonlinear Studies, Los Alamos, NM 87545 USA
[3] Argonne Natl Lab, Math & Comp Sci Div, Lemont, IL 60439 USA
[4] Los Alamos Natl Lab, Comp Computat & Stat Sci Div, Los Alamos, NM 87545 USA
Keywords: ROMs;    LSTMs;    Neural ODEs;    Closures;
DOI  :  10.1016/j.physd.2020.132368
Source: Elsevier
【 Abstract 】

We study the performance of long short-term memory networks (LSTMs) and neural ordinary differential equations (NODEs) in learning latent-space representations of dynamical equations for an advection-dominated problem given by the viscous Burgers equation. Our formulation is devised in a nonintrusive manner with an equation-free evolution of dynamics in a reduced space, the latter obtained through a proper orthogonal decomposition. In addition, we leverage the sequential nature of learning in both LSTMs and NODEs to demonstrate their capability for closure in systems that are not completely resolved in the reduced space. We assess our hypothesis for two advection-dominated problems given by the viscous Burgers equation. We observe that both LSTMs and NODEs reproduce the effects of the absent scales for our test cases more effectively than intrusive dynamics evolution through a Galerkin projection. This result empirically suggests that time-series learning techniques implicitly leverage a memory kernel for coarse-grained system closure, as suggested by the Mori-Zwanzig formalism. (c) 2020 Elsevier B.V. All rights reserved.
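The reduced space referred to in the abstract is obtained via proper orthogonal decomposition (POD) of solution snapshots; the LSTM or NODE then learns the time evolution of the resulting latent coefficients. The following is a minimal sketch of that first step only (not the authors' code): computing a POD basis via the SVD and projecting snapshots onto it. The travelling-Gaussian snapshot data is hypothetical, standing in for an advection-dominated field.

```python
import numpy as np

def pod_basis(snapshots, r):
    """Return the leading r POD modes and the latent-space trajectory.

    snapshots: array of shape (n_points, n_times), one solution state per column.
    """
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    modes = U[:, :r]              # orthonormal spatial POD modes, (n_points, r)
    coeffs = modes.T @ snapshots  # latent coefficients a(t), shape (r, n_times)
    return modes, coeffs, s

# Illustrative snapshot data: a Gaussian pulse advecting across the domain
# (hypothetical surrogate for an advection-dominated solution field).
x = np.linspace(0.0, 1.0, 200)
t = np.linspace(0.0, 1.0, 100)
snaps = np.array([np.exp(-100.0 * (x - 0.2 - 0.5 * ti) ** 2) for ti in t]).T

modes, coeffs, s = pod_basis(snaps, r=10)
recon = modes @ coeffs            # rank-10 reconstruction of the snapshots
rel_err = np.linalg.norm(snaps - recon) / np.linalg.norm(snaps)
```

In the nonintrusive setting of the paper, the columns of `coeffs` would form the training time series for the LSTM or NODE; the intrusive alternative (Galerkin projection) instead evolves these coefficients by projecting the governing equations onto `modes`.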

【 License 】

Free   

【 Preview 】
Attachment list
Files Size Format View
10_1016_j_physd_2020_132368.pdf 4078KB PDF download
Document metrics
Downloads: 3    Views: 0