Journal Article Details
JOURNAL OF COMPUTATIONAL PHYSICS, Volume 357
Hidden physics models: Machine learning of nonlinear partial differential equations
Article
Raissi, Maziar [1]; Karniadakis, George Em [1]
[1] Brown Univ, Div Appl Math, Providence, RI 02912 USA
Keywords: Probabilistic machine learning; System identification; Bayesian modeling; Uncertainty quantification; Fractional equations; Small data
DOI: 10.1016/j.jcp.2017.11.039
Source: Elsevier
【 Abstract 】

While there is currently a lot of enthusiasm about big data, useful data is usually small and expensive to acquire. In this paper, we present a new paradigm of learning partial differential equations from small data. In particular, we introduce hidden physics models, which are essentially data-efficient learning machines capable of leveraging the underlying laws of physics, expressed by time-dependent and nonlinear partial differential equations, to extract patterns from high-dimensional data generated from experiments. The proposed methodology may be applied to the problem of learning, system identification, or data-driven discovery of partial differential equations. Our framework relies on Gaussian processes, a powerful tool for probabilistic inference over functions, that enables us to strike a balance between model complexity and data fitting. The effectiveness of the proposed approach is demonstrated through a variety of canonical problems, spanning a number of scientific domains, including the Navier-Stokes, Schrödinger, Kuramoto-Sivashinsky, and time-dependent linear fractional equations. The methodology provides a promising new direction for harnessing the long-standing developments of classical methods in applied mathematics and mathematical physics to design learning machines with the ability to operate in complex domains without requiring large quantities of data. (c) 2017 Elsevier Inc. All rights reserved.
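To convey the flavor of Gaussian-process-based, data-driven discovery of PDE parameters described in the abstract, the sketch below illustrates a minimal toy version of the idea; it is not the authors' formulation (which places the GP prior jointly across snapshots and infers kernel hyperparameters and PDE parameters by maximizing the marginal likelihood). Here, a GP with an RBF kernel is fit to one snapshot of heat-equation data, its posterior mean is differentiated analytically to obtain u_xx, and the diffusion coefficient is recovered by least squares against a finite-difference estimate of u_t. All grid sizes and hyperparameters are illustrative assumptions.

```python
# Minimal sketch (not the paper's exact method): estimate the diffusion
# coefficient nu in the heat equation u_t = nu * u_xx from two closely
# spaced snapshots, using a Gaussian process fit in space.
import numpy as np

nu_true = 0.1                                   # coefficient to recover
u_exact = lambda t, x: np.exp(-nu_true * np.pi**2 * t) * np.sin(np.pi * x)

# "Small data": two snapshots on a coarse spatial grid.
x  = np.linspace(0.0, 1.0, 25)
t1, dt = 0.1, 1e-3
u1 = u_exact(t1, x)
u2 = u_exact(t1 + dt, x)
u_t = (u2 - u1) / dt                            # finite-difference time derivative

# RBF kernel and its second derivative with respect to the first argument.
ell, jitter = 0.2, 1e-6                         # fixed by hand for brevity
def k(xa, xb):
    d = xa[:, None] - xb[None, :]
    return np.exp(-0.5 * (d / ell)**2)

def k_xx(xa, xb):
    d = xa[:, None] - xb[None, :]
    return (d**2 / ell**4 - 1.0 / ell**2) * np.exp(-0.5 * (d / ell)**2)

# GP regression on the first snapshot: posterior mean m(x*) = sum_i alpha_i k(x*, x_i),
# so its second derivative is u_xx(x*) = sum_i alpha_i d^2k/dx*^2 (x*, x_i).
alpha = np.linalg.solve(k(x, x) + jitter * np.eye(len(x)), u1)
u_xx  = k_xx(x, x) @ alpha

# Least-squares estimate of nu from u_t ≈ nu * u_xx.
nu_est = (u_xx @ u_t) / (u_xx @ u_xx)
print(f"estimated nu = {nu_est:.4f} (true value {nu_true})")
```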

【 License 】

Free   

【 Preview 】
Attachment list
File                              Size     Format
10_1016_j_jcp_2017_11_039.pdf     3223 KB  PDF
Metrics
Downloads: 4   Views: 1