Fluids
Data-Targeted Prior Distribution for Variational AutoEncoder
David Ryckelynck [1], Thomas Daniel [2], Nissrine Akkari [2], Fabien Casenave [2]
[1] Centre des Matériaux (CMAT), CNRS UMR 7633, Mines ParisTech, PSL University, BP 87, 91003 Evry, France; [2] Safran Tech, Digital Sciences and Technologies Department, Rue des Jeunes Bois, Châteaufort, 78114 Magny-Les-Hameaux, France
Keywords: kernel proper orthogonal decomposition (KPOD); variational autoencoder (VAE); inference model; prior distribution; Gaussian kernel; unsteady and compressible fluid flows
DOI: 10.3390/fluids6100343
Source: DOAJ
【 Abstract 】
Bayesian methods are studied in this paper using deep neural networks. We are interested in variational autoencoders, in which an encoder approximates the true posterior distribution and a decoder approximates the conditional likelihood. We apply these autoencoders to unsteady, compressible fluid flows in aircraft engines. Inference methods are used to compute a sharp approximation of the posterior probability of the latent variables given the transient dynamics of the training velocity fields, and to generate plausible velocity fields. An important application is the initialization of transient numerical simulations of unsteady fluid flows and of large eddy simulations in fluid dynamics. By Bayes' theorem, the posterior probability is proportional to the product of the likelihood and the prior probability, so the choice of the prior distribution strongly affects the computed posterior. Hence, we propose a new inference model based on a prior defined by a density estimate over the realizations of the kernel proper orthogonal decomposition (KPOD) coefficients of the available training data. We show numerically that this inference model improves on the results obtained with the usual standard normal prior distribution. The model is constructed using a new algorithm that improves the convergence of the parametric optimization of the encoder probability distribution approximating the posterior. This latter distribution is thus data-targeted, like the prior. The new generative approach can also be seen as an improvement of the kernel proper orthogonal decomposition method itself, for which there is usually no robust technique for expressing, in the input physical space, the pre-image of a stochastic reduced field defined in the high-dimensional feature space induced by the kernel inner product.
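The key ingredient described above, a data-targeted prior built from a density estimate over KPOD coefficients, can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: it uses synthetic stand-in snapshots, an assumed Gaussian-kernel bandwidth, and a generic Gaussian kernel density estimate (`scipy.stats.gaussian_kde`) as the density estimator; the paper's actual training data, kernel parameters, and estimator may differ.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Synthetic stand-in for training velocity-field snapshots
# (n_samples rows, n_dofs degrees of freedom per snapshot).
X = rng.normal(size=(50, 200))

# Gaussian kernel matrix K_ij = exp(-gamma * ||x_i - x_j||^2);
# the bandwidth gamma is an illustrative choice, not the paper's value.
gamma = 1.0 / X.shape[1]
sq = np.sum(X**2, axis=1)
K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))

# Centre the kernel matrix in feature space.
n = K.shape[0]
J = np.eye(n) - np.ones((n, n)) / n
Kc = J @ K @ J

# KPOD: eigendecomposition of the centred kernel matrix,
# keeping the two leading modes for this sketch.
w, V = np.linalg.eigh(Kc)
idx = np.argsort(w)[::-1][:2]
w, V = w[idx], V[:, idx]
coeffs = V * np.sqrt(np.maximum(w, 0.0))  # KPOD coefficients of the training data

# Data-targeted prior: a kernel density estimate over the KPOD coefficients,
# which can then be sampled to generate plausible latent codes and evaluated
# in the KL term of the variational objective.
prior = gaussian_kde(coeffs.T)
samples = prior.resample(5, seed=1)   # 5 latent samples, shape (2, 5)
log_p = prior.logpdf(coeffs.T)        # log prior density at the training points
```

The same two quantities, samples from the prior and its log-density, are what a VAE training loop would need: samples to generate plausible fields, and the log-density to replace the standard normal prior term in the evidence lower bound.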
【 License 】
Unknown