IEEE Access
Variational Autoencoder With Optimizing Gaussian Mixture Model Priors
Huahua Chen [1], Jianwu Zhang [1], Jialuo Zhou [1], Chunsheng Guo [1], Na Ying [1], Di Zhou [2]
[1] Hangzhou Dianzi University, Hangzhou, China; [2] Zhejiang Uniview Technologies Company, Ltd., Hangzhou, China
Keywords: Variational autoencoder; Gaussian mixture model; Kullback-Leibler distance
DOI: 10.1109/ACCESS.2020.2977671
Source: DOAJ
【 Abstract 】
The latent-variable prior of the variational autoencoder (VAE) is usually chosen to be a standard Gaussian distribution because it is convenient to compute, but this simple prior tends to underfit the data. This paper proposes a variational autoencoder with optimizing Gaussian mixture model priors. The method constructs the prior distribution from a Gaussian mixture model and uses the Kullback-Leibler (KL) divergence between the posterior and the prior to iteratively optimize the prior on the data. Because the KL divergence between a Gaussian posterior and a Gaussian mixture prior has no closed form, a greedy algorithm is used to approximate it, which defines an approximate variational lower bound as the loss function and realizes the VAE with optimizing Gaussian mixture model priors. Compared with the standard VAE, the proposed method obtains state-of-the-art results on the MNIST, Omniglot, and Frey Face datasets, showing that the VAE with optimizing Gaussian mixture model priors can learn a better model.
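To make the construction concrete, the sketch below shows one plausible reading of this setup: a VAE whose prior is a learnable Gaussian mixture, trained by minimizing the negative evidence lower bound, where the intractable KL term KL(q(z|x) || p(z)) is estimated by single-sample Monte Carlo. This is not the authors' code; in particular, the paper approximates the KL term with a greedy variational solution rather than Monte Carlo, and all names and hyperparameters here (GMMPriorVAE, n_components=10, layer sizes) are illustrative assumptions.

import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class GMMPriorVAE(nn.Module):
    """VAE with a learnable Gaussian mixture prior over the latent z."""
    def __init__(self, x_dim=784, h_dim=400, z_dim=20, n_components=10):
        super().__init__()
        # Encoder q(z|x): Gaussian with diagonal covariance.
        self.enc = nn.Sequential(nn.Linear(x_dim, h_dim), nn.Tanh())
        self.enc_mu = nn.Linear(h_dim, z_dim)
        self.enc_logvar = nn.Linear(h_dim, z_dim)
        # Decoder p(x|z): Bernoulli likelihood over pixels.
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.Tanh(),
                                 nn.Linear(h_dim, x_dim))
        # Mixture parameters are optimized jointly with the network,
        # standing in for the paper's data-driven prior optimization.
        self.pi_logits = nn.Parameter(torch.zeros(n_components))
        self.mu_p = nn.Parameter(0.5 * torch.randn(n_components, z_dim))
        self.logvar_p = nn.Parameter(torch.zeros(n_components, z_dim))

    def log_prior(self, z):
        # log p(z) = logsumexp_k [log pi_k + log N(z; mu_k, sigma_k^2)]
        diff = z.unsqueeze(1) - self.mu_p                      # (B, K, z_dim)
        log_comp = -0.5 * (self.logvar_p + diff.pow(2) / self.logvar_p.exp()
                           + math.log(2 * math.pi)).sum(-1)    # (B, K)
        log_pi = F.log_softmax(self.pi_logits, dim=0)
        return torch.logsumexp(log_pi + log_comp, dim=1)       # (B,)

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.enc_mu(h), self.enc_logvar(h)
        z = mu + (0.5 * logvar).exp() * torch.randn_like(mu)   # reparameterization
        # Single-sample Monte Carlo estimate of KL(q(z|x) || p(z)),
        # in place of the paper's greedy variational approximation.
        log_q = -0.5 * (logvar + (z - mu).pow(2) / logvar.exp()
                        + math.log(2 * math.pi)).sum(-1)
        kl = log_q - self.log_prior(z)
        rec = F.binary_cross_entropy_with_logits(
            self.dec(z), x, reduction='none').sum(-1)
        return (rec + kl).mean()  # negative ELBO, to be minimized

With x_batch a (batch, 784) tensor of MNIST pixels in [0, 1], calling loss = GMMPriorVAE()(x_batch) followed by loss.backward() updates the encoder, decoder, and mixture prior jointly, so the prior adapts to the aggregate posterior rather than staying fixed at a standard Gaussian.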
【 License 】
Unknown