Title
Statistical Regeneration Guarantees of the Wasserstein Autoencoder with Latent Space Consistency
Document Type
Conference Article
Publication Title
NeurIPS
Abstract
The introduction of Variational Autoencoders (VAEs) marked a breakthrough in the history of representation learning models. Beyond its own merits, the VAE has spurred a series of inventions in the form of its immediate successors. The Wasserstein Autoencoder (WAE), an heir to that lineage, inherits these strengths while offering generative performance that rivals even Generative Adversarial Networks (GANs). Recent years have witnessed a remarkable surge in statistical analyses of GANs. Similar examinations of autoencoders, however, despite their diverse applicability and notable empirical performance, remain largely absent. To close this gap, in this paper we investigate the statistical properties of the WAE. First, we provide statistical guarantees that the WAE achieves the target distribution in the latent space, utilizing Vapnik–Chervonenkis (VC) theory. The main result consequently ensures regeneration of the input distribution, harnessing the machinery of optimal transport of measures under the Wasserstein metric. This study, in turn, sheds light on the class of distributions the WAE can reconstruct after compression into a latent law.
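For context only (the objective is not reproduced in this record; the sketch below assumes the standard WAE formulation), the latent-matching and regeneration statements in the abstract refer to an encoder Q and decoder G trained to minimize

  \min_{Q} \; \mathbb{E}_{X \sim P_X} \, \mathbb{E}_{Z \sim Q(Z \mid X)} \big[ c\big(X, G(Z)\big) \big] \; + \; \lambda \, \mathcal{D}_Z\big(Q_Z, P_Z\big),

where c is a reconstruction cost, Q_Z is the aggregated latent law obtained by pushing the data distribution P_X through Q, P_Z is the target prior, and \mathcal{D}_Z is a divergence penalizing their mismatch. "Latent space consistency" in the title refers to Q_Z matching P_Z; the regeneration guarantee then concerns the closeness, in the Wasserstein sense, of the input distribution and its reconstruction.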
First Page
1
Last Page
13
Publication Date
12-2021
Recommended Citation
Chakrabarty, Anish and Das, Swagatam, "Statistical Regeneration Guarantees of the Wasserstein Autoencoder with Latent Space Consistency" (2021). ISI Best Publications. 27.
https://digitalcommons.isical.ac.in/ibp/27