NeurIPS 2020

SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows


Meta Review

The authors introduce a novel conceptual framework that unifies normalizing flows and VAEs and subsumes many other existing models and modules, such as augmented flows and variational dequantization. The framework involves thinking about generative models in terms of the type of mapping they use to go from the observations to the latents and vice versa. This turns out to be fruitful because it immediately makes apparent the gap between flows, which use deterministic mappings in both directions, and VAEs, which use stochastic mappings. The authors fill this gap by introducing surjective models/components, which are deterministic in one direction and stochastic in the other, and proceed to derive several instances of these, e.g. max and sort surjections.

The reviewers found the paper insightful and praised the quality of the exposition. They thought the experiments were sufficient to demonstrate that the proposed components work, even if they did not always lead to an improvement over existing architectures.

By describing probabilistic modules in terms of their forward and backward mappings as well as their contributions to the log-likelihood (or the ELBO), the paper provides a clear template for implementations that enables easy composability. This could be an influential contribution. Similarly, the idea of surjective components, along with the provided examples, might spur the derivation of more such modules, expanding the range of easy-to-use primitives for probabilistic modeling.
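
To make the "template" mentioned above concrete, the sketch below shows one way such an interface could look: each module exposes a forward map, an inverse map, and an additive contribution to the log-likelihood (or ELBO). This is a minimal, hypothetical illustration and not the authors' released code; the class names, the uniform-sign assumption in the abs surjection, and the composition helpers are assumptions made for the example.

```python
# Minimal sketch of the module template: every component exposes a forward
# mapping, an inverse mapping, and its additive likelihood contribution.
# Names and assumptions here are illustrative, not taken from the paper's code.
import numpy as np


class AffineBijection:
    """Deterministic in both directions; contributes an exact log|det J|."""

    def __init__(self, scale, shift):
        self.scale, self.shift = float(scale), float(shift)

    def forward(self, x):
        z = self.scale * x + self.shift
        ldj = x.size * np.log(abs(self.scale))  # scalar scale assumed
        return z, ldj

    def inverse(self, z):
        return (z - self.shift) / self.scale


class AbsSurjection:
    """Deterministic forward (x -> |x|), stochastic inverse (resample the sign).

    Assuming a uniform sign model, the likelihood contribution is log(1/2)
    per dimension; the base density must then be supported on z >= 0.
    """

    def forward(self, x):
        return np.abs(x), x.size * np.log(0.5)

    def inverse(self, z):
        sign = np.random.choice([-1.0, 1.0], size=z.shape)
        return sign * z


def log_prob(modules, x, base_log_prob):
    """Compose modules: push x through every forward map and sum contributions."""
    z, total = x, 0.0
    for m in modules:
        z, c = m.forward(z)
        total += c
    return total + base_log_prob(z)


def sample(modules, z):
    """Generate by running the inverse maps in reverse order."""
    for m in reversed(modules):
        z = m.inverse(z)
    return z
```

For instance, composing an AffineBijection with an AbsSurjection over a half-normal base density yields a correctly normalized model, and a new surjection can be dropped in simply by implementing the same three-part interface.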