NeurIPS 2020

NanoFlow: Scalable Normalizing Flows with Sublinear Parameter Complexity


Meta Review

The paper investigates ways to reduce the parameter footprint of high-performing normalizing flows for image and audio modelling. It points out that naively tying parameters across layers does not work well, and proposes a more principled way of doing so. The paper tackles an important problem: reducing the parameter footprint of large flow models can have significant benefits both in production (lowering the computational and memory cost of these models) and in research (making these models easier to train and experiment with). Since the paper's contribution is an engineering improvement motivated by intuitive arguments rather than a rigorous theoretical justification, the reviewers correctly point out that rigorous experimental validation is needed. The reviewers originally had concerns about the rigour and depth of the experimental evaluation, but the additional experiments presented in the rebuttal convinced them that the evaluation is sufficient. I strongly encourage the authors to take the reviewers' feedback to heart when revising the paper, in particular by strengthening the experimental evaluation, and to include the additional experimental results in the revised version.