NeurIPS 2020

Efficient Learning of Generative Models via Finite-Difference Score Matching

Meta Review

The authors reformulate denoising score matching and sliced score matching in terms of directional derivatives (1st and 2nd order, respectively) and show that estimating these derivatives with finite differences (FD) leads to faster training and lower memory usage while yielding results comparable to using exact derivatives. The paper also introduces a general approach to estimating directional derivatives of any order using FDs of the function. The reviewers found the paper interesting and very well written. They also appreciated the extensive evaluation of the algorithms on a range of different generative models, as well as the clarifications provided in the rebuttal.

There were, however, some concerns about the novelty and the necessity of the developed FD estimation approach for directional derivatives. The authors are encouraged to perform a more extensive literature search, and to explain why applying the standard FD machinery to the definition d/dv L(x) = lim_{h -> 0} [L(x + hv) - L(x - hv)] / (2h) (and its higher-order generalizations) is insufficient to achieve the aims of the paper.
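As a minimal sketch of the "standard FD machinery" the review refers to (not the authors' own implementation): the central-difference formula above, together with its second-order analogue, can be checked numerically on a simple test function. The function L and the step size h below are illustrative choices, not taken from the paper.

```python
import numpy as np

def directional_derivative_fd(L, x, v, h=1e-4):
    """Central FD estimate of the 1st directional derivative d/dv L(x):
    (L(x + h v) - L(x - h v)) / (2 h), with O(h^2) error."""
    return (L(x + h * v) - L(x - h * v)) / (2.0 * h)

def second_directional_derivative_fd(L, x, v, h=1e-4):
    """Central FD estimate of the 2nd directional derivative d^2/dv^2 L(x):
    (L(x + h v) - 2 L(x) + L(x - h v)) / h^2, with O(h^2) error."""
    return (L(x + h * v) - 2.0 * L(x) + L(x - h * v)) / (h * h)

# Illustrative test function: L(x) = ||x||^2 / 2.
# Its exact 1st directional derivative is <x, v>; the exact 2nd is <v, v>.
L = lambda x: 0.5 * np.dot(x, x)
x = np.array([1.0, -2.0, 3.0])
v = np.array([0.0, 1.0, 0.0])

fd1 = directional_derivative_fd(L, x, v)         # approx <x, v> = -2.0
fd2 = second_directional_derivative_fd(L, x, v)  # approx <v, v> =  1.0
```

Note that each estimate needs only two or three function evaluations, regardless of the dimension of x, which is the source of the speed and memory savings the paper reports.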