NeurIPS 2019
Sun Dec 8 through Sat Dec 14, 2019, at the Vancouver Convention Center
This paper is a genuine borderline case. The work presents an algorithmic advance on an application and provides empirical evidence of the improved performance. Many aspects are very well done: the problem is well explained and well motivated, the prior work is clearly described and placed in proper context, and the conclusions are measured and humble. The work is solid, but it could be more solid; the results could be improved in significant and fairly obvious ways. In short, the paper provides significant evidence of an algorithmic advance, but not the best evidence. The new algorithm also only ties the previous best algorithm rather than beating it. So we have real science, but not the most exciting science nor the gold standard of evaluation. I would have thought that such a thing would be appropriate for a timely conference publication, but perhaps NeurIPS is now more archival than that.

The view expressed above is based on the three reviews, a brief discussion after the author response, and my own complete reading of the paper (after which I was independently leaning toward accept). The three reviews do not differ greatly in their views of the paper. The most negative review was a borderline reject, on the grounds that the work was somewhat incremental. The middle review was nominally a borderline reject as well, but that reviewer indicated they would move to accept given the expected revision. The most positive review was a clear accept, and that reviewer argued that the advance was more than incremental; however, this reviewer did not make a strong positive case overall for acceptance. So this paper is indeed a borderline case.

After further consideration by the Program Chairs, and given the AC's own leaning toward acceptance, an Accept (Poster) recommendation was settled on. [This meta-review was reviewed and revised by the Program Chairs.]