NeurIPS 2019
Sun Dec 8th through Sat the 14th, 2019 at Vancouver Convention Center
Paper ID:1958
Title:LCA: Loss Change Allocation for Neural Network Training

The reviewers disagree about this paper, but they share an appreciation for this line of study and, specifically, for the proposed loss change allocation (LCA) metric. Since much about the training process of DNNs remains "mysterious", developing new and better "lenses" through which to examine the inner workings of a DNN can be of great value to the field. The criticism in the less enthusiastic reviews largely amounts to a request for more effort: comparisons to other approaches, additional experiments, clarifications and improvements, and making the method more actionable. One can also give that a positive spin: there is a lot of interesting follow-up work to be done here. I want to give the paper the benefit of the doubt and encourage the authors to prepare a revision that addresses the holes spotted by the reviewers; the rebuttal is already a great step in that direction.