Sunday, December 8 through Saturday, December 14, 2019, at the Vancouver Convention Center
This work considers similarity-preserving objective functions for learning to classify inputs with a temporal dimension. The authors propose a modification of the Lie algebra formulation of Ruderman and Rao, in which the algorithm maximizes the similarity of the transformations of inputs that are nearby in time, rather than directly comparing inputs at the same time. While the scores given were worthy of acceptance, the enthusiasm of the reviewers, both in the body of the reviews and in the discussion, was somewhat muted. My impression is that there were two main reasons for this: (a) the difference between the proposed approach and competing accounts (e.g., in Salazar-Gatzimas et al.) is not explained sufficiently, making it difficult to assess novelty (although this has been addressed in the rebuttal, and that material should be moved into the paper itself); and (b) the extent to which this model accounts for natural data better than other models (as indicated by a pure goodness-of-fit measure or prediction accuracy, rather than robustness to noise or theoretical arguments) is unclear. Thus, while I see no reason to contradict the reviewers' recommendation that the paper be accepted, I expect the authors to address these points (and the clarity of the paper in general) in the camera-ready version of the paper.