NeurIPS 2020

Learning Diverse and Discriminative Representations via the Principle of Maximal Coding Rate Reduction


Meta Review

The paper proposes a method for representation learning that aims to maximize inter-class incoherence by embedding the data of different classes into orthogonal subspaces. The reviewers recognize that this is an interesting idea, framed in a principled information-theoretic way, and that the empirical results are promising; however, there are issues with the clarity of the presentation and with the connection to previous work. There is also a concern that the theory does not account for properties of the feature extractor, which are crucial for the properties of the embedding space. Given the promising empirical results, my recommendation is a weak accept.
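
For context, the objective underlying the reviewed paper (maximal coding rate reduction, MCR^2) can be sketched as follows; this is a summary of the paper's stated principle rather than part of the review, with Z denoting the matrix of learned features, Pi = {Pi_j} the diagonal class-membership matrices, and epsilon a prescribed coding precision:

\Delta R(Z, \varepsilon \mid \Pi) \;=\; R(Z, \varepsilon) \;-\; R_c(Z, \varepsilon \mid \Pi),

where

R(Z, \varepsilon) = \tfrac{1}{2}\log\det\!\Big(I + \tfrac{d}{n\varepsilon^{2}}\, Z Z^{\top}\Big),
\qquad
R_c(Z, \varepsilon \mid \Pi) = \sum_{j=1}^{k} \frac{\operatorname{tr}(\Pi_j)}{2n}\,\log\det\!\Big(I + \tfrac{d}{\operatorname{tr}(\Pi_j)\,\varepsilon^{2}}\, Z \Pi_j Z^{\top}\Big).

Maximizing \Delta R over normalized features expands the volume of the overall representation while compressing each class, which is what drives features of different classes toward (near-)orthogonal subspaces, as described above.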