NeurIPS 2020

ISTA-NAS: Efficient and Consistent Neural Architecture Search by Sparse Coding

Meta Review

Four knowledgeable reviewers support acceptance based on the contributions. Reviewers find that: i) using sparse coding to bridge the gap issue in NAS is novel and promising, and the formulation and notation are neat; ii) the one-stage framework makes the overall method unified and also yields a performance improvement; iii) the experimental results are quite competitive; iv) the convergence speed is faster than that of previous methods; v) the paper is well organized and easy to understand. Therefore, I also recommend acceptance. However, please consider revising the paper to address all the concerns and comments from the reviewers. Specifically, R6 asked for clarification of the claim that Aj is sampled and fixed during training.