NeurIPS 2020

On the Tightness of Semidefinite Relaxations for Certifying Robustness to Adversarial Examples


Meta Review

Thank you for your submission to NeurIPS. The reviewers broadly agree that the paper provides an interesting result: the SDP relaxation for ReLU networks is tight for networks with a single hidden layer. There was still some disagreement among the reviewers about how significant this result truly is, and the original reviewers also had fairly low confidence in their assessments. For this reason, I solicited an additional expert review after the rebuttal; the overall opinion remained the same, though the additional reviewer pointed out some missing related literature that, while it does not limit the applicability of the result, should definitely be discussed in the paper. Please address these comments in the camera-ready version.