NeurIPS 2019
Sun Dec 8th through Sat the 14th, 2019 at Vancouver Convention Center
Paper ID: 5398
Title: KNG: The K-Norm Gradient Mechanism

Reviewer 1

I find the results of this paper interesting, both in the theoretical bounds they make achievable and in the practical applicability of the proposed mechanism to several classical statistical problems. Even though the main result (Theorem 3.2) is rather simple, it has important consequences both theoretically and experimentally, as the rest of the paper shows. It is also interesting that the considered class of functions allows one to go beyond what the general exponential mechanism can achieve. Finally, it is interesting to see the relation between this method and objective perturbation.

One aspect of this work that I find questionable is the claim that it introduces a new mechanism. The proposed mechanism can be seen as a particular instance of the exponential mechanism with a weight function based on the gradient of the objective function, rather than a new mechanism per se (see the sketch after this review).

Pros:
- The paper identifies an interesting class of weighting functions for the exponential mechanism.
- The proposed method can be applied to a fairly large class of examples.

Cons:
- Calling the proposed mechanism a "new" mechanism is questionable.

Comments after rebuttal
----------------------------
Thanks for your answer. Concerning the "new" mechanism, I don't want to open up a controversy, since this is really a minor point, but some of the papers you mention, e.g. the K-norm paper, acknowledge that their mechanisms are instances of the exponential mechanism when they introduce them. I am not claiming that you should not give it a new name, and I don't think that your contribution would be weaker for acknowledging this. More importantly, I encourage you to include in the paper the more technical discussion of the comparison with the exponential mechanism and the lower bound by Awan et al.
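For concreteness, here is my reading of why KNG is an instance of the exponential mechanism, in roughly the paper's notation: \(\ell_D\) is the empirical loss on dataset \(D\), \(\varepsilon\) the privacy budget, \(\Delta\) the relevant sensitivity, and \(\|\cdot\|_K\) the chosen norm. Treat this as my paraphrase, not the authors' exact statement:

% Exponential mechanism vs. KNG: both sample theta from a density of the
% form exp(-utility), differing only in the choice of utility function.
\[
f_{\mathrm{EM}}(\theta) \;\propto\; \exp\!\Big(-\tfrac{\varepsilon}{2\Delta}\,\ell_D(\theta)\Big),
\qquad
f_{\mathrm{KNG}}(\theta) \;\propto\; \exp\!\Big(-\tfrac{\varepsilon}{2\Delta}\,\big\|\nabla_\theta\,\ell_D(\theta)\big\|_K\Big),
\]

i.e., KNG is the exponential mechanism run with the utility \(-\|\nabla_\theta\,\ell_D(\theta)\|_K\) in place of \(-\ell_D(\theta)\).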

Reviewer 2

This work presents and motivates the KNG mechanism, which stands as something of a hybrid of the exponential mechanism and objective perturbation. It relies on assumptions about the task, but on much simpler ones than objective perturbation requires. The proposed method is also grounded in the existing K-norm mechanism.

Originality: The proposed KNG and its theoretical properties are novel.
Quality: This work is exceptionally complete, with theory, empirical studies, case-study applications, and relations to other approaches.
Clarity: The work is clear and well structured.
Significance: The work provides a significant algorithm with compelling asymptotic behavior on several tasks, without requiring assumptions as onerous as those of other methods.

Reviewer 3

I thank the authors for shedding some light on the separation between the exponential mechanism and the proposed KNG mechanism, based on the lower bounds from Awan et al. (ICML 2019). Please include a proper discussion of this topic in the updated version.

-------

The paper introduces the K-norm gradient (KNG) mechanism as a way of achieving differential privacy. It is similar in spirit to the exponential mechanism, the key difference being that the loss function in the exponential sampling is replaced by an appropriate norm of the gradient of the loss function. The authors analyze the asymptotic error of the KNG mechanism under some assumptions on the loss function and argue that it can be better than the asymptotic error of the exponential mechanism in some cases. The paper also discusses specific instantiations of the KNG mechanism on statistical problems such as mean estimation, linear regression, and quantile estimation/regression (the mean-estimation case is sketched below).

The results look correct and are well presented. The paper, however, does not seem to have a standout technical idea or contribution. The results feel incremental and, for a theoretical paper, lack the technical novelty expected at NeurIPS.
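For concreteness, here is a sketch of the mean-estimation instantiation as I understand it (my own derivation, with hypothetical notation \(x_1,\dots,x_n\) for the data and \(\bar{x}\) for their mean; the constants may differ from the paper's):

% Squared-error loss: the gradient collapses to a single term in
% (theta - xbar), so the KNG density is a K-norm density at the mean.
\[
\ell_D(\theta) \;=\; \tfrac{1}{2}\sum_{i=1}^n \|\theta - x_i\|^2
\quad\Longrightarrow\quad
\nabla_\theta\,\ell_D(\theta) \;=\; \sum_{i=1}^n (\theta - x_i) \;=\; n(\theta - \bar{x}),
\]
\[
f_{\mathrm{KNG}}(\theta) \;\propto\; \exp\!\Big(-\tfrac{\varepsilon n}{2\Delta}\,\|\theta - \bar{x}\|_K\Big),
\]

so the mechanism reduces to K-norm-style noise centered at the empirical mean, with noise scale shrinking at rate \(1/n\).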