NeurIPS 2020

Optimal Approximation - Smoothness Tradeoffs for Soft-Max Functions

Meta Review

This paper studies trade-offs between approximation quality and smoothness of "soft"-max functions. The natural exponential function achieves the optimal tradeoff between expected additive approximation and smoothness when smoothness is measured with respect to Rényi divergence, but is suboptimal when smoothness is measured via L_p norms. The authors present a new piecewise-linear soft-max function that is optimal for L_p norms. The paper also discusses several applications of the new functions to mechanism design, submodular optimization, and deep learning. The reviewers found this to be a well-written and thorough paper on an important problem of broad interest in machine learning. I recommend this paper for acceptance.
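As a brief illustration of the exponential soft-max discussed above, the sketch below (my own hedged example, not code from the paper) implements the standard log-sum-exp soft-max at inverse temperature `lam` and checks the classical additive approximation bound max(x) ≤ f(x) ≤ max(x) + ln(n)/lam; the function name and parameters are illustrative, not taken from the paper.

```python
import math

def exponential_softmax(xs, lam):
    # Log-sum-exp soft-max: f(x) = (1/lam) * log(sum_i exp(lam * x_i)).
    # Subtracting the max before exponentiating keeps the computation
    # numerically stable without changing the result.
    m = max(xs)
    return m + math.log(sum(math.exp(lam * (x - m)) for x in xs)) / lam

xs = [0.2, 1.5, 1.4, -0.3]
lam = 10.0
val = exponential_softmax(xs, lam)

# Standard additive bound: larger lam tightens the approximation to max(x)
# at the cost of a less smooth function.
assert max(xs) <= val <= max(xs) + math.log(len(xs)) / lam
```

Increasing `lam` trades smoothness for approximation quality, which is exactly the tradeoff the paper quantifies.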