NeurIPS 2020
### Path Sample-Analytic Gradient Estimators for Stochastic Binary Networks

### Meta Review

The paper proposes a new gradient estimator (PSA) for stochastic binary networks, based on combining the local expectation gradients (LEG) idea with a linear approximation. PSA scales linearly in the network size, and thus is much faster than LEG. While the linearization step means that the PSA gradient is biased, the empirical analysis on a small network shows that for almost any reasonable computational budget PSA has the lowest overall RMSE in the gradient estimate among several biased and unbiased estimators.
The paper is well written and the proposed method is clearly described. The algorithm is simple and might become a popular alternative to the less accurate straight-through estimator. Overall, a solid contribution to the gradient estimation literature.
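For reference, the straight-through estimator mentioned above can be sketched as follows. This is a minimal NumPy illustration, not the paper's PSA method; the function name, the {0,1} binary parameterization, and the sigmoid-derivative backward scaling are illustrative assumptions.

```python
import numpy as np

def straight_through_grad(logits, loss_grad_fn, rng):
    """Straight-through gradient estimate for one stochastic binary layer.

    Forward: sample b ~ Bernoulli(sigmoid(logits)) in {0, 1}.
    Backward: pass dL/db through the non-differentiable sampling step
    as if it were smooth, scaled by the sigmoid derivative p*(1-p).
    """
    p = 1.0 / (1.0 + np.exp(-logits))            # firing probabilities
    b = (rng.random(p.shape) < p).astype(float)  # sampled binary activations
    g_b = loss_grad_fn(b)                        # dL/db at the sampled point
    # Straight-through: treat db/dlogits as the sigmoid derivative
    g_logits = g_b * p * (1.0 - p)
    return b, g_logits

# Hypothetical usage: for a loss that is linear in b, L(b) = w . b,
# the estimate coincides with the exact gradient of E[L]; for nonlinear
# losses it is biased, which motivates estimators such as PSA.
rng = np.random.default_rng(0)
logits = np.array([0.0, 2.0, -1.0])
w = np.array([1.0, -0.5, 2.0])
b, g = straight_through_grad(logits, lambda b: w, rng)
```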