Part of Advances in Neural Information Processing Systems 33 (NeurIPS 2020)
Hanjun Dai, Rishabh Singh, Bo Dai, Charles Sutton, Dale Schuurmans
Discrete structures play an important role in applications
such as programming language modeling and software engineering.
Current approaches to predicting complex structures typically rely on
autoregressive models for their tractability,
at some cost in flexibility.
Energy-based models (EBMs), on the other hand, offer a more flexible
and thus more powerful way to model such distributions,
but require estimating an intractable partition function.
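For context, a discrete EBM takes the standard form (the notation below is ours, not from the paper):

\[
p_\theta(x) = \frac{\exp(-E_\theta(x))}{Z_\theta},
\qquad
Z_\theta = \sum_{x' \in \mathcal{X}} \exp(-E_\theta(x')),
\]

where \(\mathcal{X}\) is the space of discrete structures. Because \(\mathcal{X}\) is exponentially large, computing \(Z_\theta\), and hence exact likelihoods and samples, is intractable.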
In this paper we propose ALOE, a new algorithm for learning conditional
and unconditional EBMs for discrete structured data,
where parameter gradients are estimated using a learned sampler
that mimics local search.
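To make the sampler's role concrete, here is a minimal Python sketch of local search over discrete structures; the toy energy, vocabulary, and uniform edit proposal are our own illustrative assumptions, not the learned sampler from the paper:

import random

VOCAB = list("abcdef")

def energy(x):
    # Toy energy: penalize adjacent symbols that differ,
    # so low-energy strings are runs of repeated characters.
    return sum(1.0 for a, b in zip(x, x[1:]) if a != b)

def init_sample(length=8):
    # Cheap base proposal: a uniform random string.
    return [random.choice(VOCAB) for _ in range(length)]

def local_edit(x):
    # One local-search move: resample a single position.
    # A learned sampler would parameterize this edit distribution.
    y = list(x)
    y[random.randrange(len(y))] = random.choice(VOCAB)
    return y

def local_search_sample(steps=50):
    # Energy-guided local search: accept edits that do not raise the energy.
    x = init_sample()
    for _ in range(steps):
        y = local_edit(x)
        if energy(y) <= energy(x):
            x = y
    return x

print("".join(local_search_sample()))

Samples produced this way stand in for the negative phase of the EBM likelihood gradient, which is otherwise intractable to compute exactly.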
We show that the energy function and sampler can be trained efficiently
via a new variational form of power iteration,
achieving a better trade-off between flexibility and tractability.
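For background, classical power iteration approximates a dominant eigenvector by repeatedly applying an operator and renormalizing, \(v_{t+1} = A v_t / \lVert A v_t \rVert\); the variational form referenced here presumably lifts an update of this kind to distributions over structures, though the abstract does not spell out the derivation.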
Experimentally, we show that learning local search leads to significant
improvements in challenging application domains.
Most notably, we present an energy-model-guided fuzzer for software testing
that achieves performance comparable
to well-engineered fuzzing engines such as libFuzzer.