Part of Advances in Neural Information Processing Systems 34 (NeurIPS 2021)
Frederik Schmitt, Christopher Hahn, Markus N. Rabe, Bernd Finkbeiner
We train hierarchical Transformers on the task of synthesizing hardware circuits directly from high-level logical specifications in linear-time temporal logic (LTL). The LTL synthesis problem is a well-known algorithmic challenge with a long history, and an annual competition is organized to track the improvement of algorithms and tooling over time. New approaches using machine learning could open many possibilities in this area, but they suffer from the lack of sufficient amounts of training data. In this paper, we present a method to generate large amounts of additional training data, i.e., pairs of specifications and circuits implementing them. We ensure that this synthetic data is sufficiently close to human-written specifications by mining common patterns from the specifications used in the synthesis competitions. We show that hierarchical Transformers trained on this synthetic data solve a significant portion of problems from the synthesis competitions, and even out-of-distribution examples from a recent case study.
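To make the data-generation idea concrete, the following is a minimal sketch of assembling synthetic LTL specifications as conjunctions of instantiated pattern templates. The templates and the `sample_spec` helper are illustrative assumptions, loosely inspired by common LTL specification patterns (response, mutual exclusion, fairness); they are not the patterns actually mined in the paper.

```python
import random

# Hypothetical pattern templates (assumed for illustration only):
PATTERNS = [
    "G ({a} -> F {b})",   # response: every request is eventually granted
    "G (! ({a} & {b}))",  # mutual exclusion between two signals
    "G F {a}",            # fairness: the signal holds infinitely often
]

def sample_spec(inputs, outputs, n_conjuncts, rng):
    """Assemble a synthetic specification as a conjunction of
    randomly instantiated pattern templates."""
    conjuncts = []
    for _ in range(n_conjuncts):
        template = rng.choice(PATTERNS)
        conjuncts.append(template.format(a=rng.choice(inputs),
                                         b=rng.choice(outputs)))
    return " & ".join("(" + c + ")" for c in conjuncts)

rng = random.Random(0)
spec = sample_spec(["r0", "r1"], ["g0", "g1"], 3, rng)
print(spec)
```

Each sampled specification would then be paired with a circuit implementing it (e.g., produced by a classical synthesis tool) to form one training example.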