NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, Vancouver Convention Center
Paper ID: 7848
Title: Universal Approximation of Input-Output Maps by Temporal Convolutional Nets

The first contribution of this paper is Theorem 3.1, which shows that ReLU TCNs (temporal convolutional networks) can approximate any causal, time-invariant input-output map to arbitrary accuracy. Moreover, it provides quantitative bounds on the properties of the approximating network (context length, width, and depth) in terms of the continuity parameters of the map being approximated. The second contribution is Theorem 4.1, which gives incremental stability conditions on state-space maps (or recurrent models) under which the input-output maps they induce satisfy the assumptions of Theorem 3.1. The third contribution (Theorem 4.2) is of the same kind as Theorem 4.1, but under an exponential incremental stability condition together with a stability criterion from nonlinear systems theory known as the Demidovich criterion.

I agree with R1 that an additional strength of the paper lies in drawing connections between dynamical systems theory and recurrent models. R2 had concerns about the significance of Theorem 3.1 in light of [Hanin and Sellke, 2018]. However, in the discussion R1 pointed out that the simplicity of the proof is due to the authors' work in setting up the framework and definitions so that the conclusions follow easily. Such simplicity is deceptive and usually takes considerable effort to achieve. I agree with R1's sentiment.

The authors should revise the manuscript to make the changes the reviewers have suggested. In particular, all typos should be fixed before the final version is submitted.
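For context, a minimal sketch of the setting behind the theorems above, using standard definitions from the systems literature; the notation here is illustrative and not taken verbatim from the paper. An input-output map F sends input sequences u = (u_t) to output sequences. It is causal if the output at time t depends only on inputs up to time t, and time-invariant if it commutes with the time shift S:

\[
u_s = u'_s \ \text{ for all } s \le t \ \implies\ (Fu)_t = (Fu')_t \qquad \text{(causality)},
\]
\[
F \circ S = S \circ F \qquad \text{(time invariance)}.
\]

Theorem 3.1 then asserts, roughly, that for every such F satisfying the paper's continuity assumptions and every \varepsilon > 0, there is a ReLU TCN \hat{F} with finite context length, width, and depth, each bounded in terms of the continuity parameters of F, such that

\[
\sup_{u}\, \big| (Fu)_t - (\hat{F}u)_t \big| \le \varepsilon .
\]

For Theorems 4.1 and 4.2, the relevant notion is incremental stability of a state-space model x_{t+1} = f(x_t, u_t): writing \phi_t(x, u) for the state reached at time t from initial state x under input u, trajectories driven by the same input from different initial states must converge,

\[
\| \phi_t(x, u) - \phi_t(x', u) \| \to 0 \quad \text{as } t \to \infty,
\]

with an exponential rate required in the setting of Theorem 4.2.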