Part of Advances in Neural Information Processing Systems 35 (NeurIPS 2022) Main Conference Track
Dan Mikulincer, Daniel Reichman
Monotone functions and data sets arise in a variety of applications. We study the interpolation problem for monotone data sets: the input is a monotone data set with $n$ points, and the goal is to find a size- and depth-efficient monotone neural network with \emph{non-negative parameters} and threshold units that interpolates the data set. We show that there are monotone data sets that cannot be interpolated by a monotone network of depth $2$. On the other hand, we prove that for every monotone data set with $n$ points in $\mathbb{R}^d$, there exists an interpolating monotone network of depth $4$ and size $O(nd)$. Our interpolation result implies that every monotone function over $[0,1]^d$ can be approximated arbitrarily well by a depth-$4$ monotone network, improving the previous best-known construction of depth $d+1$. Finally, building on results from Boolean circuit complexity, we show that the inductive bias of having non-negative parameters can lead to a super-polynomial blow-up in the number of neurons when approximating monotone functions.
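As a concrete illustration of the objects in the abstract (and emphatically not the paper's depth-$4$ construction), here is a minimal NumPy sketch of the one-dimensional case, where a depth-$2$ monotone network with non-negative weights and threshold units already suffices to interpolate monotone data; the function name `interpolate_monotone_1d` and all implementation details are our own illustrative choices. The paper's depth-$2$ impossibility result concerns monotone data sets in higher dimension.

```python
import numpy as np

def interpolate_monotone_1d(xs, ys):
    """Toy illustration (not the paper's construction): for
    1-dimensional monotone data x_1 < ... < x_n with
    y_1 <= ... <= y_n, a depth-2 network with non-negative
    weights and threshold units interpolates via
        f(x) = y_1 + sum_i (y_{i+1} - y_i) * 1[x >= x_{i+1}].
    Monotonicity of f follows because each threshold unit is
    non-decreasing in x and all output weights are >= 0."""
    order = np.argsort(xs)
    xs = np.asarray(xs, dtype=float)[order]
    ys = np.asarray(ys, dtype=float)[order]
    assert np.all(np.diff(ys) >= 0), "data set must be monotone"

    def f(x):
        # Hidden layer: threshold units 1[x - x_{i+1} >= 0].
        steps = (x >= xs[1:]).astype(float)
        # Output: non-negative combination y_{i+1} - y_i >= 0.
        return ys[0] + np.dot(np.diff(ys), steps)

    return f

# Usage: f interpolates the data and is non-decreasing.
f = interpolate_monotone_1d([0.0, 0.3, 0.7, 1.0], [0.0, 0.2, 0.5, 1.0])
print([f(x) for x in [0.0, 0.3, 0.7, 1.0]])  # [0.0, 0.2, 0.5, 1.0]
```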