Part of Advances in Neural Information Processing Systems 32 (NeurIPS 2019)
Pavithra Prabhakar, Zahra Rahimi Afzal
In this paper, we consider the problem of output range analysis for feed-forward neural networks. Existing approaches reduce the problem to satisfiability or optimization problems, which are NP-hard and whose computational cost grows with the number of neurons in the network. We present a novel abstraction technique that constructs a simpler neural network with fewer neurons, albeit with interval weights, called an interval neural network (INN), which over-approximates the output range of the given neural network. We reduce output range analysis on INNs to solving a mixed-integer linear programming problem. Our experimental results highlight the trade-off between computation time and the precision of the computed output range.
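For illustration only (not taken from the paper), the sketch below shows one way to read the INN semantics: weights and biases are intervals rather than scalars, and propagating an input box through the network with interval arithmetic yields an over-approximation of the output range. The paper instead encodes the analysis on the INN as a mixed-integer linear program, which can give tighter bounds; the function names, the widening by a fixed margin, and the toy network here are all assumptions made for the example.

```python
import numpy as np

def interval_affine(W_lo, W_hi, b_lo, b_hi, x_lo, x_hi):
    """Bound W @ x + b when W lies in [W_lo, W_hi], b in [b_lo, b_hi], x in [x_lo, x_hi].
    Each product w_ij * x_j attains its extrema at a corner of its box, so we
    take per-term corner bounds and sum them."""
    cands = np.stack([W_lo * x_lo, W_lo * x_hi, W_hi * x_lo, W_hi * x_hi])
    lo = cands.min(axis=0).sum(axis=1) + b_lo
    hi = cands.max(axis=0).sum(axis=1) + b_hi
    return lo, hi

def inn_output_range(layers, x_lo, x_hi):
    """Propagate an input box through an interval neural network (INN).
    `layers` is a list of (W_lo, W_hi, b_lo, b_hi, activation) tuples."""
    lo, hi = np.asarray(x_lo, float), np.asarray(x_hi, float)
    for W_lo, W_hi, b_lo, b_hi, act in layers:
        lo, hi = interval_affine(W_lo, W_hi, b_lo, b_hi, lo, hi)
        if act == "relu":
            # ReLU is monotone, so applying it to the bounds is sound.
            lo, hi = np.maximum(lo, 0.0), np.maximum(hi, 0.0)
    return lo, hi

if __name__ == "__main__":
    # Toy INN: 2 inputs -> 2 interval hidden neurons (ReLU) -> 1 output.
    # Interval weights are obtained here by widening point weights by a margin,
    # purely for illustration of the INN semantics.
    W1 = np.array([[1.0, -0.5], [0.3, 0.8]])
    W2 = np.array([[1.0, 1.0]])
    layers = [
        (W1 - 0.1, W1 + 0.1, np.array([0.0, -0.2]), np.array([0.1, 0.0]), "relu"),
        (W2 - 0.05, W2 + 0.05, np.array([0.0]), np.array([0.0]), None),
    ]
    lo, hi = inn_output_range(layers, x_lo=[-1.0, -1.0], x_hi=[1.0, 1.0])
    print("over-approximate output range:", lo, hi)
```

Because every output of the original network is contained in the INN's output set, any range computed this way (or, more precisely, via the paper's MILP encoding) is a sound over-approximation, traded off against the looseness introduced by merging neurons into interval weights.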