How to Choose an Activation Function

Part of Advances in Neural Information Processing Systems 6 (NIPS 1993)



H. N. Mhaskar, C. A. Micchelli


We study the complexity problem in artificial feedforward neural networks designed to approximate real-valued functions of several real variables; that is, we estimate the number of neurons a network requires to guarantee a given degree of approximation for every function in a given function class. We show how to construct networks with this number of neurons that evaluate standard activation functions. Our general theorem shows that the smoother the activation function, the better the rate of approximation.
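To make the setting concrete, the sketch below fits a one-hidden-layer network of the form sum_k c_k sigma(w_k x + b_k) to a smooth target and reports the sup-norm error as the neuron count grows. This is only an illustration of the approximation problem the paper studies, not the paper's explicit construction: here the inner weights are drawn at random and only the outer coefficients are fit by least squares.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_network(f, sigma, n_neurons, x):
    """Fit sum_k c_k * sigma(w_k * x + b_k) to f on the sample points x.

    Random inner weights with least-squares outer coefficients -- a
    random-feature sketch, not the constructive networks of the paper.
    """
    w = rng.uniform(-5.0, 5.0, n_neurons)
    b = rng.uniform(-5.0, 5.0, n_neurons)
    phi = sigma(np.outer(x, w) + b)              # design matrix, (len(x), n)
    c, *_ = np.linalg.lstsq(phi, f(x), rcond=None)
    return phi @ c

x = np.linspace(-1.0, 1.0, 400)
f = lambda t: np.sin(3.0 * t)                    # smooth target function

for n in (5, 20, 80):
    err = np.max(np.abs(fit_network(f, np.tanh, n, x) - f(x)))
    print(f"n = {n:3d}   sup-norm error = {err:.2e}")
```

With a smooth activation such as tanh, the measured sup-norm error typically drops rapidly as n grows, in the spirit of the paper's result that smoother activations yield better approximation rates.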