How Neural Nets Work

Part of Neural Information Processing Systems 0 (NIPS 1987)


Authors

Alan Lapedes, Robert Farber

Abstract

There is presently great interest in the abilities of neural networks to mimic "qualitative reasoning" by manipulating neural encodings of symbols. Less work has been performed on using neural networks to process floating point numbers, and it is sometimes stated that neural networks are somehow inherently inaccurate and therefore best suited for "fuzzy" qualitative reasoning. Nevertheless, the potential speed of massively parallel operations makes neural net "number crunching" an interesting topic to explore. In this paper we discuss some of our work in which we demonstrate that for certain applications neural networks can achieve significantly higher numerical accuracy than more conventional techniques. In particular, prediction of future values of a chaotic time series can be performed with exceptionally high accuracy. We analyze how a neural net is able to do this, and in the process show that a large class of functions from R^n to R^m may be accurately approximated by a backpropagation neural net with just two "hidden" layers. The network uses this functional approximation to perform either interpolation (signal processing applications) or extrapolation (symbol processing applications). Neural nets therefore use quite familiar methods to perform their tasks. The geometrical viewpoint advocated here seems to be a useful approach to analyzing neural network operation and relates neural networks to well-studied topics in functional approximation.
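
To make the abstract's central claim concrete, the following is a minimal sketch, not the authors' original code: a two-hidden-layer backpropagation network trained to predict the next value of a chaotic time series. The choice of the logistic map as the chaotic series, the layer widths, the learning rate, and the iteration count are all illustrative assumptions.

    # Sketch: two-hidden-layer backprop net predicting a chaotic series.
    # Hyperparameters and the logistic map are assumptions, not the paper's setup.
    import numpy as np

    rng = np.random.default_rng(0)

    # Generate a chaotic time series from the logistic map x_{t+1} = 4 x_t (1 - x_t).
    x = np.empty(2000)
    x[0] = 0.3
    for t in range(1999):
        x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

    X, y = x[:-1, None], x[1:, None]   # input: current value; target: next value

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Two hidden layers of sigmoid units, linear output unit.
    h1, h2 = 10, 10
    W1 = rng.normal(0, 0.5, (1, h1));  b1 = np.zeros(h1)
    W2 = rng.normal(0, 0.5, (h1, h2)); b2 = np.zeros(h2)
    W3 = rng.normal(0, 0.5, (h2, 1));  b3 = np.zeros(1)

    lr = 0.5
    for epoch in range(10000):
        # Forward pass.
        a1 = sigmoid(X @ W1 + b1)
        a2 = sigmoid(a1 @ W2 + b2)
        pred = a2 @ W3 + b3
        err = pred - y
        # Backward pass for the loss 0.5 * mean(err^2).
        d3 = err / len(X)
        d2 = (d3 @ W3.T) * a2 * (1 - a2)
        d1 = (d2 @ W2.T) * a1 * (1 - a1)
        W3 -= lr * a2.T @ d3; b3 -= lr * d3.sum(0)
        W2 -= lr * a1.T @ d2; b2 -= lr * d2.sum(0)
        W1 -= lr * X.T @ d1;  b1 -= lr * d1.sum(0)

    print("RMS prediction error:", np.sqrt(np.mean(err ** 2)))

On this one-dimensional example the net is doing interpolation in the sense described above: from sample pairs it fits a smooth approximation to the map x -> 4x(1-x) and evaluates that approximation at new points, which is how such a network can reach high numerical accuracy on chaotic time series prediction.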