Part of Advances in Neural Information Processing Systems 1 (NIPS 1988)
Bahram Nabet, Robert Darling, Robert Pinter
An extremely compact, all-analog, and fully parallel implementation of a class of shunting recurrent neural networks that is applicable to a wide variety of FET-based integration technologies is proposed. While the contrast enhancement, data compression, and adaptation-to-mean-input-intensity capabilities of the network are well suited to processing of sensory information or feature extraction for a content addressable memory (CAM) system, the network also admits a global Liapunov function and can thus achieve stable CAM storage itself. In addition, the model can readily function as a front-end processor to an analog adaptive resonance circuit.
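The normalization behavior described above can be illustrated with a minimal numerical sketch of a standard shunting network with lateral inhibition (the specific equation parameters `A`, `B` and the uniform inhibitory coupling are illustrative assumptions, not the circuit parameters from the paper). Each unit obeys dx_i/dt = -A x_i + (B - x_i) I_i - x_i Σ_{j≠i} I_j, whose steady state x_i = B I_i / (A + Σ_j I_j) compresses total activity below B while preserving input ratios, i.e., adaptation to mean input intensity:

```python
import numpy as np

def simulate_shunting(I, A=1.0, B=1.0, dt=0.01, steps=5000):
    """Euler integration of a shunting network with lateral inhibition:
       dx_i/dt = -A*x_i + (B - x_i)*I_i - x_i * sum_{j != i} I_j
    (Illustrative dynamics; parameters A, B are assumed, not from the paper.)"""
    x = np.zeros_like(I, dtype=float)
    off = I.sum() - I                      # total inhibitory input to each unit
    for _ in range(steps):
        x += dt * (-A * x + (B - x) * I - x * off)
    return x

def shunting_equilibrium(I, A=1.0, B=1.0):
    # Closed-form steady state: x_i = B*I_i / (A + sum_j I_j).
    # Total activity is bounded by B (data compression), and activity
    # ratios equal input ratios regardless of overall intensity.
    return B * I / (A + I.sum())

I = np.array([1.0, 2.0, 1.0])
x_dim = shunting_equilibrium(I)            # dim scene
x_bright = shunting_equilibrium(10 * I)    # same pattern, 10x mean intensity
# x_dim / x_dim.sum() == x_bright / x_bright.sum(): pattern is intensity-invariant
```

The simulated trajectory converges to the closed-form equilibrium, and rescaling all inputs by a common factor leaves the normalized activity pattern unchanged, which is the adaptation-to-mean-intensity property the abstract refers to.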