Skeletonization: A Technique for Trimming the Fat from a Network via Relevance Assessment

Part of Advances in Neural Information Processing Systems 1 (NIPS 1988)



Michael C. Mozer, Paul Smolensky


This paper proposes a means of using the knowledge in a network to determine the functionality or relevance of individual units, both for the purpose of understanding the network's behavior and of improving its performance. The basic idea is to iteratively train the network to a certain performance criterion, compute a measure of relevance that identifies which input or hidden units are most critical to performance, and automatically trim the least relevant units. This skeletonization technique can be used to simplify networks by eliminating units that convey redundant information; to improve learning performance by first learning with spare hidden units and then trimming the unnecessary ones away, thereby constraining generalization; and to understand the behavior of networks in terms of minimal "rules."
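The train-measure-trim loop described above can be sketched in a few lines. This is a minimal illustration, not the paper's method: the paper approximates relevance via the derivative of the error with respect to an attentional gating coefficient, whereas the sketch below uses brute-force ablation (error with the unit removed minus error with it present) on a fixed one-hidden-layer network. All function and variable names here are invented for illustration.

```python
import numpy as np

def forward(x, W1, W2, mask):
    """One-hidden-layer net; `mask` gates hidden units (1 = active, 0 = trimmed)."""
    h = np.tanh(x @ W1) * mask   # gated hidden activations
    return h @ W2                # linear output layer

def relevance(X, y, W1, W2, mask):
    """Relevance of each active hidden unit: the increase in mean squared error
    when that unit is ablated (a brute-force stand-in for the paper's
    derivative-based approximation)."""
    base = np.mean((forward(X, W1, W2, mask) - y) ** 2)
    rho = np.zeros(mask.size)
    for i in np.flatnonzero(mask):
        m = mask.copy()
        m[i] = 0.0
        rho[i] = np.mean((forward(X, W1, W2, m) - y) ** 2) - base
    return rho

def skeletonize(X, y, W1, W2, n_keep):
    """Iteratively trim the least relevant hidden unit until n_keep remain.
    (The full procedure would retrain to criterion between trims.)"""
    mask = np.ones(W1.shape[1])
    while int(mask.sum()) > n_keep:
        rho = relevance(X, y, W1, W2, mask)
        active = np.flatnonzero(mask)
        mask[active[np.argmin(rho[active])]] = 0.0
    return mask
```

On a network where some hidden units carry no signal (their outgoing weights are zero), ablating them leaves the error unchanged, so they receive the lowest relevance and are trimmed first.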