Sanjay Biswas, Santosh Venkatesh
Robustness is a commonly bruited property of neural networks; in particular, a folk theorem in neural computation asserts that neural networks, in contexts with large interconnectivity, continue to function efficiently, albeit with some degradation, in the presence of component damage or loss. A second folk theorem in such contexts asserts that dense interconnectivity between neural elements is a sine qua non for the efficient usage of resources. These premises are formally examined in this communication in a setting that invokes the notion of the "devil"¹ in the network as an agent that produces sparsity by snipping connections.
1 ON REMOVING THE FOLK FROM THE THEOREM
Robustness in the presence of component damage is a property that is commonly attributed to neural networks. The content of the following statement embodies this sentiment.
Folk Theorem 1: Computation in neural networks is not substantially affected by damage to network components.
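As a minimal numerical sketch of this folk theorem, one can store patterns in a Hopfield-style outer-product memory, let a "devil" snip a random fraction of the connections, and check how well a stored pattern is still recalled. The network size, pattern count, corruption level, and snipping rates below are illustrative assumptions, not parameters taken from this communication.

import numpy as np

rng = np.random.default_rng(0)

n, m = 200, 10                       # n neurons, m stored patterns (assumed values)
patterns = rng.choice([-1, 1], size=(m, n))

# Hebbian (outer-product) weights with zero self-connections
W = patterns.T @ patterns / n
np.fill_diagonal(W, 0.0)

def snip(W, keep_prob, rng):
    # The "devil": delete each connection independently with probability 1 - keep_prob,
    # keeping the weight matrix symmetric.
    mask = rng.random(W.shape) < keep_prob
    mask = np.triu(mask, 1)
    mask = mask | mask.T
    return W * mask

def recall(W, probe, steps=20):
    # Synchronous sign-threshold updates starting from a noisy probe.
    s = probe.copy().astype(float)
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1
    return s

for keep in (1.0, 0.5, 0.2, 0.05):
    Ws = snip(W, keep, np.random.default_rng(1))
    # Corrupt 10% of the bits of pattern 0 and attempt retrieval.
    probe = patterns[0] * np.where(rng.random(n) < 0.1, -1, 1)
    overlap = recall(Ws, probe) @ patterns[0] / n
    print(f"fraction of connections kept = {keep:.2f}, recall overlap = {overlap:+.2f}")

Under such assumptions, recall typically degrades gracefully as connections are removed, which is precisely the gradual (rather than catastrophic) failure mode the folk theorem asserts.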
While such a statement is manifestly not true in general (witness networks with "grandmother cells", where damage to the critical cells fatally impairs the computational ability of the network), there is anecdotal evidence in support of it in
¹ Well, maybe an imp.