Part of Advances in Neural Information Processing Systems 5 (NIPS 1992)
Adam Kowalczyk
Feed-forward networks with fixed hidden units (FHU-networks) are compared against the category of remaining feed-forward networks with variable hidden units (VHU-networks). Two broad classes of tasks on a finite domain X ⊂ R^n are considered: approximation of every function from an open subset of functions on X, and representation of every dichotomy of X. For the first task it is found that both network categories require the same minimal number of synaptic weights. For the second task, and for X in general position, it is shown that VHU-networks with threshold logic hidden units can have approximately 1/n times as many hidden units as any FHU-network must have.
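To make the second task concrete, the sketch below (not from the paper; the weights and the four-point domain are illustrative choices) shows a feed-forward network with threshold logic hidden units representing a dichotomy of a finite set X ⊂ R^2, here the XOR labelling of four points using two hand-chosen hidden units.

```python
# Illustrative sketch: a network with threshold logic hidden units
# representing a dichotomy of a finite domain X in R^2.
# All weights below are hand-picked for the XOR labelling of X.

def threshold(z):
    # Threshold logic unit: outputs 1 iff its activation is non-negative.
    return 1 if z >= 0 else 0

def net(x1, x2):
    # Two hidden threshold units feeding one threshold output unit.
    h1 = threshold(x1 + x2 - 0.5)   # fires on (0,1), (1,0), (1,1)
    h2 = threshold(x1 + x2 - 1.5)   # fires on (1,1) only
    return threshold(h1 - 2 * h2 - 0.5)

X = [(0, 0), (0, 1), (1, 0), (1, 1)]
print([net(*x) for x in X])  # [0, 1, 1, 0]: the XOR dichotomy of X
```

Each hidden unit realizes a half-space of R^2, and the output unit combines the two indicators to carve out the desired dichotomy of X.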