Induction of Finite-State Automata Using Second-Order Recurrent Networks

Part of Advances in Neural Information Processing Systems 4 (NIPS 1991)


Authors

Raymond Watrous, Gary Kuhn

Abstract

Second-order recurrent networks that recognize simple finite state languages over {0,1}* are induced from positive and negative examples. Using the complete gradient of the recurrent network and sufficient training examples to constrain the definition of the language to be induced, solutions are obtained that correctly recognize strings of arbitrary length. A method for extracting a finite state automaton corresponding to an optimized network is demonstrated.
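As a rough illustration of the architecture the abstract describes, the following is a minimal sketch of a second-order recurrent cell over the alphabet {0,1}. It assumes a rank-3 weight tensor `W[j, i, k]` multiplicatively combining each state unit with each one-hot input symbol, a designated start state, and a designated acceptance unit; the tensor shapes, the sigmoid nonlinearity, and the random (untrained) weights are illustrative choices, not the authors' exact configuration.

```python
import numpy as np

def second_order_step(W, s, x):
    # One state update of a second-order recurrent network:
    # the next state mixes every (state unit, input symbol) pair
    # through a rank-3 weight tensor W[j, i, k]:
    #   s_j(t+1) = sigmoid( sum_{i,k} W[j, i, k] * s_i(t) * x_k(t) )
    z = np.einsum('jik,i,k->j', W, s, x)
    return 1.0 / (1.0 + np.exp(-z))

def run_string(W, s0, string):
    # Process a binary string symbol by symbol;
    # inputs are one-hot vectors over the alphabet {0, 1}.
    s = s0
    for ch in string:
        x = np.array([1.0, 0.0]) if ch == '0' else np.array([0.0, 1.0])
        s = second_order_step(W, s, x)
    return s

# Illustration only: random, untrained weights.
rng = np.random.default_rng(0)
n_states = 3
W = rng.normal(size=(n_states, n_states, 2))
s0 = np.zeros(n_states)
s0[0] = 1.0  # start in a designated initial state

final = run_string(W, s0, "0110")
accepted = final[0] > 0.5  # read acceptance off a designated output unit
```

After gradient training on positive and negative example strings, the unit activations tend toward saturation, which is what makes the automaton-extraction step described in the abstract possible: near-binary state vectors can be clustered into discrete automaton states.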