Part of Advances in Neural Information Processing Systems 16 (NIPS 2003)
Xavier Carreras, Lluís Màrquez
This work presents an architecture based on perceptrons to recognize phrase structures, and an online learning algorithm to train the perceptrons together and dependently. The recognition strategy applies learning in two layers: a filtering layer, which reduces the search space by identifying plausible phrase candidates, and a ranking layer, which recursively builds the optimal phrase structure. We provide a recognition-based feedback rule that propagates to each local function the errors it committed from a global point of view, allowing all the functions to be trained together online as perceptrons. Experiments on a syntactic parsing problem, the recognition of clause hierarchies, improve on state-of-the-art results and evince the advantages of our global training method over optimizing each function locally and independently.
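The core idea of the feedback rule can be illustrated with a minimal sketch. This is not the paper's actual algorithm or feature representation; it is a toy version, assuming candidates are plain feature vectors and using two linear perceptrons (`w_f` for filtering, `w_r` for ranking), where each perceptron is updated only on the errors it committed from the global, final-prediction point of view:

```python
import numpy as np

def train_filter_rank(candidates, gold_idx, w_f, w_r, epochs=5):
    """Toy joint online training of a filtering perceptron (w_f) and a
    ranking perceptron (w_r) with a recognition-based feedback rule.
    `candidates` is a (n, d) array of feature vectors, one per phrase
    candidate; `gold_idx` marks the correct candidate.  All names and
    the update scheme are illustrative, not taken from the paper."""
    for _ in range(epochs):
        scores_f = candidates @ w_f          # filtering layer
        kept = np.flatnonzero(scores_f > 0)  # plausible candidates only
        if gold_idx not in kept:
            # The filter committed the global error: it discarded the
            # correct candidate, so only the filter is updated.
            w_f += candidates[gold_idx]
            continue
        scores_r = candidates[kept] @ w_r    # ranking layer
        pred = kept[np.argmax(scores_r)]
        if pred != gold_idx:
            # The ranker committed the global error: promote the gold
            # candidate, demote the wrongly chosen one.
            w_r += candidates[gold_idx] - candidates[pred]
    return w_f, w_r
```

The point of the sketch is the blame assignment: a single global comparison of the predicted structure against the gold one decides which local function receives a perceptron update, rather than training the filter and the ranker on separate local objectives.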