Empirical Risk Minimization with Approximations of Probabilistic Grammars

Part of Advances in Neural Information Processing Systems 23 (NIPS 2010)


Authors

Shay Cohen, Noah A. Smith

Abstract

Probabilistic grammars are generative statistical models that are useful for compositional and sequential structures. We present a framework, reminiscent of structural risk minimization, for empirical risk minimization of the parameters of a fixed probabilistic grammar using the log-loss. We derive sample complexity bounds in this framework that apply to both the supervised and unsupervised settings.
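The log-loss objective mentioned in the abstract can be sketched as follows; the notation here is illustrative rather than taken from the paper. Given a fixed probabilistic grammar with parameters $\theta$ and a sample of $n$ observations, empirical risk minimization with the log-loss selects

```latex
% ERM with log-loss over a sample (illustrative notation, not the paper's):
\hat{\theta} \;=\; \arg\min_{\theta} \; \frac{1}{n} \sum_{i=1}^{n} -\log p_{\theta}(z_i)
% Supervised setting: each z_i is a fully observed derivation.
% Unsupervised setting: only the yield x_i is observed, so the loss
% uses the marginal over hidden derivations z:
%   p_{\theta}(x_i) \;=\; \sum_{z} p_{\theta}(x_i, z)
```

In the supervised setting each $z_i$ is a complete derivation, while in the unsupervised setting only the string is observed and the derivation is marginalized out, which is what makes the two sample complexity analyses differ.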