Grammar as a Foreign Language

Oriol Vinyals, Lukasz Kaiser, Terry Koo, Slav Petrov, Ilya Sutskever, Geoffrey E. Hinton

Published 2014 in Neural Information Processing Systems

ABSTRACT

Syntactic constituency parsing is a fundamental problem in natural language processing and has been the subject of intensive research and engineering for decades. As a result, the most accurate parsers are domain-specific, complex, and inefficient. In this paper we show that a domain-agnostic, attention-enhanced sequence-to-sequence model achieves state-of-the-art results on the most widely used syntactic constituency parsing dataset when trained on a large synthetic corpus that was annotated using existing parsers. It also matches the performance of standard parsers when trained only on a small human-annotated dataset, which shows that this model is highly data-efficient, in contrast to sequence-to-sequence models without the attention mechanism. Our parser is also fast, processing over a hundred sentences per second with an unoptimized CPU implementation.
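The abstract casts parsing as sequence-to-sequence prediction, which requires serializing a constituency tree into a flat token sequence that the decoder can emit. A minimal sketch of one such linearization scheme, using labeled closing brackets and normalizing terminals to a placeholder token (the tuple-based tree representation and the `linearize` helper are illustrative assumptions, not the authors' code):

```python
def linearize(tree):
    """Flatten a constituency tree into a bracketed token sequence.

    A tree is a tuple (label, child, child, ...); leaves are word strings.
    Terminals are replaced by the placeholder "XX" so the decoder only
    predicts tree structure, not the words themselves.
    """
    label, *children = tree
    tokens = ["(" + label]
    for child in children:
        if isinstance(child, str):
            tokens.append("XX")          # terminal: emit placeholder
        else:
            tokens.extend(linearize(child))  # nonterminal: recurse
    tokens.append(")" + label)           # labeled closing bracket
    return tokens

tree = ("S", ("NP", "John"), ("VP", ("V", "loves"), ("NP", "Mary")))
print(" ".join(linearize(tree)))
# (S (NP XX )NP (VP (V XX )V (NP XX )NP )VP )S
```

Labeled closing brackets (e.g. `)NP` rather than a bare `)`) make the output sequence unambiguous to deserialize and give the decoder a local cue about which constituent it is closing.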

PUBLICATION RECORD

  • Publication year

    2014

  • Venue

    Neural Information Processing Systems

  • Publication date

    2014-12-23

  • Fields of study

    Mathematics, Linguistics, Computer Science

  • Source metadata

    Semantic Scholar


REFERENCES

35 references.

CITED BY

938 citing papers.