Self-Training PCFG Grammars with Latent Annotations Across Languages

Zhongqiang Huang, M. Harper

Published 2009 in Conference on Empirical Methods in Natural Language Processing

ABSTRACT

We investigate the effectiveness of self-training PCFG grammars with latent annotations (PCFG-LA) for parsing languages with different amounts of labeled training data. Compared to Charniak's lexicalized parser, the PCFG-LA parser was more effectively adapted to a language for which parsing has been less well developed (i.e., Chinese) and benefited more from self-training. We show for the first time that self-training is able to significantly improve the performance of the PCFG-LA parser, a single generative parser, on both small and large amounts of labeled training data. Our approach achieves state-of-the-art parsing accuracies for a single parser on both English (91.5%) and Chinese (85.2%).
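The self-training procedure the abstract refers to follows the standard pattern: train a parser on gold trees, parse unlabeled text with it, then retrain on the union of gold and automatically produced trees. The sketch below illustrates that generic loop only; the `Parser` class is a toy stand-in, not the actual PCFG-LA implementation evaluated in the paper.

```python
# Illustrative self-training loop (assumption: a trainable parser with
# train() and parse() methods; the real work used a PCFG-LA parser).

class Parser:
    """Toy stand-in for a trainable constituency parser."""

    def __init__(self):
        self.treebank = []

    def train(self, trees):
        # A real parser would estimate grammar parameters here.
        self.treebank = list(trees)

    def parse(self, sentence):
        # A real parser would return its best (Viterbi) tree; this
        # stand-in just emits a flat bracketing over the tokens.
        return "(S " + " ".join(f"(X {w})" for w in sentence.split()) + ")"


def self_train(labeled_trees, unlabeled_sentences):
    parser = Parser()
    parser.train(labeled_trees)                # round 1: supervised only
    auto_trees = [parser.parse(s) for s in unlabeled_sentences]
    parser.train(labeled_trees + auto_trees)   # round 2: gold + self-labeled
    return parser


parser = self_train(["(S (NP I) (VP run))"], ["dogs bark"])
print(len(parser.treebank))  # 2: one gold tree plus one self-labeled tree
```

The key design question in self-training is whether the retrained model improves or merely reinforces its own errors; the paper's finding is that a single generative PCFG-LA parser does benefit, on both small and large labeled sets.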

PUBLICATION RECORD

  • Publication year

    2009

  • Venue

    Conference on Empirical Methods in Natural Language Processing

  • Publication date

    2009-08-06

  • Fields of study

    Linguistics, Computer Science


  • Source metadata

    Semantic Scholar
