Simple task-specific bilingual word embeddings

Stephan Gouws, Anders Søgaard

Published 2015 in the North American Chapter of the Association for Computational Linguistics (NAACL)

ABSTRACT

We introduce a simple wrapper method that uses off-the-shelf word embedding algorithms to learn task-specific bilingual word embeddings. We use a small dictionary of easily obtainable task-specific word equivalence classes to produce mixed context-target pairs that we use to train off-the-shelf embedding models. Our model has the advantage that it (a) is independent of the choice of embedding algorithm, (b) does not require parallel data, and (c) can be adapted to specific tasks by re-defining the equivalence classes. We show how our method outperforms off-the-shelf bilingual embeddings on the task of unsupervised cross-language part-of-speech (POS) tagging, as well as on the task of semi-supervised cross-language super sense (SuS) tagging.
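The core idea in the abstract — substituting words with cross-lingual equivalents so that an ordinary monolingual embedding trainer sees mixed context-target pairs — can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the toy English-French dictionary, the substitution probability `p`, and the function names are all assumptions, and the actual embedding training (done with off-the-shelf algorithms in the paper) is not reproduced here.

```python
import random

# Hypothetical toy equivalence classes (English -> French). In the paper
# these come from a small, easily obtainable task-specific dictionary;
# the pairs below are illustrative only.
EQUIV = {
    "house": "maison",
    "dog": "chien",
    "small": "petit",
}

def mix_sentence(tokens, equiv, p=0.5, rng=None):
    """Randomly substitute tokens with their cross-lingual equivalents.

    Each token with an entry in `equiv` is replaced with probability `p`,
    yielding a code-mixed sentence. Training any off-the-shelf embedding
    model on such sentences puts equivalents in shared contexts, so their
    vectors end up close in a joint bilingual space.
    """
    rng = rng or random.Random(0)
    return [equiv[t] if t in equiv and rng.random() < p else t
            for t in tokens]

def skipgram_pairs(tokens, window=2):
    """Emit (context, target) pairs as a skip-gram style trainer would."""
    pairs = []
    for i, target in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        pairs.extend((tokens[j], target) for j in range(lo, hi) if j != i)
    return pairs

sent = "the small dog ran".split()
mixed = mix_sentence(sent, EQUIV, p=1.0)  # p=1.0 for a deterministic demo
print(mixed)                              # ['the', 'petit', 'chien', 'ran']
print(skipgram_pairs(mixed)[:3])
```

Because substitution is random per occurrence, both the source word and its equivalent appear in overlapping contexts across the corpus, which is what lets a standard monolingual objective align the two vocabularies.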

PUBLICATION RECORD

  • Publication year

    2015

  • Venue

    North American Chapter of the Association for Computational Linguistics

  • Fields of study

    Computer Science

  • Source metadata

    Semantic Scholar

REFERENCES

20 references

CITED BY

127 citing papers