Dependency-Based Word Embeddings

Omer Levy, Yoav Goldberg

Published 2014 in Annual Meeting of the Association for Computational Linguistics

ABSTRACT

While continuous word embeddings are gaining popularity, current models are based solely on linear contexts. In this work, we generalize the skip-gram model with negative sampling introduced by Mikolov et al. to include arbitrary contexts. In particular, we perform experiments with dependency-based contexts, and show that they produce markedly different embeddings. The dependency-based embeddings are less topical and exhibit more functional similarity than the original skip-gram embeddings.
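The generalization described above replaces a word's linear bag-of-words window with contexts derived from its syntactic dependencies: each modifier contributes a context labeled with the dependency relation to its head, and each head contributes an inverse-labeled context to its modifier. The sketch below illustrates that pair-extraction step on hand-written dependency triples; the triples, function name, and label format are illustrative assumptions (in the paper the triples come from an automatic parser, and prepositional relations are collapsed):

```python
# Sketch of dependency-based (word, context) pair extraction.
# Assumption: triples are (head, relation, modifier), e.g. from a parser.
def dependency_contexts(triples):
    """Yield (word, context) pairs from dependency triples.

    Each modifier m with relation r to head h produces:
      (h, "m/r")    -- the modifier as a labeled context of the head
      (m, "h/r-1")  -- the head as an inverse-labeled context of the modifier
    """
    pairs = []
    for head, rel, mod in triples:
        pairs.append((head, f"{mod}/{rel}"))
        pairs.append((mod, f"{head}/{rel}-1"))
    return pairs

# "scientist discovers star" with hand-written (hypothetical) triples:
triples = [
    ("discovers", "nsubj", "scientist"),
    ("discovers", "dobj", "star"),
]
for word, ctx in dependency_contexts(triples):
    print(word, ctx)
```

These (word, context) pairs would then feed the same skip-gram-with-negative-sampling objective as before, only with syntactic contexts in place of window neighbors.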

PUBLICATION RECORD

  • Publication year

    2014

  • Venue

    Annual Meeting of the Association for Computational Linguistics

  • Publication date

    2014-06-01

  • Fields of study

    Computer Science


  • Source metadata

    Semantic Scholar
