Dependency Based Embeddings for Sentence Classification Tasks

Alexandros Komninos, S. Manandhar

Published 2016 in North American Chapter of the Association for Computational Linguistics

ABSTRACT

We compare different word embeddings from a standard window based skipgram model, a skipgram model trained using dependency context features and a novel skipgram variant that utilizes additional information from dependency graphs. We explore the effectiveness of the different types of word embeddings for word similarity and sentence classification tasks. We consider three common sentence classification tasks: question type classification on the TREC dataset, binary sentiment classification on Stanford’s Sentiment Treebank and semantic relation classification on the SemEval 2010 dataset. For each task we use three different classification methods: a Support Vector Machine, a Convolutional Neural Network and a Long Short Term Memory Network. Our experiments show that dependency based embeddings outperform standard window based embeddings in most of the settings, while using dependency context embeddings as additional features improves performance in all tasks regardless of the classification method. Our embeddings and code are available at https:
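The dependency-context skipgram described above replaces a word's linear window neighbours with its syntactic neighbours from a dependency parse. A minimal sketch of that context extraction, assuming a Levy-and-Goldberg-style pairing scheme (the function name and the toy parse below are illustrative, not taken from the paper's released code):

```python
# Hedged sketch: building (word, context) training pairs for a
# dependency-context skipgram model. A modifier sees its head through the
# dependency label, and the head sees the modifier through the inverse
# label. Exact details may differ from the paper's implementation.

def dependency_contexts(tokens, heads, labels):
    """heads[i] is the 0-based index of token i's head, or -1 for the root;
    labels[i] is the dependency relation of token i to its head."""
    pairs = []
    for i, tok in enumerate(tokens):
        head, label = heads[i], labels[i]
        if head == -1:
            continue  # the root has no head, so no pair is emitted for it
        # modifier -> head context, marked with the relation label
        pairs.append((tok, f"{tokens[head]}/{label}"))
        # head -> modifier context, marked with the inverse relation
        pairs.append((tokens[head], f"{tok}/{label}-1"))
    return pairs

# Toy parse of "dogs chase cats": "chase" is the root.
tokens = ["dogs", "chase", "cats"]
heads = [1, -1, 1]
labels = ["nsubj", "root", "dobj"]
print(dependency_contexts(tokens, heads, labels))
# → [('dogs', 'chase/nsubj'), ('chase', 'dogs/nsubj-1'),
#    ('cats', 'chase/dobj'), ('chase', 'cats/dobj-1')]
```

These pairs can then be fed to any skipgram trainer in place of window-based pairs; the resulting word vectors are what the abstract compares against window-based ones.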

PUBLICATION RECORD

  • Publication year

    2016

  • Venue

    North American Chapter of the Association for Computational Linguistics

  • Publication date

    2016-06-01

  • Fields of study

    Computer Science


  • Source metadata

    Semantic Scholar


REFERENCES

31 references

CITED BY

153 citing papers