Learning Generic Sentence Representations Using Convolutional Neural Networks

Zhe Gan, Yunchen Pu, Ricardo Henao, Chunyuan Li, Xiaodong He, L. Carin

Published 2016 at the Conference on Empirical Methods in Natural Language Processing (EMNLP)

ABSTRACT

We propose a new encoder-decoder approach to learn distributed sentence representations that are applicable to multiple purposes. The model is learned by using a convolutional neural network as an encoder to map an input sentence into a continuous vector, and using a long short-term memory recurrent neural network as a decoder. Several tasks are considered, including sentence reconstruction and future sentence prediction. Further, a hierarchical encoder-decoder model is proposed to encode a sentence to predict multiple future sentences. By training our models on a large collection of novels, we obtain a highly generic convolutional sentence encoder that performs well in practice. Experimental results on several benchmark datasets, and across a broad range of applications, demonstrate the superiority of the proposed model over competing methods.
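The core of the encoder side described above is a CNN that maps a variable-length sentence to a fixed-length vector. The following is a minimal NumPy sketch of that idea, convolving filter banks of several widths over the word-embedding sequence and max-pooling over time; the filter widths, dimensions, and tanh nonlinearity here are illustrative assumptions, not the paper's exact configuration, and the LSTM decoder is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def cnn_sentence_encoder(E, filter_banks):
    # E: (T, d) matrix of word embeddings for a T-word sentence.
    # filter_banks: list of (w, W) pairs, where W has shape (w*d, k),
    # i.e. k filters of width w. Each bank is convolved over time,
    # passed through tanh, then max-pooled over positions.
    pooled = []
    for w, W in filter_banks:
        T, d = E.shape
        # stack all length-w windows of the sentence, flattened to w*d
        windows = np.stack([E[t:t + w].reshape(-1)
                            for t in range(T - w + 1)])
        feats = np.tanh(windows @ W)      # (T-w+1, k) feature maps
        pooled.append(feats.max(axis=0))  # max-over-time pooling
    # concatenating the pooled features gives a fixed-length code
    return np.concatenate(pooled)

# toy usage: a 7-word sentence with 10-dim embeddings,
# filter widths 2 and 3, four feature maps each
E = rng.standard_normal((7, 10))
banks = [(2, rng.standard_normal((20, 4))),
         (3, rng.standard_normal((30, 4)))]
z = cnn_sentence_encoder(E, banks)
print(z.shape)  # (8,) regardless of sentence length T
```

In the full model, a vector like `z` would condition an LSTM decoder that reconstructs the input sentence or predicts the following sentence(s), which is what makes the learned encoder generic.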
