Learning text representation using recurrent convolutional neural network with highway layers

Ying Wen, Weinan Zhang, Rui Luo, Jun Wang

Published 2016 in Annual International ACM SIGIR Conference on Research and Development in Information Retrieval

ABSTRACT

Recently, the rapid development of word embeddings and neural networks has brought new inspiration to various NLP and IR tasks. In this paper, we describe a staged hybrid model combining Recurrent Convolutional Neural Networks (RCNN) with highway layers. The highway network module, incorporated in the middle stage, takes the output of the bi-directional Recurrent Neural Network (Bi-RNN) module in the first stage and provides the input to the Convolutional Neural Network (CNN) module in the last stage. Experiments show that our model outperforms common neural network models (CNN, RNN, Bi-RNN) on a sentiment analysis task. In addition, an analysis of how sequence length influences the RCNN with highway layers shows that our model can learn good representations of long text.
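The three stages described above can be sketched as a single forward pass. The sketch below is a minimal NumPy illustration, not the authors' implementation: all dimensions, weight scales, and the choice of a plain tanh RNN cell are illustrative assumptions, since the abstract does not specify hyperparameters. The highway layer follows the standard form y = T(x)·H(x) + (1 − T(x))·x, and the CNN stage uses max-over-time pooling to produce a fixed-size text representation.

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_pass(x, Wx, Wh):
    """Simple tanh RNN over a sequence x of shape (T, d_in); returns all hidden states."""
    h = np.zeros(Wh.shape[0])
    out = np.zeros((x.shape[0], Wh.shape[0]))
    for t in range(x.shape[0]):
        h = np.tanh(x[t] @ Wx + h @ Wh)
        out[t] = h
    return out

def highway(x, W_H, W_T, b_T):
    """Highway layer: y = T(x)*H(x) + (1 - T(x))*x."""
    H = np.tanh(x @ W_H)
    gate = 1.0 / (1.0 + np.exp(-(x @ W_T + b_T)))  # transform gate T(x)
    return gate * H + (1.0 - gate) * x

def conv_max_pool(x, filt):
    """1-D convolution over time followed by max-over-time pooling."""
    T, w = x.shape[0], filt.shape[0]
    return max(np.sum(x[t:t + w] * filt) for t in range(T - w + 1))

# Toy dimensions (illustrative assumptions, not from the paper).
T, d_emb, d_h = 5, 8, 8
x = rng.standard_normal((T, d_emb))  # stand-in word embeddings for a 5-token text

# Stage 1: bi-directional RNN over the embedded sequence.
Wx_f, Wh_f = 0.1 * rng.standard_normal((d_emb, d_h)), 0.1 * rng.standard_normal((d_h, d_h))
Wx_b, Wh_b = 0.1 * rng.standard_normal((d_emb, d_h)), 0.1 * rng.standard_normal((d_h, d_h))
h_fwd = rnn_pass(x, Wx_f, Wh_f)
h_bwd = rnn_pass(x[::-1], Wx_b, Wh_b)[::-1]   # run over reversed input, realign in time
h = np.concatenate([h_fwd, h_bwd], axis=1)    # (T, 2*d_h)

# Stage 2: highway layer applied to each time step's Bi-RNN output.
d = 2 * d_h
y = highway(h, 0.1 * rng.standard_normal((d, d)),
            0.1 * rng.standard_normal((d, d)), b_T=-1.0)  # negative bias favors carry

# Stage 3: CNN with max-over-time pooling yields a fixed-size representation.
filters = 0.1 * rng.standard_normal((4, 3, d))            # 4 filters of width 3
feats = np.array([conv_max_pool(y, f) for f in filters])  # final text feature vector
```

The negative transform-gate bias makes the highway layer start close to an identity mapping, which is the usual initialization choice that lets gradients pass through untouched early in training.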

PUBLICATION RECORD

  • Publication year

    2016

  • Venue

    Annual International ACM SIGIR Conference on Research and Development in Information Retrieval

  • Publication date

    2016-06-22

  • Fields of study

    Computer Science


  • Source metadata

    Semantic Scholar

REFERENCES

18 references

CITED BY

55 citing papers