CACV-tree: A New Computational Approach for Sentence Similarity Modeling

Jingwei Wang, Wenxin Hu, Wen Wu

Published 2019 in Proceedings of the 2019 International Conference on Big Data Engineering

ABSTRACT

Sentence similarity modeling plays an important role in Natural Language Processing (NLP) tasks and has therefore received much attention. In recent years, building on the success of word embedding, neural network methods have produced sentence embeddings with attractive performance. Nevertheless, most of these methods focus on learning semantic information and modeling it as a continuous vector, while the syntactic information of sentences has not been fully exploited. On the other hand, prior work has shown the benefits of structured trees that encode syntactic information, yet few methods in this line exploit the advantages of sentence compression. This paper makes the first attempt to combine these strengths by merging the techniques into a unified structure, dubbed the CACV-tree (Compression Attention Constituency Vector-tree). Experimental results on 14 widely used datasets demonstrate that our model is effective and competitive against state-of-the-art models.
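The record does not include the paper's implementation details. As a generic illustration of the baseline task the abstract describes (scoring sentence similarity from word embeddings), a minimal sketch follows; the toy vectors and all function names are hypothetical, and this averaging-plus-cosine baseline is not the CACV-tree method itself:

```python
import math

# Toy word vectors (hypothetical; real systems use pretrained embeddings
# such as word2vec or GloVe, in hundreds of dimensions).
EMBED = {
    "a": [0.1, 0.3], "dog": [0.9, 0.2], "barks": [0.7, 0.6],
    "the": [0.1, 0.3], "puppy": [0.8, 0.3], "yelps": [0.6, 0.7],
    "stocks": [0.0, 0.9], "fell": [0.2, 0.8],
}

def sentence_vector(sentence):
    """Average the word vectors of a sentence (a simple compositional baseline)."""
    vecs = [EMBED[w] for w in sentence.lower().split() if w in EMBED]
    dim = len(vecs[0])
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def similarity(s1, s2):
    """Score two sentences by the cosine of their averaged embeddings."""
    return cosine(sentence_vector(s1), sentence_vector(s2))

# Paraphrase pairs should score higher than unrelated pairs.
print(similarity("a dog barks", "the puppy yelps"))
print(similarity("a dog barks", "stocks fell"))
```

Tree-based approaches like the one the abstract describes replace the flat averaging step with composition over a constituency parse, so syntactic structure influences the final sentence vector.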

PUBLICATION RECORD

  • Publication year

    2019

  • Venue

    Proceedings of the 2019 International Conference on Big Data Engineering

  • Publication date

    2019-06-11

  • Fields of study

    Computer Science


  • Source metadata

    Semantic Scholar

