Neural Relation Extraction with Selective Attention over Instances

Yankai Lin, Shiqi Shen, Zhiyuan Liu, Huanbo Luan, Maosong Sun

Published in 2016 in the Proceedings of the Annual Meeting of the Association for Computational Linguistics (ACL)

ABSTRACT

Distantly supervised relation extraction has been widely used to find novel relational facts in text. However, distant supervision inevitably suffers from the wrong-labelling problem, and the resulting noisy data substantially hurt relation-extraction performance. To alleviate this issue, we propose a sentence-level attention-based model for relation extraction. In this model, we employ convolutional neural networks to embed the semantics of sentences. We then build sentence-level attention over multiple instances, which is expected to dynamically reduce the weights of noisy instances. Experimental results on real-world datasets show that our model makes full use of all informative sentences and effectively reduces the influence of wrongly labelled instances, achieving significant and consistent improvements over baselines. The source code of this paper can be obtained from https://github.com/thunlp/NRE.
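The core idea in the abstract can be sketched numerically: each sentence in a bag (all sentences mentioning the same entity pair) is embedded by a CNN, a bilinear score against the candidate relation yields attention weights, and the weighted sum of embeddings is classified. The sketch below is a minimal illustration, not the authors' implementation; the dimensions, parameter names (`A`, `M`, `b`), and random toy inputs are all assumptions, and the CNN encoder is replaced by precomputed vectors.

```python
import numpy as np

# Illustrative dimensions only; not the paper's actual settings.
SENT_DIM = 8      # dimension of CNN sentence embeddings
N_RELATIONS = 5   # number of relation classes

rng = np.random.default_rng(0)

def softmax(x):
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def selective_attention(sentence_embs, query_rel, A, M, b):
    """Weight a bag of sentence embeddings by their match with a
    query relation embedding, then classify the weighted bag.

    sentence_embs: (n_sentences, SENT_DIM) CNN outputs for one entity pair
    query_rel:     (SENT_DIM,) embedding of the candidate relation
    A:             (SENT_DIM, SENT_DIM) bilinear attention matrix
    M:             (N_RELATIONS, SENT_DIM) relation classifier weights
    b:             (N_RELATIONS,) classifier bias
    """
    # Bilinear match score e_i = x_i A r for each sentence in the bag.
    scores = sentence_embs @ A @ query_rel      # (n_sentences,)
    alpha = softmax(scores)                     # attention weights, sum to 1
    bag = alpha @ sentence_embs                 # weighted bag representation
    probs = softmax(M @ bag + b)                # distribution over relations
    return alpha, probs

# Toy bag of 3 sentences mentioning the same entity pair.
X = rng.normal(size=(3, SENT_DIM))
r = rng.normal(size=SENT_DIM)
A = np.eye(SENT_DIM)                            # identity as a stand-in
M = rng.normal(size=(N_RELATIONS, SENT_DIM))
b = np.zeros(N_RELATIONS)

alpha, probs = selective_attention(X, r, A, M, b)
print(alpha)   # after training, noisy sentences should get small weights
print(probs)
```

After training, sentences that poorly match the relation receive low `alpha`, so wrongly labelled instances contribute little to the bag representation; this is what lets the model "dynamically reduce the weights of noisy instances" rather than discard or average them.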

PUBLICATION RECORD

  • Publication year

    2016

  • Venue

    Annual Meeting of the Association for Computational Linguistics

  • Publication date

    2016-08-01

  • Fields of study

    Computer Science


  • Source metadata

    Semantic Scholar


REFERENCES

25 references

CITED BY

1,012 citing papers