Distant supervision for relation extraction with hierarchical selective attention

Neural Netw. 2018 Dec;108:240-247. doi: 10.1016/j.neunet.2018.08.016. Epub 2018 Aug 29.

Abstract

Distantly supervised relation extraction is an important task in the field of natural language processing. Most state-of-the-art methods have two main shortcomings. One is that they take all sentences of an entity pair as input, which results in a large computational cost; in fact, a few of the most relevant sentences are enough to recognize the relation of an entity pair. To tackle these problems, we propose a novel hierarchical selective attention network for relation extraction under distant supervision. Our model first selects the most relevant sentences by applying coarse sentence-level attention over all sentences of an entity pair, then employs word-level attention to construct sentence representations and fine sentence-level attention to aggregate them. Experimental results on a widely used dataset demonstrate that our method performs significantly better than most existing methods.
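The sketch below illustrates the selective-attention idea described in the abstract: a coarse sentence-level attention step keeps only the most relevant sentences of an entity pair, and a fine sentence-level attention step aggregates their representations into a single bag vector. It is a minimal illustration, not the paper's implementation; the function and variable names (`hierarchical_selective_attention`, `relation_query`, `k`) are hypothetical, and the sentence encodings are assumed to come from an upstream encoder with word-level attention (e.g., a PCNN, as named in the keywords).

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def hierarchical_selective_attention(sentence_vecs, relation_query, k=5):
    """Hedged sketch of coarse-then-fine selective attention.

    sentence_vecs  : (n_sentences, d) array of sentence encodings for one
                     entity pair (assumed to be produced upstream, e.g. by a
                     PCNN encoder with word-level attention).
    relation_query : (d,) query vector for the target relation.
    k              : number of sentences kept by the coarse selection step.
    All names here are illustrative assumptions, not taken from the paper.
    """
    # Coarse sentence-level attention: score every sentence against the
    # relation query and keep only the k most relevant sentences.
    coarse_scores = sentence_vecs @ relation_query
    top_idx = np.argsort(coarse_scores)[-k:]
    selected = sentence_vecs[top_idx]

    # Fine sentence-level attention: softmax-weighted aggregation of the
    # selected sentence representations into a single bag representation.
    fine_weights = softmax(selected @ relation_query)
    bag_repr = fine_weights @ selected
    return bag_repr

# Usage: 20 candidate sentences with 64-dimensional encodings.
rng = np.random.default_rng(0)
sentences = rng.normal(size=(20, 64))
query = rng.normal(size=64)
print(hierarchical_selective_attention(sentences, query).shape)  # (64,)
```

Restricting the fine attention to the top-k sentences is what keeps the computation small relative to attending over every sentence of the entity pair, which is the efficiency argument made in the abstract.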

Keywords: Distant supervision; Hierarchical attention; Piecewise convolutional neural networks; Relation extraction.

MeSH terms

  • Attention* / physiology
  • Databases, Factual / trends
  • Humans
  • Language
  • Natural Language Processing*