Nested relation extraction via self-contrastive learning guided by structure and semantic similarity

Neural Netw. 2023 May:162:393-411. doi: 10.1016/j.neunet.2023.03.001. Epub 2023 Mar 4.

Abstract

The conventional Relation Extraction (RE) task involves identifying whether relations exist between two entities in a given sentence and determining their relation types. However, the complexity of practical application scenarios and the flexibility of natural language demand the ability to extract nested relations, i.e., recognized relation triples may themselves serve as components of higher-level relations. Previous studies have highlighted several challenges that affect the nested RE task, including the lack of abundant labeled data, neural architectures ill-suited to nested structures, and underutilization of the nested relation structures. To address these issues, we formalize the nested RE task and propose a hierarchical neural network that iteratively identifies the nested relations between entities and relation triples in a layer-by-layer manner. Moreover, a novel self-contrastive learning optimization strategy is presented to adapt our method to low-data settings by fully exploiting the constraints imposed by the nested structure and by the semantic similarity between paired input sentences. Our method outperformed state-of-the-art baselines in extensive experiments, and ablation experiments verified the effectiveness of the proposed self-contrastive learning optimization strategy.
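To make the nesting notion concrete, the sketch below models a relation triple whose arguments may themselves be triples, mirroring the layer-by-layer extraction the abstract describes. The example sentence, relation labels, and `depth` helper are illustrative assumptions, not taken from the paper.

```python
from dataclasses import dataclass
from typing import Union

@dataclass(frozen=True)
class Triple:
    """A relation triple; head/tail may be entities or nested triples."""
    head: Union[str, "Triple"]
    relation: str
    tail: Union[str, "Triple"]

# Layer-1 (flat) triple, e.g. from "smoking increases blood pressure"
inner = Triple("smoking", "increases", "blood pressure")

# Layer-2 (nested) triple: the layer-1 triple itself is the head argument
# of a higher-level relation, e.g. "... which raises the risk of stroke"
outer = Triple(inner, "raises_risk_of", "stroke")

def depth(t):
    """Nesting depth: 0 for a plain entity, 1 for a flat triple, etc."""
    if not isinstance(t, Triple):
        return 0
    return 1 + max(depth(t.head), depth(t.tail))

print(depth(inner))  # 1 -- extracted at the first layer
print(depth(outer))  # 2 -- extracted at the second layer
```

A layer-by-layer extractor in this spirit would first emit all depth-1 triples, then treat them as candidate arguments when searching for depth-2 relations, and so on until no new triples are found.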

Keywords: Iterative neural networks; Nested relation extraction; Self-contrastive learning; Semantic similarity; Structure similarity.

MeSH terms

  • Language*
  • Learning
  • Natural Language Processing
  • Neural Networks, Computer
  • Semantics*