Visualizing attention zones in machine reading comprehension models

STAR Protoc. 2022 Jun 16;3(3):101481. doi: 10.1016/j.xpro.2022.101481. eCollection 2022 Sep 16.

Abstract

The attention mechanism plays an important role in the machine reading comprehension (MRC) model. Here, we describe a pipeline for building an MRC model with a pretrained language model and visualizing the effect of each attention zone in different layers, which provides insight into the model's explainability. With the presented protocol and accompanying code, researchers can easily visualize the relevance of each attention zone in the MRC model. This approach can be generalized to other pretrained language models. For complete details on the use and execution of this protocol, please refer to Cui et al. (2022).
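To illustrate the kind of attention map the protocol visualizes, the following is a minimal, self-contained sketch (not the authors' accompanying code): it computes scaled dot-product attention weights for a toy token sequence with NumPy and prints the resulting matrix, where each row is one query token's distribution over the sequence. The token list, random embeddings, and text rendering are illustrative assumptions; the actual protocol extracts these weights per layer and head from a pretrained language model and renders graphical heatmaps.

```python
import numpy as np

def scaled_dot_product_attention(Q, K):
    # softmax(Q K^T / sqrt(d)): each row sums to 1 and gives one query
    # token's attention distribution; an "attention zone" corresponds to
    # a region of this matrix.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    return weights / weights.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
tokens = ["[CLS]", "who", "wrote", "it", "[SEP]"]  # toy example input
d_model = 8
X = rng.normal(size=(len(tokens), d_model))  # stand-in for hidden states

A = scaled_dot_product_attention(X, X)  # self-attention weights

# Plain-text rendering of the attention map for one (layer, head);
# a heatmap plot would show the same matrix graphically.
for tok, row in zip(tokens, A):
    print(f"{tok:>6} | " + " ".join(f"{w:.2f}" for w in row))
```

In a real pipeline, `X` would be replaced by the hidden states of a pretrained model and `A` by the attention tensors the model exposes, one matrix per layer and head.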

Keywords: Bioinformatics; Cognitive Neuroscience; Computer sciences; Systems biology.

MeSH terms

  • Comprehension*
  • Language*