Relational labeling unlocks inert knowledge

Cognition. 2020 Mar;196:104146. doi: 10.1016/j.cognition.2019.104146. Epub 2019 Nov 30.

Abstract

Insightful solutions often come about by recalling a relevant prior situation: one that shares the same essential relational pattern as the current problem. Unfortunately, our memory retrievals often depend primarily on surface matches rather than relational matches. For example, a person who is familiar with the idea of positive feedback in sound systems may fail to think of it in the context of global warming. We suggest that one reason for the failure of cross-domain relational retrieval is that relational information is typically encoded variably, in a context-dependent way. In contrast, the surface features of that context, such as objects, animals, and characters, are encoded in a relatively stable way and are therefore easier to retrieve across contexts. We propose that the use of relational language can serve to make situations' relational representations more uniform, thereby facilitating relational retrieval. In two studies, we found that providing relational labels for situations at encoding or at retrieval increased the likelihood of relational retrieval. In contrast, domain labels (labels that highlight situations' contextual features) did not reliably improve domain retrieval. We suggest that relational language allows people to retrieve knowledge that would otherwise remain inert, and that it contributes to domain experts' insight.

Keywords: Analogy; Creativity; Memory retrieval; Relational reasoning; Relational transfer.

Publication types

  • Research Support, U.S. Gov't, Non-P.H.S.

MeSH terms

  • Memory*
  • Mental Recall*