Information-Theoretic Neural Decoding Reproduces Several Laws of Human Behavior

Open Mind (Camb). 2023 Sep 20;7:675-690. doi: 10.1162/opmi_a_00101. eCollection 2023.

Abstract

Human response times conform to several regularities including the Hick-Hyman law, the power law of practice, speed-accuracy trade-offs, and the Stroop effect. Each of these has been thoroughly modeled in isolation, but no account describes these phenomena as predictions of a unified framework. We provide such a framework and show that the phenomena arise as decoding times in a simple neural rate code with an entropy stopping threshold. Whereas traditional information-theoretic encoding systems exploit task statistics to optimize encoding strategies, we move this optimization to the decoder, treating it as a Bayesian ideal observer that can track transmission statistics as prior information during decoding. Our approach allays prominent concerns that applying information-theoretic perspectives to modeling brain and behavior requires complex encoding schemes that are incommensurate with neural encoding.
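The mechanism the abstract describes can be illustrated with a minimal sketch: a Bayesian ideal observer reads a noisy rate code (one channel per alternative, the true alternative's channel spiking at a higher rate), updates a posterior after each observation step, and stops when the posterior's entropy falls below a threshold; the step count at stopping is the decoding time. This is not the paper's implementation; all rates, the threshold value, and the function name are illustrative assumptions.

```python
import math
import random

def decode_time(n_alternatives, true_idx=0, rate_hi=0.6, rate_lo=0.2,
                threshold=0.5, rng=None, max_steps=10_000):
    """Entropy-threshold decoding of a toy rate code (illustrative only).

    One Bernoulli channel per alternative: the channel matching the true
    stimulus spikes with probability rate_hi per step, the others with
    rate_lo.  An ideal observer updates a posterior over alternatives
    after every step and stops once posterior entropy (bits) drops below
    `threshold`.  Returns the number of steps taken (the decoding time).
    """
    rng = rng or random.Random(0)
    post = [1.0 / n_alternatives] * n_alternatives  # uniform prior
    for step in range(1, max_steps + 1):
        # Sample one spike/no-spike observation per channel.
        obs = [rng.random() < (rate_hi if j == true_idx else rate_lo)
               for j in range(n_alternatives)]
        # Bayes update: multiply by the Bernoulli likelihood of the
        # observed pattern under each hypothesis, then renormalize.
        for i in range(n_alternatives):
            for j, spiked in enumerate(obs):
                p = rate_hi if j == i else rate_lo
                post[i] *= p if spiked else (1.0 - p)
        z = sum(post)
        post = [p / z for p in post]
        # Posterior entropy in bits; stop when it crosses the threshold.
        h = -sum(p * math.log2(p) for p in post if p > 0)
        if h < threshold:
            return step
    return max_steps
```

Averaging `decode_time` over many trials, mean decoding time grows with the number of alternatives (initial entropy is log2(n) bits while the per-step evidence rate is fixed), which is the qualitative shape of the Hick-Hyman law the abstract refers to.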

Keywords: Hick-Hyman law; information theory; power law of practice; rate coding; response times.