Sparse sign-consistent Johnson-Lindenstrauss matrices: compression with neuroscience-based constraints

Proc Natl Acad Sci U S A. 2014 Nov 25;111(47):16872-6. doi: 10.1073/pnas.1419100111. Epub 2014 Nov 10.

Abstract

Johnson–Lindenstrauss (JL) matrices implemented by sparse random synaptic connections are thought to be a prime candidate for how convergent pathways in the brain compress information. However, to date, there is no complete mathematical support for such implementations given the constraints of real neural tissue. Because neurons are either excitatory or inhibitory, any JL matrix implemented in this way must be sign consistent (i.e., all entries in a single column must be either all nonnegative or all nonpositive), and because any given neuron connects to a relatively small subset of other neurons, the JL matrix should be sparse. We construct sparse JL matrices that are sign consistent and prove that our construction is essentially optimal. Our work answers a mathematical question that was triggered by earlier work, provides support necessary to justify the existence of JL compression in the brain, and emphasizes that inhibition is crucial if neurons are to perform efficient, correlation-preserving compression.
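To make the two constraints concrete, the following is a minimal, hypothetical sketch of a sparse random matrix satisfying sign consistency: every column is assigned a single sign (excitatory or inhibitory), and entries are nonzero only with small probability. This illustrates the constraints only; it is not the paper's essentially optimal construction, and the parameter names (`m`, `n`, `p`) are assumptions for illustration.

```python
import numpy as np

def sparse_sign_consistent(m, n, p, seed=None):
    """Illustrative m-by-n sparse sign-consistent random matrix.

    Each column j gets one sign s_j in {-1, +1} (inhibitory or
    excitatory presynaptic neuron); every nonzero entry in column j
    equals s_j, so each column is all-nonnegative or all-nonpositive.
    Each entry is nonzero independently with probability p (sparsity).
    NOTE: a sketch of the constraints, not the authors' construction.
    """
    rng = np.random.default_rng(seed)
    signs = rng.choice([-1.0, 1.0], size=n)   # one sign per column
    mask = rng.random((m, n)) < p             # sparse support
    return mask * signs[np.newaxis, :]

A = sparse_sign_consistent(m=20, n=100, p=0.1, seed=0)
# Verify sign consistency: each column is entirely >= 0 or entirely <= 0.
col_ok = all((A[:, j] >= 0).all() or (A[:, j] <= 0).all()
             for j in range(A.shape[1]))
print(col_ok)  # True by construction
```

The check at the end confirms the defining property from the abstract: no column mixes positive and negative entries, mirroring the fact that a single neuron's outgoing synapses are either all excitatory or all inhibitory.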

Keywords: Johnson–Lindenstrauss compression; sign-consistent matrices; synaptic-connectivity matrices.

MeSH terms

  • Models, Biological*
  • Neurosciences*