Training-Free Deep Generative Networks for Compressed Sensing of Neural Action Potentials

IEEE Trans Neural Netw Learn Syst. 2022 Oct;33(10):5190-5199. doi: 10.1109/TNNLS.2021.3069436. Epub 2022 Oct 5.

Abstract

Energy consumption is an important issue for resource-constrained wireless neural recording applications with limited data bandwidth. Compressed sensing (CS) is a promising framework for addressing this challenge because it can compress data in an energy-efficient way. Recent work has shown that deep neural networks (DNNs) can serve as valuable models for CS of neural action potentials (APs). However, these models typically require impractically large datasets and computational resources for training, and they do not easily generalize to novel circumstances. Here, we propose a new CS framework, termed APGen, for the reconstruction of APs in a training-free manner. It consists of a deep generative network and an analysis sparse regularizer. We validate our method on two in vivo datasets. Even without any training, APGen outperformed model-based and data-driven methods in terms of reconstruction accuracy, computational efficiency, and robustness to AP overlap and misalignment. The computational efficiency of APGen and its ability to perform without training make it an ideal candidate for long-term, resource-constrained, and large-scale wireless neural recording. It may also promote the development of real-time, naturalistic brain-computer interfaces.
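
The abstract describes a training-free reconstruction scheme built from an untrained deep generative network combined with an analysis sparse regularizer. Below is a minimal conceptual sketch of that kind of approach, written in a PyTorch style: an untrained decoder is optimized so that its output agrees with the compressed measurements while a finite-difference analysis penalty encourages a plausible waveform. The network architecture, the choice of analysis operator, all dimensions, and all hyperparameters are illustrative assumptions for exposition only, not the authors' APGen implementation.

# Hypothetical sketch of training-free CS reconstruction in the spirit of the
# abstract: optimize an untrained generator G so that Phi @ G(z) matches the
# compressed measurements y, plus an analysis sparsity penalty on G(z).
import torch
import torch.nn as nn

torch.manual_seed(0)

N = 64          # samples per action-potential window (assumed)
M = 16          # number of compressed measurements (assumed 4x compression)

# Fixed random sensing matrix (the energy-efficient on-sensor compression step)
Phi = torch.randn(M, N) / M ** 0.5

# Synthetic spike-like waveform as a stand-in ground truth (demonstration only)
t = torch.linspace(-1.0, 1.0, N)
x_true = torch.exp(-(t / 0.15) ** 2) - 0.4 * torch.exp(-((t - 0.3) / 0.25) ** 2)
y = Phi @ x_true                      # compressed measurements

# Small untrained decoder mapping a fixed random latent code to a waveform
class Generator(nn.Module):
    def __init__(self, latent_dim=8, out_dim=N):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, out_dim),
        )

    def forward(self, z):
        return self.net(z)

G = Generator()
z = torch.randn(8)                    # fixed latent input; only G's weights are optimized

def analysis_l1(x):
    # First-order finite-difference operator as a simple analysis regularizer (assumed)
    return (x[1:] - x[:-1]).abs().sum()

opt = torch.optim.Adam(G.parameters(), lr=1e-2)
lam = 1e-3                            # regularization weight (assumed)

for step in range(2000):
    opt.zero_grad()
    x_hat = G(z)
    loss = ((Phi @ x_hat - y) ** 2).sum() + lam * analysis_l1(x_hat)
    loss.backward()
    opt.step()

with torch.no_grad():
    err = (G(z) - x_true).pow(2).sum()
    snr = 10 * torch.log10(x_true.pow(2).sum() / err)
    print(f"reconstruction SNR: {snr.item():.1f} dB")

Because the generator is fitted per recording window rather than trained on a dataset, no labeled spike library is needed, which is consistent with the training-free property emphasized in the abstract; the specific optimizer, loss weighting, and network depth above are arbitrary choices.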

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Action Potentials / physiology
  • Neural Networks, Computer*