Dominating Set Model Aggregation for communication-efficient decentralized deep learning

Neural Netw. 2024 Mar;171:25-39. doi: 10.1016/j.neunet.2023.11.057. Epub 2023 Nov 27.

Abstract

Decentralized deep learning algorithms leverage peer-to-peer communication of model parameters and/or gradients over a communication graph connecting learning agents, each with access to its own private data set. The majority of studies in this area focus on achieving high accuracy, often at the expense of increased communication overhead among the agents. However, large peer-to-peer communication overhead becomes a practical challenge, especially in harsh environments such as underwater sensor networks. In this paper, we aim to reduce communication overhead while achieving performance similar to state-of-the-art algorithms. To this end, we use the concept of a Minimum Connected Dominating Set from graph theory, which is applied in ad hoc wireless networks to address communication overhead. Specifically, we propose a new decentralized deep learning algorithm called minimum connected Dominating Set Model Aggregation (DSMA). We investigate the efficacy of our method on different communication graph topologies, with small to large numbers of agents and varied neural network architectures. Empirical results on benchmark data sets show a significant (up to 100X) reduction in communication time while preserving or, in some cases, increasing accuracy compared to state-of-the-art methods. We also present a convergence analysis of the proposed algorithm.
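The core idea of the abstract, restricting peer-to-peer exchange to a connected dominating set (CDS) of the communication graph, can be illustrated with a short sketch. The Python snippet below is a minimal illustration under stated assumptions and is not the authors' DSMA implementation: it uses a simple greedy heuristic for an approximate minimum connected dominating set on a connected, undirected graph given as adjacency sets, and a toy aggregation round in which only CDS agents broadcast their parameters (NumPy arrays) while every other agent merely receives the aggregated model from a dominating neighbor.

    import numpy as np

    def greedy_connected_dominating_set(adj):
        # Greedy CDS heuristic: start from the highest-degree node and repeatedly
        # add the CDS-adjacent node that dominates the most still-uncovered nodes.
        # Assumes 'adj' describes a connected, undirected graph.
        nodes = set(adj)
        start = max(nodes, key=lambda v: len(adj[v]))
        cds = {start}
        covered = {start} | set(adj[start])
        while covered != nodes:
            frontier = {v for u in cds for v in adj[u] if v not in cds}
            best = max(frontier, key=lambda v: len(set(adj[v]) - covered))
            cds.add(best)
            covered |= {best} | set(adj[best])
        return cds

    def dsma_round(params, adj, cds):
        # One illustrative aggregation round: only CDS agents transmit their
        # parameters; the CDS average is relayed to the remaining agents by
        # their dominators, so non-CDS agents never broadcast.
        ref = next(iter(cds))
        cds_avg = {name: np.mean([params[a][name] for a in cds], axis=0)
                   for name in params[ref]}
        return {agent: {name: val.copy() for name, val in cds_avg.items()}
                for agent in adj}

    # Toy usage: 6 agents, each holding a single parameter vector.
    adj = {0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1, 4}, 3: {1, 5}, 4: {2, 5}, 5: {3, 4}}
    params = {a: {"w": np.full(3, float(a))} for a in adj}
    cds = greedy_connected_dominating_set(adj)   # e.g. {1, 2, 3}
    params = dsma_round(params, adj, cds)

In this sketch only the CDS agents exchange full models per round, rather than every agent exchanging with all of its neighbors, which is the source of the communication savings the abstract describes; gossip weighting, staleness handling, and how the dominating set is maintained in practice are left out.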

Keywords: Communication-efficient; Connected dominating set; Convergence rate; Decentralized learning; Distributed deep learning.

MeSH terms

  • Algorithms
  • Communication
  • Deep Learning*
  • Neural Networks, Computer