DSFedCon: Dynamic Sparse Federated Contrastive Learning for Data-Driven Intelligent Systems

IEEE Trans Neural Netw Learn Syst. 2024 Jan 26:PP. doi: 10.1109/TNNLS.2024.3349400. Online ahead of print.

Abstract

Federated learning (FL) enables multiple clients to collaboratively train a machine-learning model by exchanging models rather than raw data, reducing privacy risk. This makes FL well suited to intelligent systems and applications that must preserve data security and privacy. Unfortunately, FL faces several challenges, such as low training accuracy on non-independent and identically distributed (non-IID) data and high computation and communication costs. To address these challenges, we propose a novel FL framework named dynamic sparse federated contrastive learning (DSFedCon). DSFedCon combines FL with dynamic sparse training based on network pruning and with contrastive learning to improve model performance while reducing computation and communication costs. We analyze DSFedCon from the perspectives of accuracy, communication, and security, showing that it is communication-efficient and secure. To provide a practical evaluation of non-IID training, we perform experiments and comparisons on the MNIST, CIFAR-10, and CIFAR-100 datasets partitioned with different Dirichlet-distribution parameters. The results indicate that DSFedCon achieves higher accuracy at lower communication cost than other state-of-the-art methods on all three datasets. More precisely, DSFedCon attains a 4.67-fold speedup in communication rounds on MNIST, a 7.5-fold speedup on CIFAR-10, and an 18.33-fold speedup on CIFAR-100 while reaching the same training accuracy.
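
The abstract does not spell out how the Dirichlet parameters produce non-IID client data, so the following is a minimal Python sketch of the standard Dirichlet-based partitioning scheme commonly used in FL evaluations; the function name dirichlet_partition and the parameters alpha and seed are illustrative choices, not taken from the paper.

    # Illustrative sketch (not from the paper): split a labeled dataset across
    # clients using a Dirichlet prior, the usual way to simulate non-IID
    # federated data. alpha is the Dirichlet concentration parameter; a smaller
    # alpha yields more skewed (more non-IID) client label distributions.
    import numpy as np

    def dirichlet_partition(labels, num_clients, alpha, seed=0):
        """Return a list of sample-index arrays, one per client."""
        rng = np.random.default_rng(seed)
        labels = np.asarray(labels)
        client_indices = [[] for _ in range(num_clients)]
        for cls in np.unique(labels):
            cls_idx = rng.permutation(np.flatnonzero(labels == cls))
            # Draw each client's share of this class from Dirichlet(alpha).
            proportions = rng.dirichlet(alpha * np.ones(num_clients))
            splits = (np.cumsum(proportions)[:-1] * len(cls_idx)).astype(int)
            for client, part in enumerate(np.split(cls_idx, splits)):
                client_indices[client].extend(part.tolist())
        return [np.array(idx) for idx in client_indices]

    # Example usage: 10 clients with a moderately skewed split.
    # parts = dirichlet_partition(train_labels, num_clients=10, alpha=0.5)

Sweeping alpha (e.g., from 0.1 to 10) controls the degree of heterogeneity, which is presumably how the "different parameters of Dirichlet distribution" in the evaluation would be varied.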