Limitations and Future Aspects of Communication Costs in Federated Learning: A Survey

Sensors (Basel). 2023 Aug 23;23(17):7358. doi: 10.3390/s23177358.

Abstract

This paper explores the potential for communication-efficient federated learning (FL) in modern distributed systems. FL is an emerging distributed machine learning technique that trains a single model across multiple geographically dispersed clients without centralizing their data. This paper surveys the main approaches to communication-efficient FL, including model-update strategies, compression techniques, resource management for the edge and cloud, and client selection. We also review the optimization techniques associated with communication-efficient FL, such as compression schemes and structured updates. Finally, we highlight current research challenges and discuss potential future directions for communication-efficient FL.
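As a concrete illustration of the compression and structured-update techniques this survey covers, the sketch below shows top-k sparsification, one widely used scheme for reducing FL upload cost (this is an illustrative example, not a method proposed by the survey; the function names and the NumPy-based implementation are assumptions). A client transmits only the k largest-magnitude entries of its model update, and the server reconstructs a dense vector before aggregation.

```python
import numpy as np

def top_k_sparsify(update, k):
    """Keep only the k largest-magnitude entries of a model update.

    The client sends (indices, values) instead of the full dense
    update, cutting upload cost from len(update) floats to k
    index/value pairs. (Illustrative sketch of one common scheme.)
    """
    idx = np.argsort(np.abs(update))[-k:]  # indices of the k largest magnitudes
    return idx, update[idx]

def densify(indices, values, size):
    """Server-side reconstruction of the sparse update."""
    dense = np.zeros(size)
    dense[indices] = values
    return dense

# Example: a 10-parameter update compressed to its 3 largest entries.
update = np.array([0.1, -2.0, 0.03, 0.5, -0.7, 1.5, 0.0, -0.02, 0.9, 0.04])
idx, vals = top_k_sparsify(update, k=3)
recovered = densify(idx, vals, update.size)
```

In practice such schemes are often combined with error feedback (accumulating the dropped residual locally for the next round) so that repeated sparsification does not bias the training.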

Keywords: client selection; communication-efficient; federated learning; model compression; resource management; structured updates.

Publication types

  • Review

Grants and funding

These research results were obtained from research commissioned by the National Institute of Information and Communications Technology (NICT), Japan.