A Review of Privacy Enhancement Methods for Federated Learning in Healthcare Systems

Int J Environ Res Public Health. 2023 Aug 7;20(15):6539. doi: 10.3390/ijerph20156539.

Abstract

Federated learning (FL) provides a distributed machine learning framework that enables participants to train a shared model using their local data, eliminating the requirement for data sharing. In healthcare systems, FL allows Medical Internet of Things (MIoT) devices and electronic health records (EHRs) to be trained on locally without sending patients' data to a central server. This enables healthcare decisions and diagnoses based on datasets from all participants, as well as streamlining other healthcare processes. In terms of user data privacy, this technology allows collaborative training without the need to share local data with the central server. However, privacy challenges remain in FL: the model updates exchanged between the clients and the server can be used to regenerate a client's data, breaching the privacy requirements of applications in domains such as healthcare. In this paper, we review the literature to analyse the existing privacy and security enhancement methods proposed for FL in healthcare systems. We identify that research in this domain focuses on seven techniques: Differential Privacy, Homomorphic Encryption, Blockchain, Hierarchical Approaches, Peer-to-Peer Sharing, Intelligence on the Edge Device, and Mixed, Hybrid, and Miscellaneous Approaches. We discuss the strengths, limitations, and trade-offs of each technique and identify possible future directions for these seven privacy enhancement techniques in healthcare FL systems.
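To make the core privacy concern concrete, the sketch below illustrates one of the surveyed techniques, differential privacy, applied to a client's model update before it is sent to the server: the update is clipped to bound the client's contribution and then masked with Gaussian noise so the raw update (and hence the underlying patient data) cannot be recovered. All function names and parameter values here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def privatize_update(update, clip_norm=1.0, noise_scale=0.5, rng=None):
    """Clip a client's model update and add Gaussian noise before sharing.

    A minimal sketch of the differential-privacy idea: clipping bounds
    each client's influence, and the added noise masks the exact update
    so the server cannot reconstruct the client's local data from it.
    Parameter values are illustrative, not from the reviewed paper.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    update = np.asarray(update, dtype=float)
    norm = np.linalg.norm(update)
    # Scale the update down so its L2 norm is at most clip_norm.
    clipped = update * min(1.0, clip_norm / norm) if norm > 0 else update
    # Add Gaussian noise calibrated to the clipping bound.
    return clipped + rng.normal(0.0, noise_scale * clip_norm, size=update.shape)

# Example: a toy gradient update from one client (norm 5, clipped to norm 1)
noisy = privatize_update([3.0, 4.0])
```

In a real deployment the noise scale would be derived from a target privacy budget (epsilon, delta); here it is fixed purely for illustration.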

Keywords: P2PS; blockchain; differential privacy; edge device; edge federated learning; federated learning; homomorphic encryption; privacy enhancement.

Publication types

  • Review

MeSH terms

  • Blockchain*
  • Computer Communication Networks
  • Delivery of Health Care
  • Electronic Health Records
  • Humans
  • Privacy*

Grants and funding

This research received no external funding.