A Two-Stage Differential Privacy Scheme for Federated Learning Based on Edge Intelligence

IEEE J Biomed Health Inform. 2023 Aug 18:PP. doi: 10.1109/JBHI.2023.3306425. Online ahead of print.

Abstract

The issue of data privacy protection must be considered in distributed federated learning (FL) so as to ensure that sensitive information is not leaked. In this paper, we propose a two-stage differential privacy (DP) framework for FL based on edge intelligence. Various levels of privacy preservation can be provided according to the degree of data sensitivity. In the first stage, the randomized response mechanism is used by the user terminal to perturb the original feature data for desensitization, and the user can self-regulate the level of privacy preservation. In the second stage, noise is added to the local models by the edge server to further guarantee the privacy of the models. Finally, the model updates are aggregated in the cloud. To evaluate the performance of the proposed end-edge-cloud FL framework in terms of training accuracy and convergence, extensive experiments are conducted on a real electrocardiogram (ECG) signal dataset. A bidirectional long short-term memory (BiLSTM) neural network is adopted to train the classification model. The effect of different combinations of feature perturbation and noise addition on model accuracy is analyzed under different privacy budgets and parameters. The experimental results demonstrate that the proposed privacy-preserving framework provides good accuracy and convergence while ensuring privacy.
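The abstract does not specify the exact perturbation mechanisms, so the following is only a minimal sketch of the two stages under common assumptions: Warner-style randomized response for a binary feature in stage 1, and Laplace noise scaled by sensitivity/ε applied to local-model weights in stage 2. All function names, the choice of Laplace noise, and the binary-feature setting are illustrative assumptions, not the paper's actual method.

```python
import math
import random

def randomized_response(bit, epsilon):
    """Stage 1 (user terminal), assumed mechanism: Warner-style
    randomized response on a binary feature. The true value is reported
    with probability e^eps / (e^eps + 1) and flipped otherwise; a larger
    epsilon means weaker privacy but higher data fidelity."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if random.random() < p_truth else 1 - bit

def laplace_sample(scale):
    """Draw one sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def perturb_weights(weights, epsilon, sensitivity=1.0):
    """Stage 2 (edge server), assumed mechanism: add Laplace noise of
    scale sensitivity/epsilon to each local-model weight before the
    update is sent to the cloud for aggregation."""
    scale = sensitivity / epsilon
    return [w + laplace_sample(scale) for w in weights]
```

In such a scheme the user-side and edge-side privacy budgets can be set independently, which matches the abstract's claim that different levels of privacy preservation can be provided according to data sensitivity.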