Analysis of Privacy-Enhancing Technologies in Open-Source Federated Learning Frameworks for Driver Activity Recognition

Sensors (Basel). 2022 Apr 13;22(8):2983. doi: 10.3390/s22082983.

Abstract

Wearable devices and smartphones used to monitor the activity and state of the driver collect large amounts of sensitive data, such as audio, video, location, and even health data. Analysing and processing such data requires compliance with strict legal requirements for personal data security and privacy. The federated learning (FL) computation paradigm has been proposed as a privacy-preserving computational model that protects the privacy of the data owner. However, FL still lacks a formal proof of its privacy guarantees, and recent research has shown that attacks targeting both model integrity and the privacy of data owners can be performed at all stages of the FL process. This paper focuses on the analysis of the privacy-preserving techniques adopted for FL and presents a comparative review and analysis of their implementations in open-source FL frameworks. The authors evaluated their impact on the overall training process in terms of global model accuracy, training time, and network traffic generated during training in order to assess their applicability to monitoring the driver's state and behaviour. As a usage scenario, the authors considered driver activity monitoring using data from smartphone sensors. The experiments showed that the current implementation of privacy-preserving techniques in open-source FL frameworks limits the practical application of FL to cross-silo settings.

Keywords: differential privacy; driver activity monitoring; federated learning; homomorphic encryption; open-source federated learning frameworks; privacy; secure multi-party computations.

MeSH terms

  • Computer Security*
  • Learning
  • Privacy*
  • Recognition, Psychology
  • Technology