An Evaluation of Non-Contact Photoplethysmography-Based Methods for Remote Respiratory Rate Estimation

Sensors (Basel). 2023 Mar 23;23(7):3387. doi: 10.3390/s23073387.

Abstract

The respiration rate (RR) is one of the physiological signals worth monitoring to assess human health and emotional states. However, traditional devices, such as a respiration belt worn around the chest, are not always a feasible solution (e.g., in telemedicine, or because of device discomfort). Recently, novel approaches have been proposed that aim to estimate RR in a less invasive yet reliable way, requiring the acquisition and processing of contact or remote photoplethysmography signals (contact PPG and remote-PPG, respectively). The aim of this paper is to address the lack of systematic evaluation of the proposed methods on publicly available datasets, which currently impedes a fair comparison among them. In particular, we evaluate two prominent families of PPG processing methods estimating Respiratory-Induced Variations (RIVs): the first encompasses methods based on the direct extraction of morphological features related to the RR; the second includes methods modeling respiratory artifacts by adopting, in the most promising cases, single-channel blind source separation. Extensive experiments have been carried out on the public BP4D+ dataset, showing that the morphological estimation of RIVs is more reliable than the estimates produced by a single-channel blind source separation method (in both the contact and remote testing phases), as well as by a representative state-of-the-art Deep Learning-based approach for remote respiratory information estimation.
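To make the morphological (RIV-based) family of methods concrete, the sketch below shows one common way to derive an RR estimate from a PPG segment: systolic peaks are detected, amplitude-modulation and frequency-modulation RIV series are built from the peak heights and peak-to-peak intervals, and the dominant frequency within the respiratory band is converted to breaths per minute. This is a minimal illustration in Python (NumPy/SciPy) under stated assumptions; the function name, resampling rate, band limits, and the simple averaging of the two estimates are illustrative choices, not the pipeline evaluated in the paper.

    import numpy as np
    from scipy.signal import find_peaks, welch

    def estimate_rr_from_ppg(ppg, fs, rr_band=(0.1, 0.5)):
        """Estimate respiration rate (breaths/min) from a PPG segment.

        Minimal RIV-based sketch (assumed, not the paper's implementation):
        detect systolic peaks, build amplitude- and frequency-modulation
        series, resample them uniformly, and pick the dominant frequency
        in the respiratory band.
        """
        ppg = np.asarray(ppg, dtype=float)

        # Detect systolic peaks; assume pulses are at least ~0.33 s apart (< 180 bpm).
        peaks, _ = find_peaks(ppg, distance=max(1, int(0.33 * fs)))
        if len(peaks) < 4:
            raise ValueError("Too few pulses to estimate RR.")

        t_peaks = peaks / fs
        am = ppg[peaks]            # Amplitude Modulation (AM) RIV: peak heights
        fm = np.diff(t_peaks)      # Frequency Modulation (FM) RIV: inter-beat intervals
        t_fm = t_peaks[1:]

        fs_riv = 4.0               # uniform resampling rate for the RIV series (Hz), assumed
        t_uniform = np.arange(t_peaks[0], t_peaks[-1], 1.0 / fs_riv)

        rr_bpm = []
        for t_src, riv in ((t_peaks, am), (t_fm, fm)):
            riv_uniform = np.interp(t_uniform, t_src, riv)
            riv_uniform -= riv_uniform.mean()
            # Dominant frequency in the respiratory band via Welch's PSD.
            f, pxx = welch(riv_uniform, fs=fs_riv, nperseg=min(len(riv_uniform), 256))
            mask = (f >= rr_band[0]) & (f <= rr_band[1])
            rr_bpm.append(60.0 * f[mask][np.argmax(pxx[mask])])

        # Fuse the AM- and FM-based estimates by simple averaging (assumed fusion rule).
        return float(np.mean(rr_bpm))

For example, on a 60 s PPG segment sampled at 30 Hz, the function would return a single RR value in breaths per minute; a single-channel blind source separation method would instead decompose the raw PPG into components and select the respiratory one, which is the alternative family compared in the paper.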

Keywords: contactless respiration monitoring; empirical mode decomposition; incremental merge segmentation; pyVHR; remote photoplethysmography; remote respiratory rate estimation; singular spectrum analysis; vital signs from video.

MeSH terms

  • Algorithms*
  • Heart Rate / physiology
  • Humans
  • Photoplethysmography / methods
  • Respiratory Rate / physiology
  • Signal Processing, Computer-Assisted*

Grants and funding

This research received no external funding.