Detection of Respiratory Infections Using RGB-Infrared Sensors on Portable Device

IEEE Sens J. 2020 Jun 24;20(22):13674-13681. doi: 10.1109/JSEN.2020.3004568. eCollection 2020 Nov 15.

Abstract

Coronavirus Disease 2019 (COVID-19), caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), has become a serious global pandemic in the past few months and has caused enormous losses to human society worldwide. For such a large-scale pandemic, early detection and isolation of potential virus carriers is essential to curb its spread. Recent studies have shown that one important feature of COVID-19 is the abnormal respiratory status caused by viral infection. During the pandemic, many people wear masks to reduce the risk of getting sick. Therefore, in this paper, we propose a portable non-contact method to screen the health conditions of people wearing masks by analyzing respiratory characteristics captured with RGB-infrared sensors. We first develop a respiratory data capture technique for people wearing masks by using face recognition. Then, a bidirectional GRU neural network with an attention mechanism is applied to the respiratory data to obtain the health screening result. Validation experiments show that our model can identify respiratory health status with 83.69% accuracy, 90.23% sensitivity, and 76.31% specificity on a real-world dataset. This work demonstrates that the proposed RGB-infrared sensors on a portable device can be used as a pre-screening method for respiratory infections, which provides a theoretical basis to encourage controlled clinical trials and thus helps fight the current COVID-19 pandemic. Demo videos of the proposed system are available at: https://doi.org/10.6084/m9.figshare.12028032.
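The screening model described above, a bidirectional GRU with an attention mechanism over the extracted respiratory waveform, can be sketched as follows. This is a minimal NumPy illustration with randomly initialized, untrained weights; the layer sizes, dot-product attention form, and synthetic input waveform are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def init_gru(d_in, d_h):
    """Random GRU parameters: (W, U, b) for update, reset, candidate gates."""
    g = lambda *shape: rng.normal(0.0, 0.1, shape)
    return [g(d_h, d_in), g(d_h, d_h), np.zeros(d_h),   # update gate z
            g(d_h, d_in), g(d_h, d_h), np.zeros(d_h),   # reset gate r
            g(d_h, d_in), g(d_h, d_h), np.zeros(d_h)]   # candidate state

def gru_layer(x, p):
    """Run a single-direction GRU over x of shape (T, d_in); return (T, d_h)."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = p
    h, states = np.zeros(bz.shape[0]), []
    for xt in x:
        z = sigmoid(Wz @ xt + Uz @ h + bz)            # update gate
        r = sigmoid(Wr @ xt + Ur @ h + br)            # reset gate
        h_cand = np.tanh(Wh @ xt + Uh @ (r * h) + bh) # candidate state
        h = (1.0 - z) * h + z * h_cand
        states.append(h)
    return np.stack(states)

def bigru_attention(x, p_fwd, p_bwd, w_att, w_out, b_out):
    """Bidirectional GRU + dot-product attention + sigmoid health score."""
    # Concatenate forward and time-reversed backward hidden states: (T, 2*d_h).
    H = np.concatenate([gru_layer(x, p_fwd),
                        gru_layer(x[::-1], p_bwd)[::-1]], axis=1)
    scores = H @ w_att                    # one attention score per time step
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()                  # softmax attention weights
    context = alpha @ H                   # attention-weighted sequence summary
    return sigmoid(w_out @ context + b_out), alpha

# Toy usage: a 60-sample synthetic breathing waveform stands in for the
# respiratory signal the real system extracts from masked-face thermal frames.
T, d_h = 60, 16
x = np.sin(np.linspace(0, 6 * np.pi, T))[:, None]     # shape (T, 1)
p_fwd, p_bwd = init_gru(1, d_h), init_gru(1, d_h)
prob, alpha = bigru_attention(x, p_fwd, p_bwd,
                              rng.normal(0, 0.1, 2 * d_h),
                              rng.normal(0, 0.1, 2 * d_h), 0.0)
```

With trained weights, `prob` would be read as the probability of abnormal respiratory status; the attention weights `alpha` indicate which parts of the breathing cycle the classifier relies on.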

Keywords: COVID-19 pandemic; SARS-CoV-2; deep learning; dual-mode imaging; health screening; recurrent neural network; respiratory state; thermal imaging.

Associated data

  • figshare/10.6084/m9.figshare.12028032

Grants and funding

This work was supported in part by the National Natural Science Foundation of China under Grant 61901172, Grant 61831015, and Grant U1908210, in part by the Shanghai Sailing Program under Grant 19YF1414100, in part by the Science and Technology Commission of Shanghai Municipality (STCSM) under Grant 18DZ2270700 and Grant 19511120100, in part by the Foundation of Key Laboratory of Artificial Intelligence, Ministry of Education under Grant AI2019002, and in part by the Fundamental Research Funds for the Central Universities.