An automated ICU agitation monitoring system for video streaming using deep learning classification

BMC Med Inform Decis Mak. 2024 Mar 18;24(1):77. doi: 10.1186/s12911-024-02479-2.

Abstract

Objective: To address the challenge of assessing sedation status in critically ill patients in the intensive care unit (ICU), we aimed to develop a non-contact, automatic agitation classifier based on artificial intelligence and deep learning.

Methods: We collected video recordings of ICU patients and cut them into 30-second (30-s) and 2-second (2-s) segments. Each segment was annotated for agitation status as "Attention" or "Non-attention". After converting the video segments into quantified movement signals, we built agitation classifiers using a Threshold method, Random Forest, and LSTM, and evaluated their performance (a minimal sketch of this pipeline is given below).
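
The following sketch illustrates one plausible realization of this pipeline, assuming movement is quantified by mean absolute frame differencing with OpenCV and the resulting per-segment motion series is classified by a small PyTorch LSTM; segment length, frame rate, and model size here are illustrative, not the authors' actual settings.

```python
# Minimal sketch (assumptions): motion per frame via absolute frame differencing,
# then a small LSTM that classifies a segment as "Attention" vs "Non-attention".
import cv2
import numpy as np
import torch
import torch.nn as nn

def movement_signal(video_path: str) -> np.ndarray:
    """Return a 1-D series of mean absolute frame differences (motion per frame)."""
    cap = cv2.VideoCapture(video_path)
    prev, values = None, []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev is not None:
            values.append(float(np.mean(cv2.absdiff(gray, prev))))
        prev = gray
    cap.release()
    return np.asarray(values, dtype=np.float32)

class AgitationLSTM(nn.Module):
    """Binary classifier over a per-frame motion series."""
    def __init__(self, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                     # x: (batch, time, 1)
        _, (h, _) = self.lstm(x)
        return self.head(h[-1]).squeeze(-1)   # one logit per segment

# Usage (hypothetical file name):
# motion = movement_signal("segment_2s.mp4")
# logit = AgitationLSTM()(torch.tensor(motion).view(1, -1, 1))
# prob_attention = torch.sigmoid(logit)
```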

Results: Segmentation of the video recordings from 61 patients yielded 427 30-s and 6405 2-s segments for model construction. The LSTM model achieved the best performance (ACC 0.92, AUC 0.91), outperforming the other methods.

Conclusion: Our study proposes a monitoring system that combines LSTM-based classification with image processing to help keep ICU patients only mildly sedated. Among the classifiers evaluated, the LSTM proved the most accurate. Future work should prioritize expanding data collection and improving system integration for practical deployment.

Keywords: Deep learning; ICU; Motion detection; RASS; Video streaming data.

MeSH terms

  • Artificial Intelligence
  • Critical Care
  • Deep Learning*
  • Humans
  • Intensive Care Units
  • Psychomotor Agitation* / diagnosis