Automatic Fire Detection and Notification System Based on Improved YOLOv4 for the Blind and Visually Impaired

Sensors (Basel). 2022 Apr 26;22(9):3307. doi: 10.3390/s22093307.

Abstract

The growing aging population suffers from high rates of vision and cognitive impairment, often resulting in a loss of independence. Such individuals must perform crucial everyday tasks such as cooking and heating with systems and devices designed for sighted users, which do not account for the needs of persons with visual and cognitive impairment; as a result, visually impaired users of these devices face risks related to smoke and fire. In this paper, we propose a vision-based fire detection and notification system using smart glasses and deep learning models for blind and visually impaired (BVI) people. The system enables early detection of fires in indoor environments. To perform real-time fire detection and notification, the proposed system uses image brightness cues and a new convolutional neural network employing an improved YOLOv4 model with a convolutional block attention module. The h-swish activation function is used to reduce the running time and increase the robustness of YOLOv4. We adapt our previously developed smart glasses system to capture images and inform BVI people about fires and other surrounding objects through auditory messages. We create a large image dataset of indoor fire scenes to support accurate fire detection. Furthermore, we develop an object mapping approach to provide BVI people with complete information about surrounding objects and to differentiate between hazardous and nonhazardous fires. The proposed system shows an improvement over other well-known approaches across fire detection metrics, including precision, recall, and average precision.
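For readers unfamiliar with the two architectural changes named in the abstract, the PyTorch sketch below shows what an h-swish activation and a convolutional block attention module (CBAM) look like in isolation. This is a reconstruction of the standard published forms of these components (h-swish as defined in MobileNetV3, CBAM as defined by Woo et al., 2018), not the authors' implementation; the class names, reduction ratio, and spatial kernel size are illustrative placeholders, and where such a block sits inside the YOLOv4 backbone or neck follows the paper, not this sketch.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HSwish(nn.Module):
    # h-swish(x) = x * ReLU6(x + 3) / 6: a cheap piecewise-linear
    # approximation of swish, used here in place of YOLOv4's heavier
    # activations to cut inference time.
    def forward(self, x):
        return x * F.relu6(x + 3.0) / 6.0

class ChannelAttention(nn.Module):
    # Squeeze global average- and max-pooled descriptors through a
    # shared MLP, then rescale channels by the resulting weights.
    def __init__(self, channels, reduction=16):  # reduction=16 is the CBAM paper default, not from this article
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(F.adaptive_avg_pool2d(x, 1).view(b, c))
        mx = self.mlp(F.adaptive_max_pool2d(x, 1).view(b, c))
        return x * torch.sigmoid(avg + mx).view(b, c, 1, 1)

class SpatialAttention(nn.Module):
    # Pool across channels (mean and max), concatenate the two maps,
    # and convolve to a single spatial attention mask.
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)
        mx = x.max(dim=1, keepdim=True).values
        return x * torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))

class CBAM(nn.Module):
    # Convolutional Block Attention Module: channel attention followed
    # by spatial attention, applied to a feature map of `channels` depth.
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.ca = ChannelAttention(channels, reduction)
        self.sa = SpatialAttention()

    def forward(self, x):
        return self.sa(self.ca(x))

# Example: refine a 256-channel feature map, e.g. one YOLOv4 neck output.
feats = torch.randn(1, 256, 52, 52)
refined = CBAM(256)(HSwish()(feats))
```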

Keywords: CNN; assistive technologies; blind and visually impaired; deep learning; fire detection; object detection; smart glasses.

MeSH terms

  • Aged
  • Fires*
  • Humans
  • Neural Networks, Computer
  • Visually Impaired Persons*