AANet: Adaptive Attention Network for COVID-19 Detection From Chest X-Ray Images

IEEE Trans Neural Netw Learn Syst. 2021 Nov;32(11):4781-4792. doi: 10.1109/TNNLS.2021.3114747. Epub 2021 Oct 27.

Abstract

Accurate and rapid diagnosis of COVID-19 using chest X-ray (CXR) plays an important role in large-scale screening and epidemic prevention. Unfortunately, identifying COVID-19 from CXR images is challenging because its radiographic features exhibit a variety of complex appearances, such as widespread ground-glass opacities and diffuse reticular-nodular opacities. To solve this problem, we propose an adaptive attention network (AANet), which can adaptively extract the characteristic radiographic findings of COVID-19 from infected regions with varying scales and appearances. It contains two main components: an adaptive deformable ResNet and an attention-based encoder. First, the adaptive deformable ResNet, which adaptively adjusts its receptive fields to learn feature representations according to the shape and scale of the infected regions, is designed to handle the diversity of COVID-19 radiographic features. Then, the attention-based encoder is developed to model nonlocal interactions via a self-attention mechanism, learning rich context information to detect lesion regions with complex shapes. Extensive experiments on several public datasets show that the proposed AANet outperforms state-of-the-art methods.
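The abstract describes the architecture only at a high level. As a rough illustration, the sketch below shows, in PyTorch-style Python, the two ideas named above: a deformable convolution block whose sampling offsets are learned from the input (the "adaptive deformable" idea) and a self-attention encoder layer applied over the flattened feature map (the "attention-based encoder" idea). This is not the authors' implementation; the module names, channel sizes, and hyperparameters are assumptions made for illustration.

```python
# Hypothetical sketch of the two AANet components described in the abstract.
# NOT the authors' code; names and sizes are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision.ops import DeformConv2d


class AdaptiveDeformableBlock(nn.Module):
    """Deformable 3x3 convolution whose sampling offsets are predicted
    from the input, loosely mirroring the adaptive deformable ResNet idea."""

    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        # 2 offsets (x, y) per position of the 3x3 kernel -> 18 offset channels.
        self.offset_pred = nn.Conv2d(in_ch, 18, kernel_size=3, padding=1)
        self.deform_conv = DeformConv2d(in_ch, out_ch, kernel_size=3, padding=1)
        self.bn = nn.BatchNorm2d(out_ch)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        offsets = self.offset_pred(x)            # (N, 18, H, W)
        return self.relu(self.bn(self.deform_conv(x, offsets)))


class AttentionEncoder(nn.Module):
    """Self-attention over spatial positions of the feature map to model
    nonlocal interactions, in the spirit of the attention-based encoder."""

    def __init__(self, channels: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n, c, h, w = x.shape
        tokens = x.flatten(2).transpose(1, 2)    # (N, H*W, C)
        attn_out, _ = self.attn(tokens, tokens, tokens)
        tokens = self.norm(tokens + attn_out)    # residual connection + layer norm
        return tokens.transpose(1, 2).reshape(n, c, h, w)


if __name__ == "__main__":
    # Toy forward pass on a grayscale CXR-sized input.
    x = torch.randn(1, 1, 224, 224)
    feats = AdaptiveDeformableBlock(1, 32)(x)
    out = AttentionEncoder(32)(feats)
    print(out.shape)  # torch.Size([1, 32, 224, 224])
```

In this sketch the offsets let the 3x3 kernel sample off-grid locations, so the effective receptive field can deform to the lesion's shape and scale, while the attention layer relates every spatial position to every other one, which is one common way to capture the long-range context the abstract refers to.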

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • COVID-19 / diagnostic imaging*
  • COVID-19 / epidemiology
  • Databases, Factual / standards
  • Humans
  • Neural Networks, Computer*
  • Tomography, X-Ray Computed / classification*
  • Tomography, X-Ray Computed / methods
  • Tomography, X-Ray Computed / standards*
  • X-Rays