Exploiting Operation Importance for Differentiable Neural Architecture Search

IEEE Trans Neural Netw Learn Syst. 2022 Nov;33(11):6235-6248. doi: 10.1109/TNNLS.2021.3072950. Epub 2022 Oct 27.

Abstract

Recently, differentiable neural architecture search (NAS) methods have made significant progress in reducing the computational cost of NAS. Existing methods search for the best architecture by choosing the candidate operations with the highest architecture weights. However, architecture weights cannot accurately reflect the importance of each operation; that is, the operation with the highest weight might not be the one that yields the best performance. To circumvent this deficiency, we propose a novel indicator that fully represents operation importance and thus serves as an effective metric to guide the model search. Based on this indicator, we further develop a NAS scheme for "exploiting operation importance for effective NAS" (EoiNAS). More precisely, we propose a high-order Markov chain-based strategy to slim the search space, further improving search efficiency and accuracy. To evaluate the effectiveness of the proposed EoiNAS, we applied our method to two tasks: image classification and semantic segmentation. Extensive experiments on both tasks provide strong evidence that our method is capable of discovering high-performance architectures while guaranteeing the requisite efficiency during the search.
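For context, the weight-based selection rule that the abstract critiques can be sketched as follows. This is a minimal illustrative example (not the authors' code): in DARTS-style differentiable NAS, each edge mixes candidate operations with softmax-normalized architecture parameters, and the final architecture keeps the operation with the largest weight. The operation names and parameter values below are hypothetical.

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of floats."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def select_by_weight(alpha, ops):
    """Standard argmax-over-architecture-weights selection.

    EoiNAS argues this rule can mislead: the operation with the
    highest weight need not be the best-performing one, which is
    why the paper introduces a separate importance indicator.
    """
    weights = softmax(alpha)
    best = max(range(len(ops)), key=lambda i: weights[i])
    return ops[best], weights

# Illustrative candidate set and learned parameters for one edge.
ops = ["skip_connect", "sep_conv_3x3", "max_pool_3x3"]
alpha = [0.1, 1.2, -0.5]
chosen, w = select_by_weight(alpha, ops)
```

Here the argmax rule picks `sep_conv_3x3` because it holds the largest softmax weight, regardless of how that operation would actually perform in the derived architecture.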

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Markov Chains
  • Neural Networks, Computer*