Enhanced Gradient for Differentiable Architecture Search

IEEE Trans Neural Netw Learn Syst. 2023 Jan 17:PP. doi: 10.1109/TNNLS.2023.3235479. Online ahead of print.

Abstract

In recent years, neural architecture search (NAS) methods have been proposed for the automatic generation of task-oriented network architectures in image classification. However, the architectures obtained by existing NAS approaches are optimized only for classification performance and do not adapt to devices with limited computational resources. To address this challenge, we propose a neural architecture search algorithm that aims to simultaneously improve network performance and reduce network complexity. The proposed framework builds the network architecture automatically in two stages: block-level search and network-level search. In the block-level search stage, a gradient-based relaxation method is proposed that uses an enhanced gradient to design high-performance, low-complexity blocks. In the network-level search stage, an evolutionary multiobjective algorithm is used to complete the automatic design from blocks to the target network. The experimental results demonstrate that our method outperforms all evaluated hand-crafted networks in image classification, with an error rate of 3.18% on Canadian Institute for Advanced Research (CIFAR10) and an error rate of 19.16% on CIFAR100, both with fewer than 1 M network parameters. Compared with other NAS methods, our method thus yields a substantial reduction in the number of parameters of the designed architecture.
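The block-level search described above relies on a gradient-based (differentiable) relaxation of the discrete choice among candidate operations. The paper's specific enhanced gradient is not detailed in the abstract, so the following is only a minimal sketch of the standard DARTS-style softmax relaxation on a single block edge; the class name `MixedOp`, the candidate-operation list, and the training setup are illustrative assumptions rather than the authors' implementation.

```python
# Minimal sketch of a softmax-relaxed mixed operation (DARTS-style).
# This shows only the generic gradient-based relaxation; the paper's
# "enhanced gradient" modification is NOT reproduced here.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MixedOp(nn.Module):
    """One edge of a searchable block: a weighted mixture of candidate ops."""

    def __init__(self, channels: int):
        super().__init__()
        # Illustrative subset of candidate operations (assumed, not from the paper).
        self.ops = nn.ModuleList([
            nn.Identity(),                                            # skip connection
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),  # 3x3 conv
            nn.Conv2d(channels, channels, 5, padding=2, bias=False),  # 5x5 conv
            nn.AvgPool2d(3, stride=1, padding=1),                     # 3x3 avg pool
        ])
        # Architecture parameters: one logit per candidate operation.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Continuous relaxation: weight each op's output by softmax(alpha),
        # making the architecture choice differentiable so it can be
        # optimized by gradient descent alongside the network weights.
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))


if __name__ == "__main__":
    x = torch.randn(2, 16, 32, 32)
    edge = MixedOp(channels=16)
    print(edge(x).shape)  # torch.Size([2, 16, 32, 32])
```

After search, the relaxed edge is typically discretized by keeping the operation with the largest architecture weight; complexity-aware variants such as the one summarized above additionally penalize or constrain the parameter cost of the selected operations.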