A Gradient-Guided Evolutionary Neural Architecture Search

IEEE Trans Neural Netw Learn Syst. 2024 Mar 11:PP. doi: 10.1109/TNNLS.2024.3371432. Online ahead of print.

Abstract

Neural architecture search (NAS) is a popular method that can automatically design deep neural network structures. However, designing a neural network using NAS is computationally expensive. This article proposes a gradient-guided evolutionary NAS (GENAS) to design convolutional neural networks (CNNs) for image classification. GENAS is a hybrid algorithm that combines evolutionary global and local search operators to evolve a population of subnets sampled from a supernet. Each candidate architecture is encoded as a table specifying which operations are assigned to the edges between nodes, where the nodes represent feature maps. In addition, the evolutionary optimization uses novel crossover and mutation operators that manipulate the subnets through the proposed tabular encoding. Every n generations, the candidate architectures undergo a local search inspired by differentiable NAS. GENAS is designed to overcome the limitations of both evolutionary and gradient-descent NAS. This algorithmic structure enables the performance of a candidate architecture to be assessed without retraining, thus reducing the NAS computation time. Furthermore, subnet individuals are decoupled during evaluation to prevent strong coupling of operations in the supernet. The experimental results indicate that the searched structures achieve test errors of 2.45%, 16.86%, and 23.9% on the CIFAR-10, CIFAR-100, and ImageNet datasets, respectively, while the search costs only 0.26 GPU days on a single graphics card. GENAS can effectively expedite the training and evaluation processes and obtain high-performance network structures.
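To make the tabular encoding and the evolutionary operators concrete, the following is a minimal sketch, not the paper's actual implementation: the operation set, function names, and the uniform-crossover and point-mutation schemes are illustrative assumptions, and the gradient-guided local search over the supernet is omitted.

```python
import random

# Illustrative candidate operation set; the operations searched by GENAS
# may differ from this assumed list.
OPS = ["none", "skip_connect", "sep_conv_3x3", "sep_conv_5x5",
       "dil_conv_3x3", "max_pool_3x3", "avg_pool_3x3"]

def random_architecture(num_nodes=4):
    """Tabular encoding: each edge (i, j) between feature-map nodes
    is assigned one operation sampled from the supernet's choices."""
    return {(i, j): random.choice(OPS)
            for j in range(1, num_nodes)
            for i in range(j)}

def mutate(arch, rate=0.1):
    """Point mutation (assumed scheme): resample the operation on
    each edge of the table with a small probability."""
    child = dict(arch)
    for edge in child:
        if random.random() < rate:
            child[edge] = random.choice(OPS)
    return child

def crossover(parent_a, parent_b):
    """Uniform crossover (assumed scheme): each edge inherits its
    operation from one of the two parent tables at random."""
    return {edge: random.choice((parent_a[edge], parent_b[edge]))
            for edge in parent_a}

# Example usage: produce one offspring subnet from two random parents.
a, b = random_architecture(), random_architecture()
offspring = mutate(crossover(a, b))
```

Because each individual is just a table of edge-to-operation assignments, an offspring subnet can be instantiated by selecting the corresponding paths in the shared supernet and evaluated with inherited weights, which is what allows candidates to be assessed without retraining.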