Split-Level Evolutionary Neural Architecture Search With Elite Weight Inheritance

IEEE Trans Neural Netw Learn Syst. 2023 May 24:PP. doi: 10.1109/TNNLS.2023.3269816. Online ahead of print.

Abstract

Neural architecture search (NAS) has recently attracted extensive interest in the deep learning community because of its great potential for automating the construction of deep models. Among the variety of NAS approaches, evolutionary computation (EC) plays a pivotal role owing to its gradient-free search ability. However, most current EC-based NAS approaches evolve neural architectures in a purely discrete manner, which makes it difficult to handle the number of filters in each layer flexibly: the filter count is typically restricted to a limited set rather than searched over all possible values. Moreover, EC-based NAS methods are often criticized for inefficient performance evaluation, which usually requires laborious full training of hundreds of generated candidate architectures. To address the inflexible search over the number of filters, this work proposes a split-level particle swarm optimization (PSO) approach. Each dimension of a particle is subdivided into an integer part and a fractional part, which encode the configuration of the corresponding layer and the number of filters within a large range, respectively. In addition, evaluation time is greatly reduced by a novel elite weight inheritance method based on an online-updating weight pool, and a customized multi-objective fitness function is developed to control the complexity of the searched candidate architectures. The proposed method, termed split-level evolutionary NAS (SLE-NAS), is computationally efficient and outperforms many state-of-the-art peer competitors at much lower complexity on three popular image classification benchmark datasets.
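
The abstract does not give the exact decoding rule for the split-level encoding, but the idea can be illustrated with a minimal sketch: the integer part of a particle dimension indexes a layer-configuration table, while the fractional part maps linearly onto a filter-count range, so every filter count in that range is reachable. All names below (LAYER_CONFIGS, MIN_FILTERS, MAX_FILTERS, decode_dimension) are illustrative assumptions, not the paper's actual code.

```python
import math

# Hypothetical table of layer configurations indexed by the integer part.
# The actual configuration space in SLE-NAS may differ.
LAYER_CONFIGS = [
    {"type": "conv3x3"},
    {"type": "conv5x5"},
    {"type": "depthwise_sep_conv"},
    {"type": "max_pool"},
]

MIN_FILTERS, MAX_FILTERS = 16, 512  # assumed filter-count range


def decode_dimension(x: float) -> dict:
    """Decode one particle dimension into a layer specification.

    The integer part selects a layer configuration; the fractional part
    is mapped linearly onto [MIN_FILTERS, MAX_FILTERS], so any filter
    count in that range is reachable (no discrete whitelist of values).
    """
    integer_part = int(math.floor(x))
    fractional_part = x - integer_part

    config = LAYER_CONFIGS[integer_part % len(LAYER_CONFIGS)]
    n_filters = round(MIN_FILTERS + fractional_part * (MAX_FILTERS - MIN_FILTERS))
    return {**config, "filters": n_filters}


if __name__ == "__main__":
    # e.g. 2.37 -> depthwise separable conv with ~200 filters
    print(decode_dimension(2.37))
```

Because both parts live in one real-valued dimension, standard continuous PSO velocity updates can move a particle smoothly through filter counts while occasionally crossing integer boundaries into a different layer configuration.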
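
The online-updating weight pool behind the elite weight inheritance is likewise not specified in the abstract. One plausible realization, sketched below under stated assumptions, keys weights by a layer signature and lets a new candidate layer inherit the pooled weights instead of training from scratch; the pool entry is replaced only when it comes from a fitter (elite) network. The class and method names (WeightPool, signature, inherit, update) are hypothetical.

```python
class WeightPool:
    """Illustrative elite weight pool: maps a layer signature to
    (weights, fitness of the network that produced them)."""

    def __init__(self):
        self._pool = {}  # signature -> (weights, best_fitness_so_far)

    @staticmethod
    def signature(layer_spec: dict) -> tuple:
        # Assumption: layers with the same type and filter count
        # have compatible weight shapes and can share weights.
        return (layer_spec["type"], layer_spec["filters"])

    def inherit(self, layer_spec: dict):
        """Return pooled weights for this layer if available, else None
        (the layer then falls back to random initialization)."""
        entry = self._pool.get(self.signature(layer_spec))
        return entry[0] if entry is not None else None

    def update(self, layer_spec: dict, weights, fitness: float):
        """Online update: keep the weights only if they come from a
        network fitter than the current pool entry (elitism)."""
        key = self.signature(layer_spec)
        if key not in self._pool or fitness > self._pool[key][1]:
            self._pool[key] = (weights, fitness)
```

Under this reading, inherited layers start from already-trained elite weights, so each candidate needs only a short fine-tuning pass rather than full training, which is where the claimed evaluation-time savings would come from.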
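
Finally, the customized multi-objective fitness function is only named, not defined, in the abstract. A common way to fold accuracy and complexity into a single PSO fitness score is a weighted scalarization, sketched here purely as an assumed example (the weighting scheme, the choice of n_params as the complexity measure, and the parameter alpha are all guesses):

```python
def fitness(accuracy: float, n_params: float, max_params: float,
            alpha: float = 0.9) -> float:
    """Illustrative multi-objective fitness: a weighted trade-off between
    validation accuracy and normalized model size, so the search is
    steered toward accurate yet compact architectures."""
    complexity_penalty = n_params / max_params  # in [0, 1] if n_params <= max_params
    return alpha * accuracy - (1.0 - alpha) * complexity_penalty
```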