Neural Architecture Search via Proxy Validation

IEEE Trans Pattern Anal Mach Intell. 2023 Jun;45(6):7595-7610. doi: 10.1109/TPAMI.2022.3217648. Epub 2023 May 5.

Abstract

This paper searches for the optimal neural architecture by minimizing a proxy of the validation loss. Existing neural architecture search (NAS) methods aim to discover the neural architecture that best fits the validation examples given the up-to-date network weights. These intermediate validation results are invaluable but have not been fully exploited. We propose to approximate the validation loss landscape by learning a mapping from neural architectures to their corresponding validation losses. The optimal neural architecture can then be identified as the minimum of this proxy validation loss landscape. To improve efficiency, a novel architecture sampling strategy is developed for approximating the proxy validation loss landscape. We also propose an operation importance weight (OIW) to balance the randomness and certainty of architecture sampling. The representation of a neural architecture is learned through a graph autoencoder (GAE) trained on both architectures sampled during the search and randomly generated architectures. We provide theoretical analyses of the validation loss estimator learned with our sampling strategy. Experimental results demonstrate that the proposed proxy validation loss landscape is effective in both differentiable NAS and evolutionary-algorithm-based (EA-based) NAS.
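To make the core idea concrete, the following is a minimal sketch of a proxy validation loss landscape: fit a surrogate regressor on (architecture, validation loss) pairs collected during search, then pick the candidate that minimizes the predicted loss. All names (OPS, encode_arch, fit_proxy) and the one-hot encoding plus ridge-regression surrogate are illustrative assumptions, not the authors' implementation; the paper's GAE-based architecture embedding and OIW-guided sampling are omitted here.

```python
# Hedged sketch of a proxy validation loss landscape over architectures.
# Assumed encoding and surrogate model; not the paper's actual method.
import numpy as np

OPS = ["conv3x3", "conv5x5", "maxpool", "skip"]   # assumed operation set
NUM_EDGES = 6                                      # assumed cell size

def random_arch(rng):
    """Sample an architecture as one operation index per edge."""
    return rng.integers(len(OPS), size=NUM_EDGES)

def encode_arch(arch):
    """One-hot encode the operation on every edge (a simple stand-in
    for the learned graph-autoencoder embedding)."""
    x = np.zeros(NUM_EDGES * len(OPS))
    for edge, op in enumerate(arch):
        x[edge * len(OPS) + op] = 1.0
    return x

def fit_proxy(X, y, reg=1e-3):
    """Ridge regression as a minimal proxy of the validation loss landscape."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + reg * np.eye(d), X.T @ y)

rng = np.random.default_rng(0)

# Placeholder data standing in for intermediate validation results
# gathered while the supernet / candidate networks were being trained.
archs = [random_arch(rng) for _ in range(200)]
val_losses = 0.5 + 0.1 * rng.standard_normal(len(archs))

X = np.stack([encode_arch(a) for a in archs])
w = fit_proxy(X, val_losses)

# Identify the best candidate as the minimizer of the proxy landscape.
candidates = [random_arch(rng) for _ in range(1000)]
pred = np.stack([encode_arch(a) for a in candidates]) @ w
best = candidates[int(np.argmin(pred))]
print("predicted-best architecture:", [OPS[i] for i in best])
```

In the paper, the surrogate would instead be queried inside a differentiable or EA-based search loop, and its training set would grow from the architectures sampled under the OIW strategy rather than uniformly at random as above.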