Lower Bounds on the Noiseless Worst-Case Complexity of Efficient Global Optimization

J Optim Theory Appl. 2024;201(2):583-608. doi: 10.1007/s10957-024-02399-1. Epub 2024 Mar 11.

Abstract

Efficient global optimization is a widely used method for optimizing expensive black-box functions. In this paper, we study the worst-case oracle complexity of the efficient global optimization problem. In contrast to existing kernel-specific results, we derive a unified lower bound on the oracle complexity of efficient global optimization in terms of the metric entropy of a ball in the corresponding reproducing kernel Hilbert space (RKHS). Moreover, we show that this lower bound nearly matches the upper bound attained by non-adaptive search algorithms, for the commonly used squared exponential kernel and the Matérn kernel with a large smoothness parameter ν. The matching holds up to a replacement of d/2 by d and a logarithmic term log(R/ϵ), where d is the dimension of the input space, R is the upper bound on the norm of the unknown black-box function, and ϵ is the desired accuracy. That is to say, our lower bound is nearly optimal for these kernels.
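To make the notion of a "non-adaptive search algorithm" mentioned above concrete, here is a minimal illustrative sketch (not the paper's construction): all query points are fixed in advance on a uniform grid, so no evaluation depends on earlier oracle answers, and the best observed point is returned. The domain [0, 1]^d, the grid size, and the test function are assumptions made only for this example.

```python
# A minimal sketch of a non-adaptive search baseline (illustrative assumptions,
# not the paper's construction): evaluate the black-box function on a grid
# chosen before any query, then report the best point seen.
import itertools
import numpy as np

def non_adaptive_grid_search(f, d, points_per_dim):
    """Evaluate f on a uniform grid over [0, 1]^d and return the best point.

    The grid is fixed before any evaluation, so no query depends on earlier
    oracle answers -- this is what makes the algorithm non-adaptive.
    """
    axis = np.linspace(0.0, 1.0, points_per_dim)
    best_x, best_val = None, np.inf
    for x in itertools.product(axis, repeat=d):
        val = f(np.array(x))
        if val < best_val:
            best_x, best_val = np.array(x), val
    return best_x, best_val

if __name__ == "__main__":
    # Illustrative smooth test function (an assumption, not a worst-case instance).
    f = lambda x: np.sum((x - 0.3) ** 2)
    x_best, f_best = non_adaptive_grid_search(f, d=2, points_per_dim=21)
    print(x_best, f_best)  # roughly (0.3, 0.3) with value close to 0
```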

Keywords: Efficient global optimization; Reproducing kernel Hilbert space; Worst-case complexity.