Parameterized Convex Universal Approximators for Decision-Making Problems

IEEE Trans Neural Netw Learn Syst. 2024 Feb;35(2):2448-2459. doi: 10.1109/TNNLS.2022.3190198. Epub 2024 Feb 5.

Abstract

Parameterized max-affine (PMA) and parameterized log-sum-exp (PLSE) networks are proposed for general decision-making problems. The proposed approximators generalize the existing convex approximators, namely max-affine (MA) and log-sum-exp (LSE) networks, by taking both condition and decision variables as function arguments and replacing the network parameters of MA and LSE networks with continuous functions of the condition variable. The universal approximation theorem (UAT) for PMA and PLSE is proved, which implies that PMA and PLSE are shape-preserving universal approximators for parameterized convex continuous functions. Practical guidelines for incorporating deep neural networks within PMA and PLSE networks are provided. A numerical simulation is performed to demonstrate the performance of the proposed approximators. The simulation results show that PLSE outperforms other existing approximators in terms of minimizer and optimal value errors, with scalable and efficient computation in high-dimensional cases.
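To make the construction concrete, the sketch below illustrates one plausible reading of the abstract: a PLSE network of the form T*log(sum_i exp((a_i(x)^T u + b_i(x))/T)) and its PMA counterpart max_i (a_i(x)^T u + b_i(x)), where the affine coefficients a_i(x), b_i(x) are continuous functions of the condition variable x produced by a small neural network, so the output is convex in the decision variable u for every fixed x. This is a minimal illustrative sketch, not the authors' implementation; the helper names, network architecture, and parameter layout are assumptions.

```python
import numpy as np

def mlp(x, weights, biases):
    """Illustrative feedforward network mapping the condition variable x to a
    flat parameter vector (architecture chosen here for demonstration only)."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = np.tanh(W @ h + b)
    return weights[-1] @ h + biases[-1]

def _affine_pieces(x, params, m):
    """Split the network output into I slope vectors a_i(x) and intercepts b_i(x)."""
    out = mlp(x, *params)
    n_pieces = out.shape[0] // (m + 1)
    A = out[: n_pieces * m].reshape(n_pieces, m)  # a_i(x), shape (I, m)
    b = out[n_pieces * m:]                        # b_i(x), shape (I,)
    return A, b

def plse(x, u, params, temperature=1.0):
    """Parameterized log-sum-exp network: convex in the decision variable u
    for every fixed condition variable x."""
    A, b = _affine_pieces(x, params, u.shape[0])
    z = (A @ u + b) / temperature
    zmax = z.max()  # max-shift for numerical stability of log-sum-exp
    return temperature * (zmax + np.log(np.exp(z - zmax).sum()))

def pma(x, u, params):
    """Parameterized max-affine network: the temperature -> 0 limit of PLSE."""
    A, b = _affine_pieces(x, params, u.shape[0])
    return (A @ u + b).max()
```

As a usage note, evaluating `plse(x, u, params)` over a grid of decision variables u for a fixed condition x traces out a smooth convex surface, while `pma` yields its piecewise-affine counterpart; training the parameters of the condition network (by any standard regression loss) then fits a parameterized convex approximator in the sense described in the abstract.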