ABSTRACT
Zeroth-order optimization (ZO) is widely used for solving black-box optimization and control problems. In particular, single-point ZO (SZO) is well-suited to online or dynamic problem settings because it requires only a single function evaluation per iteration. However, SZO suffers from high gradient-estimation variance and slow convergence, which severely limit its practical applicability. To overcome these limitations, we propose a novel yet simple SZO framework termed regression-based SZO (ReSZO), which substantially improves the convergence rate. Specifically, ReSZO constructs a surrogate function via regression over historical function evaluations and uses the gradient of this surrogate for iterative updates. Two instantiations of ReSZO, fitting linear and quadratic surrogate functions respectively, are introduced. Moreover, we provide a non-asymptotic convergence analysis for the linear instantiation of ReSZO, showing that its convergence rates are comparable to those of two-point ZO methods. Extensive numerical experiments demonstrate that ReSZO empirically converges two to three times faster than two-point ZO in terms of function-query complexity.
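The abstract's linear instantiation can be illustrated with a short sketch. This is not the paper's implementation; it assumes a sliding window of recent (point, value) pairs, an ordinary least-squares fit of an affine surrogate, and a gradient step along the fitted slope. The window size, perturbation scale, and step size below are illustrative choices, not values from the paper.

```python
import numpy as np

def reszo_linear_step(x, history, lr):
    """One ReSZO-style update: fit f(y) ~ g @ y + b by least squares
    over the history window, then step along the surrogate gradient g."""
    X = np.array([np.append(p, 1.0) for p, _ in history])  # design matrix with bias column
    f_vals = np.array([v for _, v in history])
    coef, *_ = np.linalg.lstsq(X, f_vals, rcond=None)
    g = coef[:-1]  # fitted slope = surrogate gradient estimate
    return x - lr * g

# usage: minimize f(x) = ||x||^2 with one (perturbed) query per iteration
rng = np.random.default_rng(0)
f = lambda x: float(x @ x)
d = 3
x = np.ones(d)
history = []
for t in range(500):
    probe = x + 0.1 * rng.standard_normal(d)  # the single function query this iteration
    history.append((probe, f(probe)))
    history = history[-(d + 2):]  # keep only recent evaluations
    if len(history) >= d + 1:     # need d+1 points to identify the affine fit
        x = reszo_linear_step(x, history, lr=0.05)
```

On this toy quadratic the fitted slope approximates the true gradient 2x near the current iterate, so the loop contracts toward the origin down to a noise floor set by the perturbation scale.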
PUBLICATION RECORD
- Publication year: 2025
- Publication date: 2025-07-06
- Venue: unknown
- Fields of study: Mathematics, Computer Science, Engineering
- Source metadata: Semantic Scholar
REFERENCES
- 26 references (list not included in this record)
CITED BY
- 1 citing paper (list not included in this record)