Session: Derivative-free stochastic optimization methods I
Chair: Stefan Wild
Cluster: Derivative-free Optimization

Talk 1: Improving the robustness of zeroth-order optimization solvers
Speaker: Stefan Wild
Abstract: Zeroth-order optimization solvers are often deployed in settings where little is known about a problem's conditioning or noise level. An ideal solver performs well across a variety of challenging settings. We report on our experience developing adaptive algorithms that leverage information learned online to adapt critical algorithmic features. We illustrate our approach in trust-region-based reduced-space methods and show how trained policies can be deployed effectively even in nonstationary cases, where the observed noise varies over the decision space.
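
As a rough illustration of the noise-adaptive idea (not the solver from the talk), the sketch below re-tunes a finite-difference interval online from a sampled noise estimate. The quadratic test objective, the step size, and the square-root heuristic for the interval are all illustrative assumptions.

    # Minimal sketch of a noise-adaptive zeroth-order step: the finite-difference
    # interval h is re-tuned online from an estimate of the noise level, in the
    # spirit of (but not identical to) the adaptive solvers discussed in the talk.
    import numpy as np

    rng = np.random.default_rng(2)

    def noisy_f(x, sigma=1e-3):
        # Assumed oracle model: true value plus additive Gaussian noise.
        return np.sum(x**2) + sigma * rng.standard_normal()

    def estimate_noise(x, reps=10):
        # Sample standard deviation of repeated evaluations at a fixed point.
        return np.std([noisy_f(x) for _ in range(reps)], ddof=1)

    def zo_gradient(x, h):
        # Forward-difference gradient estimate with interval h.
        fx = noisy_f(x)
        g = np.empty_like(x)
        for i in range(x.size):
            e = np.zeros_like(x)
            e[i] = h
            g[i] = (noisy_f(x + e) - fx) / h
        return g

    x = np.ones(5)
    for _ in range(100):
        eps = estimate_noise(x)
        h = max(np.sqrt(eps), 1e-8)  # heuristic: interval grows with noise level
        x = x - 0.1 * zo_gradient(x, h)
    print("final noisy objective value:", noisy_f(x))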

Talk 2: A noise-aware stochastic trust-region algorithm using adaptive random subspaces
Speaker: Kwassi Joseph Dzahini
Abstract: We introduce ANASTAARS, a noise-aware stochastic trust-region algorithm using adaptive random subspace strategies that is effective not only for low- and moderate-dimensional problems but also for potentially high-dimensional ones. The method achieves scalability by optimizing random models that approximate the objective within low-dimensional affine subspaces, thereby significantly reducing the per-iteration cost in function evaluations. These subspaces and their dimension are defined via Johnson–Lindenstrauss transforms, such as those obtained from Haar-distributed orthogonal random matrices. In contrast to previous work involving random subspaces of fixed dimension, ANASTAARS introduces an adaptive subspace selection strategy: instead of generating a completely new poised set of interpolation points at each iteration, the method updates the model by generating only a few (or even a single) new interpolation points, reusing past points and their function values from lower-dimensional subspaces in such a way that the resulting set remains poised. This approach not only further reduces the per-iteration evaluation cost but also avoids constructing random models in subspaces of fixed dimension, yielding a more efficient optimization process through the use of adaptive subspace models. Furthermore, motivated by the observation that model-based trust-region methods perform best when the signal-to-noise ratio is high, ANASTAARS incorporates a strategy from the noise-aware numerical optimization literature that exploits an estimate of the noise level in each function evaluation. The effectiveness of the method is demonstrated through numerical experiments on problems from the quantum approximate optimization algorithm (QAOA) framework.
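
The Haar-based subspace construction can be sketched in a few lines. The following is a minimal example, with a placeholder objective f and illustrative dimensions, of drawing an orthonormal basis Q via QR of a Gaussian matrix and restricting the objective to an affine subspace through the current iterate; it shows only the subspace mechanism ANASTAARS builds on, not the algorithm itself.

    # Sketch: a Haar-distributed orthonormal subspace basis via QR of a
    # Gaussian matrix, then the objective restricted to the affine subspace
    # x = x_k + Q s. Dimensions and the objective are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(1)

    def haar_orthonormal(d, p):
        # QR of a d x p standard Gaussian matrix; fixing the signs using R's
        # diagonal makes Q Haar-distributed on the Stiefel manifold.
        G = rng.standard_normal((d, p))
        Q, R = np.linalg.qr(G)
        return Q * np.sign(np.diag(R))

    d, p = 100, 5                 # ambient and subspace dimensions
    x_k = np.zeros(d)             # current iterate
    Q = haar_orthonormal(d, p)

    def f(x):                     # placeholder objective
        return np.sum(x**2)

    def f_sub(s):
        # Reduced objective on the affine subspace through x_k.
        return f(x_k + Q @ s)

    # A local interpolation model of f_sub needs only O(p) evaluations
    # instead of O(d), which is where the per-iteration savings come from.
    print(f_sub(np.ones(p)))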

Talk 3: A consensus-based global optimization method with noisy objective function
Speaker: Greta Malaspina
Abstract: Consensus-based optimization (CBO) is a derivative-free, particle-based method for the solution of global optimization problems. Several versions of the method have been proposed in the literature, and different convergence results have been proved under varying assumptions on the regularity of the objective function and the initial distribution of the particles. However, all existing results assume that the objective function is evaluated exactly at each iteration of the method. In this work, we extend the convergence analysis of a discrete-time CBO method to the case where only a noisy stochastic estimator of the objective function can be computed at a given point. In particular, we prove that under suitable assumptions on the oracle's noise, the expected mean squared distance of the particles from the solution can be made arbitrarily small in a finite number of iterations. Some numerical results showing the impact of noise are also given.
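
A minimal sketch of one discrete-time CBO step with a noisy oracle may help fix ideas. The quadratic test objective, the Gibbs parameter alpha, and the drift/diffusion step sizes below are illustrative assumptions, not values from the talk.

    # Minimal sketch of a discrete-time consensus-based optimization (CBO)
    # iteration with a noisy objective oracle. All parameter values are
    # illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    def noisy_objective(x, sigma=0.01):
        # Stochastic estimator (assumed oracle model): true value plus noise.
        return np.sum(x**2) + sigma * rng.standard_normal()

    def cbo_step(X, alpha=50.0, lam=1.0, sigma_d=0.5, dt=0.05):
        # X: (N, d) array of particle positions.
        f_vals = np.array([noisy_objective(x) for x in X])
        # Gibbs weights concentrate mass on the best-performing particles.
        w = np.exp(-alpha * (f_vals - f_vals.min()))
        x_bar = (w[:, None] * X).sum(axis=0) / w.sum()   # consensus point
        # Drift toward the consensus point plus componentwise exploration noise.
        drift = -lam * (X - x_bar) * dt
        diffusion = sigma_d * (X - x_bar) * np.sqrt(dt) * rng.standard_normal(X.shape)
        return X + drift + diffusion

    X = rng.normal(size=(40, 5))       # 40 particles in R^5
    for _ in range(200):
        X = cbo_step(X)
    # For this test problem the global minimizer is the origin.
    print("mean squared distance to 0:", np.mean(np.sum(X**2, axis=1)))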

Speakers
Stefan Wild
Lawrence Berkeley National Laboratory
Kwassi Joseph Dzahini
Greta Malaspina
Tuesday July 22, 2025 10:30am - 11:45am PDT
Joseph Medicine Crow Center for International and Public Affairs (DMC), Room 256, 3518 Trousdale Pkwy, Los Angeles, CA 90089
