Wednesday July 23, 2025 1:15pm - 2:30pm PDT
Session: Derivative-free optimization for special classes of problems II
Chair: Ana Luísa Custódio
Cluster: Derivative-free Optimization

Talk 1: A direct multisearch approach for many-objective derivative-free optimization
Speaker: Ana Luísa Custódio
Abstract: From a theoretical point of view, Direct Multisearch (DMS) was developed for continuous constrained multiobjective derivative-free optimization, with a general number of components in the objective function. However, its algorithmic performance was never assessed on problems with more than three objectives. In this work, we propose DMS-Reduction, a variant of DMS based on reduction approaches that use correlation and sketching techniques. The approach is an attempt to tackle larger problems, both in the number of components of the objective function and in the number of variables. At each iteration, reducing the number of objective function components to be optimized has a possible additional benefit: it may also reduce the number of variables to be considered, since not all variables are necessarily related to the selected objective function components. We will detail the proposed algorithmic structure and report promising numerical results on many-objective optimization problems.
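
A minimal sketch of the correlation-based component selection described above, in Python. The selection rule, the 0.9 threshold, and the function name are illustrative assumptions, not the authors' implementation:

    import numpy as np

    def select_representative_objectives(F_samples, threshold=0.9):
        """F_samples: (k, m) array of m objective components at k evaluated points.
        Keeps one representative per group of highly correlated components.
        Illustrative rule only; not the DMS-Reduction code."""
        m = F_samples.shape[1]
        corr = np.corrcoef(F_samples, rowvar=False)  # (m, m) correlation matrix
        kept = []
        for j in range(m):
            # keep component j only if it is not strongly correlated
            # with a component that was already kept
            if all(abs(corr[j, i]) < threshold for i in kept):
                kept.append(j)
        return kept

    # Example: 5 components, two of which duplicate others up to sign/scale.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 3))
    F = np.column_stack([X[:, 0], 2 * X[:, 0], X[:, 1], X[:, 2], -X[:, 2]])
    print(select_representative_objectives(F))  # [0, 2, 3]

Variables appearing only in the dropped components could then be fixed, which is the variable-reduction side effect mentioned in the abstract.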

Talk 2: Optimal zeroth-order methods for bilevel optimization
Speaker: Saeed Ghadimi
Abstract: In this talk, we present fully zeroth-order stochastic approximation algorithms for solving stochastic bilevel optimization problems, assuming that neither the upper- and lower-level loss functions nor their unbiased gradient estimates are available. To do so, we first generalize the Gaussian convolution technique to functions with two block variables and establish the corresponding relationships between such functions and their smoothed Gaussian approximations. Using these results, we estimate the first- and second-order derivatives of the objective functions and provide a fully zeroth-order double-loop algorithm whose sample complexity is optimal in its dependence on the target accuracy while polynomially dependent on the problem dimension. Furthermore, building on recent developments in designing fully first-order methods for bilevel optimization, we provide a second fully zeroth-order bilevel optimization algorithm whose sample complexity is optimal in both the target accuracy and the problem dimension.
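
A minimal sketch of a two-point Gaussian-smoothing gradient estimator for a function f(x, y) of two block variables, the standard building block behind derivative estimates of this kind; the smoothing radius mu, the batch size, and the function names are illustrative assumptions, not the paper's exact estimator:

    import numpy as np

    def zo_block_grads(f, x, y, mu=1e-4, batch=16, rng=None):
        """Estimate the partial gradients of f with respect to the blocks
        x and y via finite differences along random Gaussian directions."""
        rng = np.random.default_rng() if rng is None else rng
        gx, gy = np.zeros_like(x), np.zeros_like(y)
        f0 = f(x, y)
        for _ in range(batch):
            u = rng.normal(size=x.shape)
            v = rng.normal(size=y.shape)
            gx += (f(x + mu * u, y) - f0) / mu * u
            gy += (f(x, y + mu * v) - f0) / mu * v
        return gx / batch, gy / batch

    # Example on f(x, y) = |x|^2 + <x, y>: true partials are 2x + y and x.
    x, y = np.ones(3), np.full(3, 0.5)
    gx, gy = zo_block_grads(lambda a, b: a @ a + a @ b, x, y,
                            batch=200, rng=np.random.default_rng(1))
    print(gx, gy)  # approximately [2.5 2.5 2.5] and [1. 1. 1.]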

Talk 3: High-dimensional DFO: Stochastic subspace descent and improvements
Speaker: Stephen Becker
Abstract: We describe and analyze a family of algorithms, which we call "stochastic subspace descent," that use projections of the gradient onto random subspaces, in a spirit similar to the well-known work of Nesterov and Spokoiny. We explain the benefits of subspace projection compared to Gaussian directional derivatives. We present a complete convergence analysis and show that the method is well suited for high-dimensional problems. We also discuss our recent work on cheap, automatic stepsize selection, as well as preliminary results on biased sampling, which requires leaving the "projected" paradigm.
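
A minimal sketch of one stochastic-subspace-descent iteration, assuming a Haar-distributed subspace drawn via QR and finite-difference directional derivatives; the step size, subspace dimension, and scaling below are illustrative choices, not the speaker's exact method:

    import numpy as np

    def ssd_step(f, x, ell=5, alpha=0.1, h=1e-6, rng=None):
        """One descent step along the projection of the gradient
        onto a random ell-dimensional subspace of R^n."""
        rng = np.random.default_rng() if rng is None else rng
        n = x.size
        Q, _ = np.linalg.qr(rng.normal(size=(n, ell)))  # orthonormal columns
        f0 = f(x)
        # finite-difference directional derivatives along the columns of Q
        d = np.array([(f(x + h * Q[:, i]) - f0) / h for i in range(ell)])
        # the n/ell rescaling makes Q @ d an unbiased gradient estimate
        return x - alpha * (n / ell) * Q @ d

    # Example: minimize a simple quadratic in R^50.
    rng = np.random.default_rng(2)
    x = rng.normal(size=50)
    quad = lambda z: 0.5 * z @ z
    for _ in range(200):
        x = ssd_step(quad, x, rng=rng)
    print(quad(x))  # close to 0

Only ell function evaluations per step are needed (plus one at x), which is what makes the scheme attractive when n is large.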

Speakers
Ana Luísa Custódio
Saeed Ghadimi
University of Waterloo
Location: Taper Hall (THH) 119, 3501 Trousdale Pkwy, Los Angeles, CA 90089
