Monday July 21, 2025 1:15pm - 2:30pm PDT
Session: Recent progress in derivative-free optimization I
Chair: Giampaolo Liuzzi
Cluster: Derivative-free Optimization

Talk 1: Worst-case complexity analysis of derivative-free methods for multi-objective optimization
Speaker: Giampaolo Liuzzi
Abstract: In this work, we consider unconstrained multi-objective optimization problems where objective function values can only be obtained by querying a black box. The main aim of the paper is to give worst-case complexity bounds for derivative-free multi-objective optimization methods that adopt a linesearch expansion technique. We show that the considered methods enjoy the same worst-case complexity bounds recently proved for a directional multisearch method. Furthermore, exploiting the expansion technique, we also give a further complexity result concerning the number of iterations with a measure of stationarity above a prescribed tolerance.
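
To give a feel for the expansion technique, here is a minimal sketch of a derivative-free linesearch with an expansion phase for a vector-valued objective. The function name, the componentwise sufficient-decrease test, and all constants are illustrative assumptions, not the exact scheme analyzed in the talk.

```python
import numpy as np

def expansion_linesearch(F, x, d, alpha0=1.0, gamma=1e-6, delta=0.5,
                         alpha_min=1e-12, alpha_max=1e6):
    # Illustrative derivative-free linesearch with an expansion phase for a
    # vector objective F: R^n -> R^m; names and constants are assumptions,
    # not the method analyzed in the talk.
    Fx = np.asarray(F(x))
    alpha = alpha0

    def sufficient_decrease(a):
        # Componentwise sufficient decrease: every objective must drop by
        # at least gamma * a**2 at the trial point x + a*d.
        return np.all(np.asarray(F(x + a * d)) <= Fx - gamma * a**2)

    if not sufficient_decrease(alpha):
        # Backtracking phase: shrink until the test passes or the step dies.
        while alpha > alpha_min and not sufficient_decrease(alpha):
            alpha *= delta
        return alpha if alpha > alpha_min else 0.0

    # Expansion phase: keep enlarging the step while the larger step still
    # satisfies the componentwise sufficient-decrease test.
    while alpha / delta <= alpha_max and sufficient_decrease(alpha / delta):
        alpha /= delta
    return alpha
```

The expansion phase is the distinguishing ingredient: rather than stopping at the first acceptable stepsize, the loop keeps enlarging it while the sufficient-decrease test passes, which is the mechanism exploited for the further complexity result mentioned in the abstract.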

Talk 2: Exploring complexity bounds of model-based trust-region derivative-free methods
Speaker: Katya Scheinberg
Abstract: Model-based trust-region derivative-free methods, pioneered by Powell, rely on interpolation models to approximate the objective function in a trust region. The quality of this approximation dictates algorithmic progress and is, in turn, dictated by the geometry of the sample set. The methods are designed to trade off carefully between "exploration" and "exploitation", i.e., between seeking progress and improving sample-set geometry. While these methods have been very successful in practice and have been shown to converge, their complexity analysis has been incomplete, especially in terms of dependence on the dimension. We will present an improved complexity analysis for different variants of these methods.
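
For readers unfamiliar with this class, the following is a heavily simplified sketch of one model-based trust-region iteration, assuming a smooth scalar objective f. It uses a linear interpolation model on a fresh coordinate sample set and omits the quadratic models, point reuse, and geometry-improving steps that Powell-style methods (e.g., NEWUOA, BOBYQA) rely on.

```python
import numpy as np

def dfo_trust_region(f, x0, delta0=1.0, eta=0.1, max_iter=100):
    # Toy model-based trust-region DFO loop; illustrative only.
    n = len(x0)
    x, delta = np.asarray(x0, float), delta0
    for _ in range(max_iter):
        # Sample set: current point plus coordinate offsets at scale delta.
        Y = np.vstack([x] + [x + delta * e for e in np.eye(n)])
        fY = np.array([f(y) for y in Y])
        # Linear interpolation model m(x + s) = f(x) + g.s fit to the samples.
        g = np.linalg.solve(Y[1:] - x, fY[1:] - fY[0])
        if np.linalg.norm(g) < 1e-8:
            break
        # Model minimizer on the trust region {||s|| <= delta}: step along -g.
        s = -delta * g / np.linalg.norm(g)
        # Ratio of actual to predicted reduction decides acceptance.
        pred = -g @ s
        rho = (fY[0] - f(x + s)) / pred
        if rho >= eta:
            x, delta = x + s, 2.0 * delta   # successful: accept and expand
        else:
            delta *= 0.5                    # unsuccessful: shrink the region
    return x
```

The accuracy of the model gradient g depends on how well the sample set spans the space at scale delta; in Powell-style methods, maintaining that geometry while reusing expensive function evaluations is precisely the exploration/exploitation trade-off the abstract describes.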

Talk 3: Revisiting the convergence analysis of derivative-free trust region and direct search
Speaker: Cunxin Huang
Abstract: Derivative-free trust-region and direct-search methods are two popular classes of derivative-free optimization methods. In this paper, we propose a unified new perspective for the convergence analysis of these two classes of methods. Specifically, we find that the behavior of an algorithm-determined series decides asymptotic convergence, which generalizes existing results in both deterministic and randomized settings.
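
As an informal illustration of what an algorithm-determined series can look like, the toy direct-search loop below records the squared step sizes alpha_k^2. Under a sufficient-decrease test, successful iterations pay for themselves (f drops by at least gamma * alpha_k^2), so in this run the recorded series stays bounded while the iterates converge; the precise series studied in the paper may differ.

```python
import numpy as np

def direct_search_series(f, x0, alpha0=1.0, gamma=1e-4, theta=0.5, iters=200):
    # Toy directional direct search that records its squared step sizes,
    # to illustrate the kind of series the analysis studies.
    n = len(x0)
    D = np.vstack([np.eye(n), -np.eye(n)])  # poll directions +/- e_i
    x, alpha, series = np.asarray(x0, float), alpha0, []
    for _ in range(iters):
        fx = f(x)
        # Poll step: accept the first direction giving sufficient decrease.
        trial = next((x + alpha * d for d in D
                      if f(x + alpha * d) < fx - gamma * alpha**2), None)
        if trial is not None:
            x = trial                 # successful iteration: keep alpha
        else:
            alpha *= theta            # unsuccessful: shrink the step size
        series.append(alpha**2)
    return x, sum(series)

x_star, partial_sum = direct_search_series(lambda z: z @ z, np.ones(3))
print(x_star, partial_sum)  # iterates approach 0; the partial sums stay bounded
```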

Speakers
Giampaolo Liuzzi

Katya Scheinberg

Coca-Cola Foundation Chair and Professor, H. Milton Stewart School of Industrial and Systems Engineering, Georgia Institute of Technology, Atlanta, GA
Cunxin Huang

Location: Taper Hall (THH) 114, 3501 Trousdale Pkwy, Los Angeles, CA 90089
