Wednesday July 23, 2025 1:15pm - 2:30pm PDT
Session: Performance Estimation Problem and Universal Methods
Chair: Sam Davanloo
Cluster: Nonlinear Optimization

Talk 1: Performance Estimation for Smooth and Strongly Convex Sets
Speaker: Benjamin Grimmer
Abstract: This talk will present extensions of recent computer-assisted design and analysis techniques for first-order optimization over structured functions, known as performance estimation, to apply to structured sets. Namely, we prove "interpolation theorems" for smooth and strongly convex sets with Slater points and bounded diameter, showing that a wide range of extremal questions amount to structured mathematical programs. Prior function interpolation theorems are recovered as a limit of our set interpolation theory. Our theory provides finite-dimensional formulations of performance estimation problems for algorithms utilizing separating hyperplane oracles, linear optimization oracles, and/or projection oracles of smooth/strongly convex sets. As direct applications of this computer-assisted machinery, we identify the minimax optimal separating hyperplane method and several areas for improvement in the theory of Frank-Wolfe, Alternating Projections, and non-Lipschitz Smooth Optimization.
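
For context on the oracle-based algorithms mentioned in the abstract (not the constructions from the talk itself), the sketch below shows a vanilla Frank-Wolfe loop driven by a linear optimization oracle over the unit Euclidean ball, a canonical smooth and strongly convex set; the objective, stepsize rule, and oracle are illustrative assumptions.

```python
import numpy as np

def lmo_unit_ball(g):
    """Linear optimization oracle over the unit Euclidean ball:
    argmin_{||s|| <= 1} <g, s> = -g / ||g||  (assuming g != 0)."""
    return -g / np.linalg.norm(g)

def frank_wolfe(grad, x0, num_iters=100):
    """Vanilla Frank-Wolfe with the standard open-loop stepsize 2/(k+2)."""
    x = x0.astype(float)
    for k in range(num_iters):
        s = lmo_unit_ball(grad(x))       # one LMO call per iteration
        gamma = 2.0 / (k + 2)            # open-loop stepsize
        x = (1 - gamma) * x + gamma * s  # convex combination keeps x feasible
    return x

# Illustrative problem: minimize (1/2)||x - b||^2 over the unit ball;
# since ||b|| > 1, the solution is the boundary point b / ||b||.
b = np.array([2.0, 1.0])
x_hat = frank_wolfe(lambda x: x - b, x0=np.zeros(2))
```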

Talk 2: Universal subgradient and proximal bundle methods for convex and strongly convex hybrid composite optimization
Speaker: Jiaming Liang
Abstract: This work develops two parameter-free methods for solving convex and strongly convex hybrid composite optimization problems: a composite subgradient-type method and a proximal bundle-type method. Both functional and stationary complexity bounds for the two methods are established in terms of the unknown strong convexity parameter. To the best of our knowledge, the two proposed methods are the first universal methods for solving hybrid strongly convex composite optimization problems that neither rely on a restart scheme nor require knowledge of the optimal value.
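
As a generic baseline for the composite subgradient family (a minimal sketch, not either of the two universal methods from the talk), the loop below alternates a subgradient step on a nonsmooth convex term with a proximal step on a simple strongly convex term; the stepsize rule, test problem, and prox formula are assumptions for illustration.

```python
import numpy as np

def composite_subgradient(subgrad_f, prox_g, x0, num_iters=1000):
    """Generic scheme for min f(x) + g(x) with f nonsmooth convex and
    g prox-friendly: a subgradient step on f, then a prox step on g,
    using the parameter-free stepsize lambda_k = 1/sqrt(k+1)."""
    x = x0.astype(float)
    for k in range(num_iters):
        lam = 1.0 / np.sqrt(k + 1)
        x = prox_g(x - lam * subgrad_f(x), lam)
    return x

# Illustrative hybrid problem: f(x) = ||Ax - b||_1 (nonsmooth),
# g(x) = (mu/2)||x||^2 (strongly convex, closed-form prox).
A = np.array([[1.0, 2.0], [3.0, 4.0]])
b = np.array([1.0, -1.0])
mu = 0.1
subgrad_f = lambda x: A.T @ np.sign(A @ x - b)
prox_g = lambda v, lam: v / (1.0 + lam * mu)  # prox of (mu/2)||x||^2
x_hat = composite_subgradient(subgrad_f, prox_g, np.zeros(2))
```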

Talk 3: Nonasymptotic Analysis of Accelerated Methods With Inexact Oracle Under Absolute Error Bound
Speaker: Sam Davanloo
Abstract: Performance analysis of first-order algorithms with an inexact oracle has gained recent attention due to emerging applications in which obtaining exact gradients is impossible or computationally expensive. Previous research has demonstrated that the performance of accelerated first-order methods is more sensitive to gradient errors than that of non-accelerated ones. This work investigates nonasymptotic convergence bounds of two accelerated methods with inexact gradients for solving deterministic smooth convex problems. The Performance Estimation Problem (PEP) is used as the primary tool to analyze the convergence bounds of the underlying algorithms. By finding an analytical solution to the PEP, we derive novel convergence bounds for the Inexact Optimized Gradient Method (OGM) of Kim and Fessler (2016) and the Inexact Fast Gradient Method (FGM) of Devolder et al. (2013) with variable oracle inexactness along the iterations. Under the absolute error assumption, we derive bounds in which the accumulated errors are independent of the initial conditions and of the trajectories generated by the algorithms. Furthermore, we analyze the tradeoff between the convergence rate and the accumulated error, which guides the choice of optimal stepsize. Finally, we determine the optimal strategy for setting the gradient inexactness along the iterations (when possible), ensuring that the accumulated error remains subordinate to the convergence rate.
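
To make the absolute-error oracle model concrete (a minimal sketch, not the specific OGM/FGM analyses of the talk), the code below runs a Nesterov-style fast gradient method whose gradient oracle carries an additive error bounded in norm by a constant delta at every iteration; the quadratic test problem and the noise construction are illustrative assumptions.

```python
import numpy as np

def inexact_gradient(grad, x, delta, rng):
    """Absolute-error oracle: returns grad(x) + e with ||e|| <= delta."""
    e = rng.standard_normal(x.shape)
    e *= delta / max(np.linalg.norm(e), 1e-12)
    return grad(x) + e

def inexact_fgm(grad, x0, L, delta, num_iters=100, seed=0):
    """Nesterov-style fast gradient method driven by an inexact oracle
    whose additive gradient error is norm-bounded by delta."""
    rng = np.random.default_rng(seed)
    x = y = x0.astype(float)
    t = 1.0
    for _ in range(num_iters):
        x_next = y - (1.0 / L) * inexact_gradient(grad, y, delta, rng)
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # momentum step
        x, t = x_next, t_next
    return x

# Illustrative smooth convex problem: (1/2)||Ax - b||^2 with L = ||A||_2^2.
A = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A.T @ (A @ x - b)
L = np.linalg.norm(A, 2) ** 2
x_hat = inexact_fgm(grad, np.zeros(2), L, delta=1e-3)
```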

Speakers

Benjamin Grimmer

Jiaming Liang

Sam Davanloo
Taper Hall (THH) 210 3501 Trousdale Pkwy, 210, Los Angeles, CA 90089
