Thursday July 24, 2025 1:15pm - 2:30pm PDT
Session: Nonsmooth Dynamical Systems
Chair: Emilio Vilches
Cluster: Nonsmooth Optimization

Talk 1: Understanding accelerated gradient methods through high-order ODEs
Speaker: Samir Adly
Abstract: We study convex optimization problems with smooth objectives by examining how damped inertial dynamics, driven by the gradient, can be discretized in time. This leads to three well-known accelerated algorithms: Nesterov's Accelerated Gradient (NAG), the Ravine Accelerated Gradient (RAG), and the IGAHD method introduced by Attouch, Chbani, Fadili, and Riahi. IGAHD uses Hessian-driven damping to reduce the oscillations that often appear in inertial methods. By analyzing continuous-time models (ODEs) of these algorithms at different levels of resolution (orders $p=0,1,2$), we gain a better understanding of their behavior. All three methods share the same low-resolution model (order 0), which corresponds to the Su–Boyd–Candès ODE for NAG. At higher resolution (order 2), however, we show that NAG and RAG follow different dynamics; this is a new result and shows that NAG and RAG should not be considered equivalent. We also present numerical experiments. In terms of number of iterations, IGAHD performs best, and RAG is slightly better than NAG on average; in terms of CPU time, NAG and RAG are faster than IGAHD. All three methods behave similarly when gradient norms are compared.
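
As background for the comparison above, here is a minimal sketch of NAG with the standard $(k-1)/(k+2)$ extrapolation rule, next to a naive integration of the shared order-0 model, the Su–Boyd–Candès ODE $\ddot{X} + (3/t)\dot{X} + \nabla f(X) = 0$. The quadratic objective, step size, and time horizon are illustrative assumptions, not taken from the talk, and the integrator is only the simplest possible choice.

```python
import numpy as np

# Illustrative strongly convex quadratic f(x) = 0.5 x'Ax - b'x (an assumption,
# not an objective from the talk); its gradient is Ax - b.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 20))
A = M.T @ M + np.eye(20)          # symmetric positive definite
b = rng.standard_normal(20)
grad = lambda x: A @ x - b
s = 1.0 / np.linalg.norm(A, 2)    # step size 1/L, L = largest eigenvalue

def nag(x0, iters=500):
    """Nesterov's accelerated gradient with the (k-1)/(k+2) momentum rule."""
    x_prev = x = x0.copy()
    for k in range(1, iters + 1):
        y = x + (k - 1) / (k + 2) * (x - x_prev)   # extrapolation step
        x_prev, x = x, y - s * grad(y)             # gradient step at y
    return x

def su_boyd_candes_ode(x0, t_end=50.0, dt=1e-3):
    """Semi-implicit Euler integration of X'' + (3/t) X' + grad f(X) = 0,
    the order-0 (low-resolution) model shared by NAG, RAG, and IGAHD."""
    x, v, t = x0.copy(), np.zeros_like(x0), 1.0    # start at t=1 to avoid 3/t blow-up
    while t < t_end:
        a = -(3.0 / t) * v - grad(x)               # acceleration from the ODE
        v += dt * a
        x += dt * v
        t += dt
    return x

x_star = np.linalg.solve(A, b)
print(np.linalg.norm(nag(np.zeros(20)) - x_star))
print(np.linalg.norm(su_boyd_candes_ode(np.zeros(20)) - x_star))
```

On this quadratic, both trajectories approach the same minimizer; the talk's point is that such agreement at order 0 hides differences between NAG and RAG that only appear in the higher-resolution (order-2) models.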

Talk 2: A Newton-Like Dynamical System for Nonsmooth and Nonconvex Optimization
Speaker: Juan Guillermo Garrido
Abstract: This work investigates a dynamical system that acts as a nonsmooth adaptation of the continuous Newton method, aimed at minimizing the sum of a primal lower-regular function and a locally Lipschitz function, both potentially nonsmooth. The second-order information of the classical Newton method is extended by incorporating the graphical derivative of a locally Lipschitz mapping. Specifically, we analyze the existence and uniqueness of solutions, along with the asymptotic behavior of the system's trajectories. Convergence conditions and the corresponding convergence rates are established under two distinct scenarios: strong metric subregularity and the Kurdyka–Łojasiewicz inequality.
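
For orientation, the smooth baseline that the talk's nonsmooth system generalizes is the continuous Newton flow $\dot{x}(t) = -[\nabla^2 f(x(t))]^{-1} \nabla f(x(t))$. The sketch below discretizes only this classical smooth flow by forward Euler; the Rosenbrock objective, step size, and iteration count are illustrative assumptions, and the graphical-derivative machinery of the talk is not represented here.

```python
import numpy as np

def f_grad_hess(x):
    # Rosenbrock function f(x) = (1-x1)^2 + 100 (x2 - x1^2)^2, a standard
    # nonconvex test problem (an assumption, not from the talk).
    g = np.array([
        -400 * x[0] * (x[1] - x[0] ** 2) - 2 * (1 - x[0]),
        200 * (x[1] - x[0] ** 2),
    ])
    H = np.array([
        [1200 * x[0] ** 2 - 400 * x[1] + 2, -400 * x[0]],
        [-400 * x[0], 200.0],
    ])
    return g, H

def newton_flow(x0, dt=0.05, steps=400):
    """Forward-Euler discretization of x'(t) = -[Hess f(x)]^{-1} grad f(x)."""
    x = x0.astype(float)
    for _ in range(steps):
        g, H = f_grad_hess(x)
        x -= dt * np.linalg.solve(H, g)   # Newton direction scaled by dt
    return x

print(newton_flow(np.array([-1.2, 1.0])))   # approaches the minimizer (1, 1)
```

With dt = 1 one step of this scheme is exactly the classical Newton iteration; smaller dt gives a damped variant that tracks the continuous trajectory more closely.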

Talk 3: Cancelled
Speaker: Emilio Vilches

Speakers

Juan Guillermo Garrido
Emilio Vilches
Location: Taper Hall (THH) 110, 3501 Trousdale Pkwy, Los Angeles, CA 90089
