Thursday July 24, 2025 4:15pm - 5:30pm PDT
Session: Optimization on Manifolds and Geometric Approaches
Chair: Ian McPherson

Talk 1: Convergence Rates for Riemannian Proximal Bundle Methods
Speaker: Ian McPherson
Abstract: We propose a novel class of Riemannian proximal bundle methods for optimization on Hadamard manifolds and provide convergence rates for this new approach. Our assumptions are weak: we relax the reliance on exponential maps and parallel transports, requiring only first-order retractions and vector transports. To our knowledge, these are the first non-asymptotic convergence rates for Riemannian proximal bundle methods, extending arguments that achieve optimal rates in the Euclidean case. Moreover, we show that, given exponential maps and parallel transports, we recover exactly the same rates as in the Euclidean setting.
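
For intuition, here is a minimal sketch (my illustration, not the paper's algorithm) of the first-order primitives the abstract refers to, on the unit sphere: a retraction in place of the exponential map and a projection-based vector transport in place of parallel transport. A plain Riemannian subgradient step is shown for concreteness; a proximal bundle method would additionally maintain a cutting-plane model in the tangent space, using the vector transport to carry stored subgradients between iterates. All function names are illustrative.

```python
import numpy as np

# Minimal sketch (not the authors' code): retraction and vector transport
# on the unit sphere, driving a basic Riemannian subgradient step.

def project_tangent(x, g):
    # Project an ambient-space vector g onto the tangent space at x (||x|| = 1).
    return g - np.dot(g, x) * x

def retract(x, v):
    # First-order retraction: step in the tangent direction, then renormalize.
    y = x + v
    return y / np.linalg.norm(y)

def vector_transport(y, v):
    # Transport v into the tangent space at y by orthogonal projection.
    # A bundle method would use this to move stored subgradients between iterates.
    return v - np.dot(v, y) * y

def subgradient_step(x, ambient_subgrad, step):
    # One Riemannian subgradient step using only the retraction.
    g = project_tangent(x, ambient_subgrad(x))
    return retract(x, -step * g)

# Example: minimize f(x) = <c, x> on the sphere; the minimizer is -c/||c||.
c = np.array([1.0, 2.0, 2.0])
x = np.array([1.0, 0.0, 0.0])
for k in range(200):
    x = subgradient_step(x, lambda x: c, 1.0 / (k + 1))
print(x)  # close to [-1/3, -2/3, -2/3]
```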

Talk 2: Incremental minimization in nonpositively curved geodesic spaces
Speaker: Ariel Goodwin
Abstract: Subgradient methods for minimizing geodesically convex functions on Hadamard manifolds have gained interest in recent years, but fall short in two respects: their complexity relies unavoidably on a lower curvature bound for the space, and they do not generalize well to metric spaces in the absence of local linearity. Complete geodesic metric spaces of nonpositive curvature, called Hadamard spaces, prove useful in modelling many applications and have a rich geometric structure enabling theoretical and computational aspects of convex optimization. It has recently been shown that a restricted class of functions on Hadamard spaces can be effectively minimized using an iteration resembling a subgradient method, with the same complexity result as the classical Euclidean subgradient method. In this work we propose a related class of functions which we call Busemann convex, admitting a notion of subgradient that is attuned to the geometry of the space. Many functions defined in terms of basic metric quantities are Busemann convex, and their subgradients are readily computed in terms of geodesics. We address the minimization of sums of Busemann convex functions with an incremental subgradient-style method and prove an associated complexity result. To illustrate the algorithm, we numerically compute medians of trees in the BHV phylogenetic tree space. This is joint work with Adrian Lewis, Genaro López-Acedo, and Adriana Nicolae. A preprint is available at https://arxiv.org/abs/2412.06730.
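
To make the incremental idea concrete, the sketch below (my illustration, not the authors' implementation) applies it to the median objective f(x) = sum_i d(x, a_i): each inner step moves along the geodesic from the current iterate toward one data point by a diminishing step length, which is exactly a subgradient step for a distance function since its subgradients have unit norm. The Euclidean geodesic is a stand-in; for BHV tree space one would substitute that space's geodesic routine (e.g. the Owen-Provan algorithm).

```python
import numpy as np

# Minimal sketch (illustrative): incremental subgradient-style iteration
# for the median problem f(x) = sum_i d(x, a_i) in a geodesic space.

def geodesic(x, a, t):
    # Point at parameter t in [0, 1] on the geodesic from x to a.
    return (1 - t) * x + t * a  # Euclidean stand-in; swap in for other spaces

def incremental_median(points, x0, n_epochs=100):
    x = x0.copy()
    k = 0
    for _ in range(n_epochs):
        for a in points:             # one incremental pass over the terms
            k += 1
            step = 1.0 / np.sqrt(k)  # diminishing step lengths
            dist = np.linalg.norm(x - a)
            if dist > 0:
                # move arc length min(step, dist) along the geodesic toward a
                x = geodesic(x, a, min(step / dist, 1.0))
    return x

pts = [np.array(p, float) for p in ([0, 0], [1, 0], [0, 1], [5, 5])]
print(incremental_median(pts, np.array([2.0, 2.0])))
```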

Talk 3: Interior Riemannian subgradient flow over manifold with boundary
Speaker: Kuangyu Ding
Abstract: We study a nonsmooth nonconvex optimization problem defined over a manifold with boundary, where the feasible set is given by the intersection of the closure of an open set and a smooth manifold. By endowing the open set with a Riemannian metric induced by a barrier function, we obtain a Riemannian subgradient flow—formulated as a differential inclusion—that remains strictly within the interior of the feasible set. This continuous dynamical system unifies two classes of iterative optimization methods, namely the Hessian barrier method and Bregman-type methods, by revealing that these methods can be interpreted as discrete approximations of the continuous flow. We analyze the long-term behavior of the trajectories generated by this dynamical system and show that many properties of the Hessian barrier and Bregman-type methods can be more insightfully understood through those of the continuous trajectory. For instance, the notorious spurious stationary points observed in Bregman-type methods are interpreted as stable equilibria of the dynamical system that do not correspond to true stationary points of the original problem. We prove that these spurious stationary points can be avoided if the strict complementarity condition holds. In the absence of this regularity condition, we propose a random perturbation strategy that ensures the trajectory converges (subsequentially) to an approximate stationary point. Building on these insights, we introduce an iterative Riemannian subgradient method—a form of interior point approach—that generalizes the existing Hessian barrier method for solving nonsmooth nonconvex optimization problems.
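
As a toy illustration (my sketch under simplifying assumptions, not the paper's method), consider the nonnegative orthant with the log barrier h(x) = -sum_i log x_i. The barrier-induced metric at x is Hess h(x) = diag(1/x_i^2), so the flow x' = -[Hess h(x)]^{-1} grad f(x) discretizes to the Hessian-barrier-style step below, whose iterates stay strictly inside the interior {x > 0} for small enough step sizes.

```python
import numpy as np

# Minimal sketch (illustrative): explicit Euler discretization of the
# barrier-induced gradient flow x' = -x^2 * g (componentwise) on {x > 0},
# which is the inverse-Hessian scaling of the log barrier h(x) = -sum log x_i.

def barrier_gradient_step(x, g, step):
    # One interior step; the x**2 scaling vanishes at the boundary,
    # which is what keeps small steps strictly feasible.
    x_new = x - step * (x ** 2) * g
    assert np.all(x_new > 0), "step too large: left the interior"
    return x_new

# Example: minimize f(x) = 0.5 * ||x - b||^2 over x >= 0 with b = [1, -1];
# the constrained minimizer is [1, 0], which the flow approaches from inside.
b = np.array([1.0, -1.0])
x = np.array([0.5, 0.5])
for _ in range(500):
    x = barrier_gradient_step(x, x - b, step=0.1)
print(x)  # near [1, 0] (slowly in the second coordinate), always with x > 0
```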

Speakers

Ian McPherson

Ariel Goodwin

Cornell University Center for Applied Mathematics, Ph.D. Student.

Kuangyu Ding

Taper Hall (THH) 201, 3501 Trousdale Pkwy, Los Angeles, CA 90089
