Session: Convexity and First-Order Riemannian methods
Chair: David Martínez-Rubio
Cluster: Optimization on Manifolds

Talk 1: Online Implicit Riemannian Optimization
Speaker: Christophe Roux
Abstract: Many optimization problems such as eigenvalue problems, principal component analysis and low-rank matrix completion can be interpreted as optimization problems over Riemannian manifolds, which allows for exploiting the geometric structure of the problems. While Riemannian optimization has been studied extensively in the offline setting, the *online* setting is not well understood. A major challenge in prior works was handling in-manifold constraints that arise in the online setting. We leverage *implicit* methods to address this problem and improve over existing regret bounds. Furthermore, using our new implicit online algorithms, we achieve accelerated rates for *constrained* Riemannian geodesically convex-concave min-max problems, which was previously only possible under additional assumptions.
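The distinction between explicit and implicit updates can be illustrated in the simpler Euclidean setting. The sketch below (an illustration, not the talk's algorithm) shows an implicit online step, i.e. online proximal point: for the quadratic loss $f_t(x) = \frac{1}{2}\|x - z_t\|^2$ the update $x_{t+1} = \arg\min_x \, \eta f_t(x) + \frac{1}{2}\|x - x_t\|^2$ has a closed form. The names `z` and `eta` are illustrative, not taken from the talk.

```python
import numpy as np

# Euclidean sketch of an *implicit* online step (online proximal point).
# For f_t(x) = 0.5 * ||x - z_t||^2, the implicit update
#   x_{t+1} = argmin_x  eta * f_t(x) + 0.5 * ||x - x_t||^2
# solves the first-order condition  eta * (x_new - z_t) + (x_new - x_t) = 0.
def implicit_step(x, z, eta):
    return (x + eta * z) / (1.0 + eta)

x = np.zeros(2)
for z in [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]:
    x = implicit_step(x, z, eta=0.5)  # stable for any eta > 0
```

Unlike an explicit gradient step, the implicit step is stable for every positive step size, which hints at why implicit methods help with constraints that explicit steps can violate.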

Talk 2: What do global Polyak-Łojasiewicz functions look like?
Speaker: Christopher Criscitiello
Abstract: In the optimization and machine learning literature, it is common to assume a smooth function $f \colon \mathbb{R}^n \to \mathbb{R}$ globally satisfies the Polyak-Łojasiewicz (PŁ) condition, as it guarantees global exponential convergence of gradient descent. We ask: what do such functions look like? Specifically, is the set of minimizers of $f$ necessarily a smooth manifold diffeomorphic to a Euclidean space? And if so, is there a diffeomorphism of $\mathbb{R}^n$ taking $f$ to a quadratic function? We completely answer both questions. The set of minimizers of $f$ forms a smooth manifold of dimension $k \geq 0$. We show that if $k = 0,1,2$, then the answer to both questions is yes: the set of minimizers of $f$ is necessarily diffeomorphic to a point, line or plane, and there exists a diffeomorphism taking $f$ to a quadratic. If $k \geq 3$, the answer is surprisingly no: there exists a global PŁ function whose set of minimizers is not diffeomorphic to a Euclidean space. We then give a necessary and sufficient condition under which the answer to both questions is yes for $k \geq 5$. An immediate consequence of our results is that every smooth global Polyak-Łojasiewicz function is geodesically convex in some metric. Moreover, if the function has a compact set of minimizers, then it is geodesically convex in a *flat* metric. Our proofs use tools from differential topology, like local stable manifold theorems and the Morse Lemma, and our counterexamples build on the famous Whitehead manifold. This is joint work with Quentin Rebjock and Nicolas Boumal.
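For readers unfamiliar with the PŁ condition, a minimal numerical illustration (added here, not part of the talk): $f(x, y) = x^2$ satisfies $\frac{1}{2}\|\nabla f\|^2 = 2f$, so it is globally PŁ with $\mu = 2$; its set of minimizers is the line $x = 0$ (so $k = 1$), and gradient descent contracts the optimality gap by a fixed factor per step.

```python
import numpy as np

# f(x, y) = x**2: a global PL function with a line of minimizers (k = 1).
# 0.5 * ||grad f||^2 = 2 * x**2 = 2 * f, so the PL constant is mu = 2,
# and f is L-smooth with L = 2.
def f(v):
    return v[0] ** 2

def grad_f(v):
    return np.array([2.0 * v[0], 0.0])

eta = 0.1                      # step size below 1/L
v = np.array([3.0, -1.0])
gaps = [f(v)]
for _ in range(30):
    v = v - eta * grad_f(v)
    gaps.append(f(v))
# each step multiplies the gap by (1 - 2 * eta)**2 = 0.64: linear convergence
```

The iterates converge to $(0, -1)$, a point on the minimizer manifold, not to any distinguished minimizer, which is exactly why the geometry of the minimizer set matters.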

Talk 3: Convergence and Trade-Offs in Riemannian Gradient Descent and Riemannian Proximal Point
Speaker: David Martínez-Rubio
Abstract: We revisit the analyses of the two most fundamental algorithms in geodesically convex optimization: Riemannian gradient descent and Riemannian proximal point. Previous analyses either did not fully quantify rates of convergence under standard assumptions or were of limited generality. We show that for different step sizes the iterates naturally stay in balls of different radii around an optimizer, depending on the initial distance and the curvature. This, along with the different subproblems one may choose to solve, allows us to design variants that trade off curvature dependence against the dependence of the rates on condition numbers or on the type of subproblem solved. Whether a single efficient algorithm can attain the best convergence of all of our variants remains an open question. We also provide new properties of proximal methods on Riemannian manifolds and an implementable inexact proximal point algorithm, yielding new results on min-max geodesically convex-concave problems.
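As a concrete instance of Riemannian gradient descent (a standard textbook sketch, not the talk's algorithm): computing a leading eigenvector, one of the motivating problems of the session, by minimizing $f(x) = -x^\top A x$ over the unit sphere, with projection onto the tangent space giving the Riemannian gradient and normalization serving as the retraction.

```python
import numpy as np

# Riemannian gradient descent on the sphere S^{n-1}: minimize f(x) = -x^T A x,
# whose minimizers are leading eigenvectors of the symmetric matrix A.
rng = np.random.default_rng(0)
G = rng.standard_normal((5, 5))
A = G + G.T                              # random symmetric matrix

x = rng.standard_normal(5)
x /= np.linalg.norm(x)                   # start on the sphere
eta = 0.05
for _ in range(1000):
    egrad = -2.0 * A @ x                 # Euclidean gradient of f
    rgrad = egrad - (x @ egrad) * x      # project onto the tangent space at x
    x = x - eta * rgrad                  # step along the tangent direction
    x /= np.linalg.norm(x)               # retraction: back onto the sphere

rayleigh = x @ A @ x                     # approaches lambda_max(A)
```

The step size and iteration count here are chosen for this small random instance; the talk's analysis concerns how such choices interact with curvature and condition numbers in general.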

Speakers

Christophe Roux

Christopher Criscitiello

David Martínez-Rubio
Wednesday July 23, 2025 4:15pm - 5:30pm PDT
Joseph Medicine Crow Center for International and Public Affairs (DMC) 100 3518 Trousdale Pkwy, 100, Los Angeles, CA 90089
