Thursday July 24, 2025 10:30am - 11:45am PDT
Session: Nonconvex methods and SDPs
Chair: Florentin Goyens
Cluster: Nonlinear Optimization

Talk 1: Higher-Order Newton Methods with Polynomial Work per Iteration
Speaker: Amir Ali Ahmadi
Abstract: We present generalizations of Newton’s method that incorporate derivatives of an arbitrary order $d$ but maintain a polynomial dependence on dimension in their cost per iteration. At each step, our $d$th-order method uses semidefinite programming to construct and minimize a sum-of-squares-convex (sos-convex) approximation to the $d$th-order Taylor expansion of the function we wish to minimize. We prove that our $d$th-order method has local convergence of order $d$. This results in lower oracle complexity compared to the classical Newton method. We show on numerical examples that basins of attraction around local minima can get larger as $d$ increases. Under additional assumptions, we present a modified algorithm, again with polynomial cost per iteration, which is globally convergent and has local convergence of order $d$. Co-authors: Abraar Chaudhry and Jeffrey Zhang.
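For intuition, a minimal sketch of the $d = 2$ special case is given below: there the Taylor model is quadratic and, when the Hessian is positive definite, the sos-convex step reduces to an ordinary Newton step. The test function, starting point, and stopping rule are illustrative assumptions, not the authors' implementation; for $d > 2$ the quadratic model would be replaced by an sos-convex polynomial minimized via an SDP.

```python
# Minimal sketch (illustrative, not the authors' code): the d = 2 case, where
# the degree-2 Taylor model is quadratic and the sos-convex step is just the
# ordinary Newton step. For d > 2 the talk's method instead builds an
# sos-convex model of the degree-d Taylor expansion and minimizes it by SDP.
import numpy as np

def f(x):
    # Illustrative test function (assumption): log-cosh in x[0] plus a quadratic in x[1].
    return np.log(np.cosh(x[0])) + x[1] ** 2

def grad(x):
    return np.array([np.tanh(x[0]), 2.0 * x[1]])

def hess(x):
    return np.diag([1.0 / np.cosh(x[0]) ** 2, 2.0])

def newton(x0, tol=1e-10, max_iter=50):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Minimize the quadratic Taylor model: solve hess(x) p = -grad(x).
        x = x + np.linalg.solve(hess(x), -g)
    return x

# Inside its basin the iteration converges quadratically to the minimizer at 0;
# starting with |x0[0]| larger than about 1.09 it diverges, illustrating the kind
# of small basin of attraction that the higher-order methods aim to enlarge.
print(newton([0.9, 1.0]))
```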

Talk 2: Sharp Global Guarantees for Nonconvex Low-rank Recovery in the Noisy Overparameterized Regime
Speaker: Richard Zhang
Abstract: Rank overparameterization was recently shown to make nonconvex low-rank matrix recovery increasingly benign under the restricted isometry property (RIP), by converting all local minima into global minima that exactly recover the ground truth. But this does not fully explain the practical success of overparameterization, because real algorithms can still become trapped at nonstrict saddle points (approximate second-order points with arbitrarily small negative curvature) even when all local minima are global. Moreover, the result does not accommodate noisy measurements, and it is unclear whether such an extension is even possible, in view of the many discontinuous and unintuitive behaviors already known for the overparameterized regime. In this paper, we introduce a novel proof technique that unifies, simplifies, and strengthens two previously competing approaches, one based on escape directions and the other based on the inexistence of counterexamples, to provide sharp global guarantees in the noisy overparameterized regime. We show that, once local minima have been converted into global minima through slight overparameterization, near-second-order points achieve the same minimax-optimal recovery bounds (up to small constant factors) as significantly more expensive convex approaches. Our results are sharp with respect to the noise level and the solution accuracy, and hold both for the symmetric parameterization $XX^T$ and for the asymmetric parameterization $UV^T$ under a balancing regularizer; we demonstrate that the balancing regularizer is indeed necessary.
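As a point of reference for the setup in this abstract, here is a hedged sketch of gradient descent on the symmetric parameterization $XX^T$ with noisy Gaussian measurements and a search rank larger than the true rank; the measurement operator, step size, rank choice, and noise level are illustrative assumptions, not the paper's experiments.

```python
# Illustrative sketch (assumptions: Gaussian measurement operator, plain
# gradient descent with a fixed step size, small-noise regime) of the
# symmetric parameterization XX^T with search rank r above the true rank.
# The talk's guarantees concern near-second-order points of this kind of
# objective; this is not the paper's experimental setup.
import numpy as np

rng = np.random.default_rng(0)
n, r_true, r, m = 30, 2, 5, 600             # r > r_true: slightly overparameterized

Z = rng.standard_normal((n, r_true)) / np.sqrt(n)
M_true = Z @ Z.T                             # ground-truth low-rank PSD matrix
A = rng.standard_normal((m, n, n)) / np.sqrt(m)
y = np.einsum('kij,ij->k', A, M_true) + 0.005 * rng.standard_normal(m)

def gradient(X):
    """Gradient of 0.5 * sum_k (<A_k, X X^T> - y_k)^2 with respect to X."""
    residuals = np.einsum('kij,ij->k', A, X @ X.T) - y
    G = np.einsum('k,kij->ij', residuals, A)
    return (G + G.T) @ X

X = 0.1 * rng.standard_normal((n, r))        # small random initialization
for _ in range(3000):
    X -= 0.05 * gradient(X)

err = np.linalg.norm(X @ X.T - M_true) / np.linalg.norm(M_true)
print(f"relative recovery error: {err:.3f}")  # typically small, set by the noise level
```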

Talk 3: HALLaR: A New Solver for Solving Huge SDPs
Speaker: Arnesh Sujanani
Abstract: This talk presents a new first-order method for solving large-scale semidefinite programs (SDPs) with bounded domain. It is an inexact augmented Lagrangian (AL) method in which the AL subproblems are solved by a hybrid low-rank method. In contrast to the classical low-rank method, the new method finds a near-optimal solution (with provable complexity bounds) of SDP instances satisfying strong duality. Computational results comparing the new method to state-of-the-art solvers on several large SDP instances show that the former finds more accurate solutions in substantially less CPU time. For example, in less than 20 minutes on a personal laptop, our method can solve a maximum stable set SDP with 1 million vertices and 10 million edges to within $10^{-5}$ relative accuracy. This is joint work with Renato D.C. Monteiro and Diego Cifuentes.
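For context, below is a toy-scale sketch of the maximum stable set SDP mentioned above, written in its Lovász theta form with CVXPY; a generic SDP solver like this is exactly what does not scale to the million-vertex instances the talk targets, which motivates the low-rank augmented Lagrangian approach.

```python
# Toy-scale sketch (assumption: the Lovasz theta formulation of the maximum
# stable set SDP, solved with a generic CVXPY backend). This only illustrates
# the problem class; it does not reflect the talk's solver or its scale.
import cvxpy as cp

# 5-cycle: stability number 2, Lovasz theta = sqrt(5) ~ 2.236
n = 5
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]

X = cp.Variable((n, n), symmetric=True)
constraints = [X >> 0, cp.trace(X) == 1]          # bounded domain: unit trace
constraints += [X[i, j] == 0 for (i, j) in edges]  # zero on edges of the graph

prob = cp.Problem(cp.Maximize(cp.sum(X)), constraints)
prob.solve()
print(f"Lovasz theta bound on the stable set size: {prob.value:.4f}")
```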

Speakers

Florentin Goyens

Postdoc, UCLouvain
Researcher in numerical optimization at UCLouvain in Belgium

Amir Ali Ahmadi


Richard Zhang


Arnesh Sujanani

Postdoctoral Fellow
Taper Hall (THH) 118, 3501 Trousdale Pkwy, Los Angeles, CA 90089
