Session: Algorithms for structured Riemannian optimization problems I
Chair: Jiang Hu
Cluster: Optimization on Manifolds

Talk 1: Bridging Convex and Nonconvex Approaches: Geodesic Strong Convexity of QCQPs from Tightness and Strict Complementarity of SDR
Speaker: Jinxin Wang
Abstract: Nonconvex quadratically constrained quadratic programs (QCQPs) have received increasing attention in recent decades. Two typical approaches to tackling such challenging problems are convex semidefinite relaxation (SDR) and nonconvex iterative methods (e.g., gradient descent). Although both the convex SDR and nonconvex approaches have been extensively studied, they have largely been explored in parallel, and little is known about their connections. In this talk, we take a first step toward building connections between the convex and nonconvex approaches. Under tightness and strict complementarity of the SDR, we find that nonconvex QCQPs admit nice geometric properties, including quadratic growth, an error bound, and even (quotient) geodesic strong convexity. These geometric properties quantify the growth behavior of optimization problems around their global optimal solution sets and imply fast convergence rates for various nonconvex iterative methods. This is joint work with Huikang Liu, Chonghe Jiang, and Anthony Man-Cho So.
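
For orientation, here is the generic correspondence between a QCQP and its semidefinite relaxation, written in standard notation. This is an illustrative sketch, not taken from the talk; the problem data (C, A_i, b_i) and the equality-constrained form are assumptions.

% A generic QCQP and its Shor semidefinite relaxation (SDR); notation assumed.
\begin{align*}
\text{(QCQP)} \quad & \min_{x \in \mathbb{R}^n} \; x^\top C x
  \quad \text{s.t.} \quad x^\top A_i x = b_i, \quad i = 1, \dots, m, \\
\text{(SDR)} \quad & \min_{X \succeq 0} \; \operatorname{tr}(C X)
  \quad \text{s.t.} \quad \operatorname{tr}(A_i X) = b_i, \quad i = 1, \dots, m.
\end{align*}
% The SDR is tight when some optimal X^* has rank one, i.e.,
% X^* = x^* (x^*)^\top for an optimal x^* of the QCQP; strict complementarity
% is the standard SDP regularity condition paired with tightness above.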

Talk 2: Rotation Group Synchronization via Quotient Manifold
Speaker: Linglingzhi Zhu
Abstract: Rotation group synchronization is a significant inverse problem that has attracted intense attention from application fields such as graph realization, computer vision, and robotics. In this talk, we focus on the least squares estimator for rotation group synchronization with general additive noise, a nonconvex optimization problem with manifold constraints. Departing from the standard approach of using the geometry of the ambient Euclidean space to study phase/orthogonal group synchronization, we adopt an intrinsic Riemannian approach to study rotation group synchronization. Benefiting from the quotient geometric view, we prove the negative definiteness of the quotient Riemannian Hessian around the optimal solution of the orthogonal group synchronization problem. Consequently, the Riemannian local error bound property holds and can be applied to analyze the convergence of various Riemannian algorithms. In addition, improved estimation results for the spectral and least squares estimators are derived, which guarantee the tightness of orthogonal group synchronization for solving its rotation group counterpart below a certain noise level. We further establish the sequential convergence of the Riemannian (quotient) gradient method for the orthogonal/rotation group synchronization problem and derive its linear convergence rate to the optimal solution under spectral initialization. All results are deterministic and require no probabilistic model.
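
As a point of reference, a minimal sketch of the least squares estimator referenced above, under the common measurement model in which one observes noisy relative rotations; the notation is assumed, not quoted from the talk.

% Least squares estimator for rotation group synchronization: recover
% R_1, ..., R_n in SO(d) from noisy relative measurements
% Y_{ij} = R_i R_j^\top + \Delta_{ij}.
\[
\min_{R_1, \dots, R_n \in \mathrm{SO}(d)} \;
  \sum_{i < j} \bigl\| Y_{ij} - R_i R_j^\top \bigr\|_F^2 .
\]
% The cost is invariant under the global action R_i \mapsto R_i Q for any
% Q \in SO(d), so optima form equivalence classes; this is the symmetry that
% the quotient manifold view factors out.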

Talk 3: Randomized Submanifold Subgradient Method for Optimization over Stiefel Manifolds
Speaker: Andy Cheung
Abstract: Optimization over Stiefel manifolds has found wide applications in many scientific and engineering domains. Despite considerable research effort, high-dimensional optimization problems over Stiefel manifolds remain challenging, and the situation is exacerbated by nonsmooth objective functions. The purpose of this paper is to propose and study a novel coordinate-type algorithm for weakly convex (possibly nonsmooth) optimization problems over high-dimensional Stiefel manifolds, named the randomized submanifold subgradient method (RSSM). Similar to coordinate-type algorithms in the Euclidean setting, RSSM exhibits low per-iteration cost and is suitable for high-dimensional problems. We prove that RSSM converges to the set of stationary points and attains ε-stationary points, with respect to a natural stationarity measure, in O(ε^{-4}) iterations in both the expectation and almost-sure senses. To the best of our knowledge, these are the first convergence guarantees for coordinate-type algorithms for optimizing nonconvex nonsmooth functions over Stiefel manifolds. An important technical tool in our convergence analysis is a new Riemannian subgradient inequality for weakly convex functions on proximally smooth matrix manifolds, which could be of independent interest.
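
To make the flavor of a coordinate-type manifold method concrete, below is a minimal NumPy sketch of one randomized block-coordinate Riemannian subgradient step on the Stiefel manifold St(n, p). This is a plausible reading of the idea, not the paper's RSSM: the actual submanifold update, sampling scheme, and step sizes may differ, and all function names here are hypothetical.

import numpy as np

def polar_retraction(Y):
    # Map a full-rank n x p matrix back onto St(n, p) = {X : X^T X = I_p}
    # via the polar decomposition (computed from a thin SVD).
    U, _, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ Vt

def block_riemannian_subgrad_step(X, euclid_subgrad, step, block_size, rng):
    # Keep the Euclidean subgradient only on a random block of columns,
    # project it onto the tangent space of St(n, p) at X, step, and retract.
    n, p = X.shape
    idx = rng.choice(p, size=block_size, replace=False)
    G = np.zeros_like(X)
    G[:, idx] = euclid_subgrad(X)[:, idx]
    sym = (X.T @ G + G.T @ X) / 2        # sym(X^T G)
    riem_grad = G - X @ sym              # tangent-space projection
    return polar_retraction(X - step * riem_grad)

# Example: minimize the weakly convex nonsmooth f(X) = ||A X||_1 over St(n, p).
rng = np.random.default_rng(0)
n, p = 50, 5
A = rng.standard_normal((n, n))
X = polar_retraction(rng.standard_normal((n, p)))
subgrad = lambda Z: A.T @ np.sign(A @ Z)
for k in range(200):
    X = block_riemannian_subgrad_step(X, subgrad, step=1e-2 / np.sqrt(k + 1),
                                      block_size=2, rng=rng)

Masking the subgradient outside the sampled block keeps per-iteration work low, while the projection-plus-retraction keeps the iterate feasible; the paper's RSSM restricts the update to a submanifold of the Stiefel manifold, which this sketch only approximates.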

Speakers

Jiang Hu


Jinxin Wang

Postdoc, University of Chicago
Name: Dr. Jinxin Wang
Title: Bridging Convex and Nonconvex Approaches: Geodesic Strong Convexity of QCQPs from Tightness and Strict Complementarity of SDR
Affiliation: University of Chicago Booth School of Business
Bio: Dr. Jinxin Wang works in nonconvex optimization (e.g., QCQPs, manifold...

Linglingzhi Zhu

Postdoctoral Fellow, Georgia Institute of Technology
Name: Dr. Linglingzhi Zhu
Title: Postdoctoral Fellow
Affiliation: Georgia Institute of Technology
Bio: Dr. Linglingzhi Zhu's research is grounded in the field of mathematical optimization and its interplay with variational analysis, Riemannian geometry, and high-dimensional statistical...

Andy Cheung

Instructor, The Hong Kong Polytechnic University
I am currently an instructor in the Department of Applied Mathematics at The Hong Kong Polytechnic University. Alongside my teaching role, I pursued a part-time Ph.D. at The Chinese University of Hong Kong from 2017 to 2025 in the Department of Systems Engineering and Engineering Management.
Wednesday July 23, 2025 1:15pm - 2:30pm PDT
Joseph Medicine Crow Center for International and Public Affairs (DMC) 154, 3518 Trousdale Pkwy, Los Angeles, CA 90089
