Session: Algorithms for structured Riemannian optimization problems I
Chair: Jiang Hu
Cluster: Optimization on Manifolds
Talk 1: Bridging Convex and Nonconvex Approaches: Geodesic Strong Convexity of QCQPs from Tightness and Strict Complementarity of SDR
Speaker: Jinxin Wang
Abstract: Nonconvex quadratically constrained quadratic programs (QCQPs) have received increasing attention in recent decades. Two typical approaches to tackling such challenging problems are convex semidefinite relaxation (SDR) and nonconvex iterative methods (e.g., gradient descent). Although both the convex SDR and nonconvex approaches have been extensively studied, they have largely been explored in parallel, and little is known about their connections. In this talk, we take a first step toward building such connections. Under tightness and strict complementarity of the SDR, we find that nonconvex QCQPs admit nice geometric properties, including quadratic growth, an error bound, and even (quotient) geodesic strong convexity. These geometric properties quantify the growth behavior of the objective around the set of global optimal solutions and imply fast convergence rates for various nonconvex iterative methods. This is joint work with Huikang Liu, Chonghe Jiang, and Anthony Man-Cho So.
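Example: the simplest QCQP exhibiting these phenomena is the Rayleigh quotient problem min x^T A x subject to ||x||^2 = 1, whose SDR is min tr(AX) subject to tr(X) = 1 and X positive semidefinite. The Python sketch below is our own toy illustration, not the speakers' code; the spectral-gap growth modulus it checks is specific to this instance. It verifies tightness and strict complementarity of the SDR and the resulting quadratic growth around the solution set {+/- x*}:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 8
    B = rng.standard_normal((n, n))
    A = (B + B.T) / 2                     # symmetric data matrix

    # Nonconvex QCQP: min x^T A x  s.t.  ||x||^2 = 1.
    # The global minimizers are +/- the eigenvector of the smallest eigenvalue.
    w, V = np.linalg.eigh(A)              # eigenvalues in ascending order
    x_star, f_star = V[:, 0], w[0]

    # SDR: min tr(AX)  s.t.  tr(X) = 1, X psd. Its optimal value is
    # lambda_min(A), attained at the rank-one X* = x* x*^T, so the relaxation
    # is tight. The dual slack is S* = A - lambda_min(A) I, and strict
    # complementarity (rank X* + rank S* = n) holds iff lambda_min is simple.
    gap = w[1] - w[0]
    print("strict complementarity:", gap > 1e-10)

    # Quadratic growth on the sphere: f(x) - f* >= (gap/2) dist(x, {+/-x*})^2,
    # verified here by random sampling.
    for _ in range(1000):
        x = rng.standard_normal(n)
        x /= np.linalg.norm(x)
        d2 = min(np.linalg.norm(x - x_star), np.linalg.norm(x + x_star)) ** 2
        assert x @ A @ x - f_star >= 0.5 * gap * d2 - 1e-9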
Talk 2: Rotation Group Synchronization via Quotient Manifold
Speaker: Linglingzhi Zhu
Abstract: Rotation group synchronization is a significant inverse problem that has attracted intense attention from numerous application fields such as graph realization, computer vision, and robotics. In this talk, we focus on the least squares estimator for rotation group synchronization with general additive noise, which is a nonconvex optimization problem with manifold constraints. Departing from the standard approach of using the geometry of the ambient Euclidean space to study phase/orthogonal group synchronization, we adopt an intrinsic Riemannian approach to study rotation group synchronization. Benefiting from the quotient geometric view, we prove that the quotient Riemannian Hessian is negative definite around the optimal solution of the orthogonal group synchronization problem. Consequently, the Riemannian local error bound property holds and can be applied to analyze the convergence of various Riemannian algorithms. In addition, we derive improved estimation results for the spectral and least squares estimators, which guarantee the tightness of orthogonal group synchronization for solving its rotation group counterpart below a certain noise level. We further establish a sequential convergence guarantee for the Riemannian (quotient) gradient method applied to the orthogonal/rotation group synchronization problem and derive its linear rate of convergence to the optimal solution under spectral initialization. All results are deterministic and require no probabilistic model.
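Example: the pipeline the abstract describes (spectral initialization followed by a Riemannian gradient method on the product of SO(d), with the quotient accounting for the unrecoverable common rotation) can be sketched in a few lines. Below is a toy Python illustration under an additive Gaussian noise model of our own choosing; the step size and iteration count are ad hoc, not the speakers' settings:

    import numpy as np

    rng = np.random.default_rng(1)
    n, d, sigma = 30, 3, 0.1

    def proj_SO(M):
        # Nearest rotation matrix: SVD projection with a determinant fix.
        U, _, Vt = np.linalg.svd(M)
        D = np.eye(M.shape[0]); D[-1, -1] = np.linalg.det(U @ Vt)
        return U @ D @ Vt

    # Ground-truth rotations and noisy pairwise observations Y_ij ~ R_i R_j^T.
    R_true = np.vstack([proj_SO(rng.standard_normal((d, d))) for _ in range(n)])
    Y = R_true @ R_true.T + sigma * rng.standard_normal((n * d, n * d))
    Y = (Y + Y.T) / 2

    # Spectral initialization: top-d eigenvectors of Y, blocks rounded to SO(d).
    _, V = np.linalg.eigh(Y)
    R = np.vstack([proj_SO(V[i*d:(i+1)*d, -d:]) for i in range(n)])

    # Riemannian gradient ascent for max tr(R^T Y R) over the product of SO(d):
    # blockwise tangent projection plus polar retraction. The fixed step is a
    # conservative choice tied to ||Y|| = O(n).
    step = 1.0 / (4 * n)
    for _ in range(300):
        G = 2 * Y @ R                     # Euclidean gradient
        for i in range(n):
            Ri, Gi = R[i*d:(i+1)*d], G[i*d:(i+1)*d]
            xi = Gi - Ri @ ((Ri.T @ Gi + Gi.T @ Ri) / 2)   # tangent component
            R[i*d:(i+1)*d] = proj_SO(Ri + step * xi)

    # The solution is defined only up to one global rotation (the quotient);
    # align before measuring the estimation error.
    g = proj_SO(R.T @ R_true / n)
    print("alignment error:", np.linalg.norm(R @ g - R_true) / np.sqrt(n))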
Talk 3: Randomized Submanifold Subgradient Method for Optimization over Stiefel Manifolds
Speaker: Andy Cheung
Abstract: Optimization over Stiefel manifolds has found wide applications in many scientific and engineering domains. Despite considerable research effort, high-dimensional optimization problems over Stiefel manifolds remain challenging, and the situation is exacerbated by nonsmooth objective functions. The purpose of this paper is to propose and study a novel coordinate-type algorithm for weakly convex (possibly nonsmooth) optimization problems over high-dimensional Stiefel manifolds, named the randomized submanifold subgradient method (RSSM). Like coordinate-type algorithms in the Euclidean setting, RSSM has low per-iteration cost and is suitable for high-dimensional problems. We prove that RSSM converges to the set of stationary points and attains ε-stationary points, with respect to a natural stationarity measure, in O(ε^{-4}) iterations in both the expectation and almost-sure senses. To the best of our knowledge, these are the first convergence guarantees for coordinate-type algorithms applied to nonconvex nonsmooth optimization over Stiefel manifolds. An important technical tool in the convergence analysis is a new Riemannian subgradient inequality for weakly convex functions on proximally smooth matrix manifolds, which may be of independent interest.
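Example: the core of RSSM is a subgradient step restricted to the submanifold of St(n, p) obtained by freezing all but a random block of columns, so each iteration touches only a few columns yet keeps the whole matrix feasible. The Python sketch below illustrates this mechanic; it is our own reconstruction from the abstract, not the paper's implementation, and the convex (hence weakly convex) nonsmooth test objective f(X) = ||AX||_1, the block size, and the diminishing step size are all illustrative choices:

    import numpy as np

    rng = np.random.default_rng(2)
    n, p, m, r = 50, 10, 200, 2          # St(n, p); r columns updated per step
    A = rng.standard_normal((m, n))

    # Nonsmooth weakly convex test objective on the Stiefel manifold:
    # f(X) = ||A X||_1, with Euclidean subgradient A^T sign(A X).
    f = lambda X: np.abs(A @ X).sum()
    subgrad = lambda X: A.T @ np.sign(A @ X)

    X, _ = np.linalg.qr(rng.standard_normal((n, p)))   # random feasible start

    for k in range(2000):
        B = rng.choice(p, size=r, replace=False)       # random column block
        C = np.setdiff1d(np.arange(p), B)
        XB, XC, GB = X[:, B], X[:, C], subgrad(X)[:, B]
        # Project the block subgradient onto the tangent space of the
        # submanifold {X' in St(n, p) : X'_C = X_C}: remove components along
        # the frozen columns, then the symmetric part against X_B.
        xi = GB - XC @ (XC.T @ GB)
        xi -= XB @ ((XB.T @ xi + xi.T @ XB) / 2)
        # Polar retraction of the block; since xi is orthogonal to the frozen
        # columns, the retracted block stays orthogonal to them as well.
        U, _, Vt = np.linalg.svd(XB - (0.1 / np.sqrt(k + 1)) * xi,
                                 full_matrices=False)
        X[:, B] = U @ Vt

    print("f(X) =", f(X))
    print("feasibility:", np.linalg.norm(X.T @ X - np.eye(p)))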