Session: Riemannian geometry, optimization, and applications
Chair: Wen Huang
Cluster: Optimization on Manifolds
Talk 1: A Riemannian Proximal Newton-CG Method
Speaker: Wen Huang
Abstract: The proximal gradient method and its variants have been generalized to Riemannian manifolds for solving optimization problems of the form $f + g$, where $f$ is continuously differentiable and $g$ may be nonsmooth. However, most of these generalizations do not achieve local superlinear convergence. Recently, a Riemannian proximal Newton method was developed for problems of this form in which the manifold $\mathcal{M}$ is a compact embedded submanifold and $g(x) = \lambda \|x\|_1$. Although this method converges superlinearly in a neighborhood of a solution, its global convergence is not guaranteed. The existing remedy is a hybrid approach: run a Riemannian proximal gradient method until the iterate is sufficiently accurate, then switch to the Riemannian proximal Newton method. This approach is sensitive to the switching parameter. In this talk, we propose a Riemannian proximal Newton-CG method that combines the truncated conjugate gradient method with the Riemannian proximal Newton method. Global convergence and local superlinear convergence are proven, and numerical experiments show that the proposed method outperforms other state-of-the-art methods.
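
Illustrative sketch (Python/NumPy; a hypothetical toy, not the speaker's method): the problem structure $f + g$ with $g(x) = \lambda \|x\|_1$ on a compact embedded submanifold can be made concrete with a simplified projected proximal gradient loop on the unit sphere for $f(x) = \tfrac{1}{2} x^\top A x$. A Euclidean soft-thresholding step followed by normalization back to the sphere stands in for the tangent-space proximal subproblem and retraction used by the actual Riemannian proximal gradient and Newton-CG methods.

import numpy as np

# Toy illustration only: projected proximal gradient step on the unit sphere
# for F(x) = 0.5 * x^T A x + lam * ||x||_1. The retraction is normalization.

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def sphere_retract(x):
    """Map a point back onto the unit sphere by normalization."""
    return x / np.linalg.norm(x)

def proximal_gradient_sphere(A, lam, x0, step=0.05, iters=200):
    x = sphere_retract(x0)
    for _ in range(iters):
        egrad = A @ x                       # Euclidean gradient of f
        rgrad = egrad - (x @ egrad) * x     # projection onto the tangent space
        x = sphere_retract(soft_threshold(x - step * rgrad, step * lam))
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 20
    M = rng.standard_normal((n, n))
    A = M + M.T                             # symmetric matrix for f(x) = 0.5 x^T A x
    x = proximal_gradient_sphere(A, lam=0.5, x0=rng.standard_normal(n))
    print("entries set to zero:", int(np.sum(np.abs(x) < 1e-8)), "of", n)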
Talk 2: Manifold Identification and Second-Order Algorithms for l1-Regularization on the Stiefel Manifold
Speaker: Shixiang Chen
Abstract: In this talk, we will discuss manifold identification for the l1-regularization problem on the Stiefel manifold. First, we will demonstrate that the intersection of the identified manifold with the Stiefel manifold forms a submanifold. Building on this, we will propose a novel second-order retraction-based algorithm specifically designed for the intersected submanifold. Numerical experiments confirm that the new algorithm exhibits superlinear convergence.
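
To make the manifold-identification idea concrete: for an l1-regularized problem the zero pattern of the iterates typically freezes after finitely many steps, and the remaining optimization takes place on the intersection of the Stiefel manifold with that fixed sparsity pattern. The snippet below (Python/NumPy, purely illustrative with synthetic iterates, not the algorithm of the talk) only shows how a stabilized zero pattern could be detected.

import numpy as np

# Hypothetical illustration of "manifold identification": once the zero pattern
# of the iterates X_k freezes, the remaining degrees of freedom live on the
# intersection {X in St(n, p) : X[i, j] = 0 for (i, j) in the identified zero set}.

def zero_pattern(X, tol=1e-10):
    """Boolean mask of entries treated as identified zeros."""
    return np.abs(X) <= tol

def pattern_stabilized(iterates, window=3, tol=1e-10):
    """True if the zero pattern is identical over the last `window` iterates."""
    if len(iterates) < window:
        return False
    patterns = [zero_pattern(X, tol) for X in iterates[-window:]]
    return all(np.array_equal(patterns[0], P) for P in patterns[1:])

if __name__ == "__main__":
    # Synthetic iterates whose pattern is frozen, for illustration only.
    rng = np.random.default_rng(1)
    X = np.linalg.qr(rng.standard_normal((6, 2)))[0]   # a point on St(6, 2)
    X[np.abs(X) < 0.3] = 0.0                           # impose a sparsity pattern
    iterates = [X + 1e-3 * (np.abs(X) > 0) * rng.standard_normal(X.shape)
                for _ in range(5)]
    print("zero pattern stabilized:", pattern_stabilized(iterates))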
Talk 3: A Riemannian Accelerated Proximal Gradient Method
Speaker: Shuailing Feng
Abstract: Riemannian accelerated gradient methods have been widely studied for smooth problems, but whether accelerated proximal gradient methods for nonsmooth composite problems on Riemannian manifolds can achieve provable acceleration remains unclear. Moreover, existing Riemannian accelerated gradient methods address the geodesically convex and geodesically strongly convex cases separately. In this work, we introduce a unified Riemannian accelerated proximal gradient method with a rigorous convergence rate analysis for optimization problems of the form $F(x) = f(x) + h(x)$ on manifolds, where $f$ is either geodesically convex or geodesically strongly convex, and $h$ is weakly retraction-convex. Our analysis shows that the proposed method achieves acceleration under appropriate conditions. Additionally, we introduce a safeguard mechanism to ensure convergence of the Riemannian accelerated proximal gradient method in non-convex settings. Numerical experiments demonstrate the practical effectiveness of our algorithms and corroborate the theoretical acceleration.
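
As a rough Euclidean analogue of the safeguard mechanism (not the Riemannian method of the talk), the sketch below (Python/NumPy) runs a FISTA-style accelerated proximal gradient loop for a lasso-type objective and, at every iteration, keeps the accelerated candidate only if its objective value is no worse than that of a plain proximal gradient step; this preserves monotone decrease of $F = f + h$ even without convexity.

import numpy as np

# Hedged Euclidean sketch of a safeguarded accelerated proximal gradient loop
# for F(x) = 0.5*||A x - b||^2 + lam*||x||_1 (illustration of the safeguard idea only).

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def safeguarded_apg(A, b, lam, x0, step, iters=300):
    f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
    F = lambda x: f(x) + lam * np.sum(np.abs(x))
    grad = lambda x: A.T @ (A @ x - b)
    x_prev, x, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))        # Nesterov sequence
        y = x + ((t - 1.0) / t_next) * (x - x_prev)               # momentum extrapolation
        x_acc = soft_threshold(y - step * grad(y), step * lam)    # accelerated candidate
        x_pg = soft_threshold(x - step * grad(x), step * lam)     # safeguard candidate
        x_prev, x = x, (x_acc if F(x_acc) <= F(x_pg) else x_pg)   # keep the lower objective
        t = t_next
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    A = rng.standard_normal((30, 50))
    b = rng.standard_normal(30)
    step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1 / Lipschitz constant of grad f
    x = safeguarded_apg(A, b, lam=0.1, x0=np.zeros(50), step=step)
    print("final objective:", 0.5 * np.sum((A @ x - b) ** 2) + 0.1 * np.sum(np.abs(x)))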