Session: Recent advances in algorithms for large-scale optimization (II)
Chair: Xudong Li
Cluster: Computational Software
Talk 1: Alternating minimization for distributionally robust principal component pursuit
Speaker: Xudong Li
Abstract: Recently, the square root principal component pursuit (SRPCP) model has garnered significant research interest. It has been shown in the literature that the SRPCP model guarantees robust matrix recovery with a universal, constant penalty parameter. While its statistical advantages are well studied, its computational aspects remain unexplored. In this talk, we focus on developing efficient algorithms for solving the SRPCP problem. Specifically, we propose a tuning-free alternating minimization (AM) algorithm in which each iteration involves subproblems with semi-closed-form updating rules. Additionally, we introduce techniques based on the variational formulation of the nuclear norm and the Burer-Monteiro (BM) decomposition to further accelerate the AM method. Extensive numerical experiments confirm the efficiency and robustness of our algorithms.
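A minimal sketch of the alternating-minimization pattern referred to above. It is written for the simpler squared-loss robust PCA surrogate min_{L,S} ||L||_* + lam*||S||_1 + (mu/2)*||M - L - S||_F^2, whose two subproblems have closed-form singular-value-thresholding and soft-thresholding solutions; the SRPCP model instead keeps the unsquared Frobenius residual, which is what makes its updates only semi-closed-form. The function names and parameter values are illustrative assumptions, not the talk's algorithm.

import numpy as np

def svt(X, tau):
    # Singular value thresholding: proximal operator of tau * nuclear norm.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(X, tau):
    # Entrywise soft-thresholding: proximal operator of tau * l1 norm.
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def am_robust_pca(M, lam=0.1, mu=10.0, iters=200):
    # Alternate exact minimization over L and S for
    #   min_{L,S} ||L||_* + lam*||S||_1 + (mu/2)*||M - L - S||_F^2.
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    for _ in range(iters):
        L = svt(M - S, 1.0 / mu)   # L-subproblem: prox of (1/mu)*nuclear norm
        S = soft(M - L, lam / mu)  # S-subproblem: prox of (lam/mu)*l1 norm
    return L, S

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    L0 = rng.standard_normal((50, 5)) @ rng.standard_normal((5, 50))    # low-rank part
    S0 = (rng.random((50, 50)) < 0.05) * rng.standard_normal((50, 50))  # sparse part
    L, S = am_robust_pca(L0 + S0)
    print("relative error in L:", np.linalg.norm(L - L0) / np.linalg.norm(L0))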
Talk 2: A new dual semismooth Newton method for polyhedral projections
Speaker: Chao Ding
Abstract: We propose a dual semismooth Newton method for computing the orthogonal projection onto a given polyhedron, a fundamental optimization problem that serves as a critical building block in numerous applications, e.g., financial risk management, statistics, and machine learning. Classical semismooth Newton methods typically rely on subgradient regularity assumptions to achieve local superlinear or quadratic convergence. Our approach, however, marks a significant breakthrough by demonstrating that it is always possible to identify a point at which the existence of a nonsingular generalized Jacobian is guaranteed, regardless of any regularity conditions. Furthermore, we explain this phenomenon and its relationship with the weak strict Robinson constraint qualification (W-SRCQ) from the perspective of variational analysis. Building on this theoretical advancement, we develop an inexact semismooth Newton method with superlinear convergence for solving the polyhedral projection problem.
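As a hedged illustration of the problem setup (the specific dual formulation and nonsmooth reformulation used in the talk may differ), projecting a point z onto the polyhedron {x : Ax <= b} admits the dual quadratic program

  \min_{x}\ \tfrac12\|x - z\|^2 \ \text{ s.t. }\ Ax \le b
  \qquad\Longleftrightarrow\qquad
  \min_{\lambda \ge 0}\ \theta(\lambda) := \tfrac12\|A^{\top}\lambda\|^2 - \lambda^{\top}(Az - b),
  \qquad x^{\star} = z - A^{\top}\lambda^{\star}.

A dual semismooth Newton method can then be applied to a nonsmooth reformulation of the optimality conditions, e.g., the piecewise-linear (hence strongly semismooth) natural residual

  F(\lambda) = \min\{\lambda,\ \nabla\theta(\lambda)\} = \min\{\lambda,\ AA^{\top}\lambda - (Az - b)\} = 0,

whose generalized Jacobians mix rows of the identity and of AA^{\top} according to the active index set. Without regularity assumptions these Jacobians can be singular at a solution, which is the kind of difficulty the guaranteed-nonsingularity result described in the abstract speaks to.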
Talk 3: A proximal DC algorithm for sample average approximation of chance constrained programming
Speaker: Rujun Jiang
Abstract: Chance constrained programming (CCP) refers to optimization problems with uncertain constraints that must be satisfied with at least a prescribed probability. In this work, we study the sample average approximation (SAA) method for chance constraints, an important approach to CCP in the data-driven setting where only a sample of realizations of the random vector in the constraints is available. The SAA method approximates the underlying distribution with the empirical distribution over the available sample. Assuming that the functions in the chance constraints are all convex, we reformulate the SAA of the chance constraints into a difference-of-convex (DC) form. By further assuming that the objective function is also a DC function, we obtain a DC constrained DC program. To solve this reformulation, we propose a proximal DC algorithm and show that its subproblems are suitable for off-the-shelf solvers in some scenarios. Moreover, we prove both subsequential and sequential convergence of the proposed algorithm and derive an iteration complexity bound for finding an approximate Karush-Kuhn-Tucker point. To support and complement the theoretical development, numerical experiments show that the proposed approach is competitive with several existing approaches.
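One standard route to the DC reformulation sketched above (the talk's precise construction may differ): given a sample \xi^{1},\dots,\xi^{N}, convex constraint functions G(\cdot,\xi^{i}), risk level \alpha, and K = \lfloor N\alpha \rfloor, the SAA chance constraint can be rewritten via sums of largest components,

  \frac{1}{N}\sum_{i=1}^{N}\mathbf{1}\{G(x,\xi^{i}) > 0\} \le \alpha
  \quad\Longleftrightarrow\quad
  T_{K+1}\big(G(x,\xi^{1}),\dots,G(x,\xi^{N})\big) - T_{K}\big(G(x,\xi^{1}),\dots,G(x,\xi^{N})\big) \le 0,

where T_k(y) denotes the sum of the k largest entries of y, so the left-hand side is the (K+1)-st largest constraint value. Each T_k composed with the convex functions G(\cdot,\xi^{i}) is convex, hence the constraint is a difference of convex functions; a proximal DC scheme then typically linearizes the subtracted part T_K at the current iterate and adds a proximal term to each subproblem.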