Session: The Role of Constraints in the Computational Complexity and Solvability of Optimization Problems
Chair: Tatjana Chavdarova
Cluster: Fixed Points and Variational Inequalities

Talk 1: On the Role of Constraints in the Complexity of Min-Max Optimization
Speaker: Gabriele Farina
Abstract: We investigate the role of constraints in the computational complexity of min-max optimization. The work of Daskalakis, Skoulakis, and Zampetakis [2021] was the first to study min-max optimization through the lens of computational complexity, showing that min-max problems with nonconvex-nonconcave objectives are PPAD-hard. However, their proof hinges on the presence of joint constraints between the maximizing and minimizing players. The main goal of this paper is to understand the role of these constraints in min-max optimization. The first contribution of this paper is a fundamentally new proof of their main result, which improves it in multiple directions: it holds for degree 2 polynomials, it is essentially tight in the parameters, and it is much simpler than previous approaches, clearly highlighting the role of constraints in the hardness of the problem. Second, we show that with general constraints (i.e., the min player and max player have different constraints), even convex-concave min-max optimization becomes PPAD-hard. Along the way, we also provide PPAD-membership of a general problem related to quasi-variational inequalities, which has applications beyond our problem.
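For orientation (a schematic added here, not taken from the abstract), the two constraint regimes at play can be written as follows, where f is the objective, x is the min player's variable, and y is the max player's variable:

```latex
% Independently constrained min-max: each player's feasible set is fixed.
\[ \min_{x \in \mathcal{X}} \; \max_{y \in \mathcal{Y}} \; f(x, y) \]

% Jointly constrained min-max: a coupling constraint g ties the players'
% feasible choices together, which is the source of hardness discussed above.
\[ \min_{x} \; \max_{y} \; f(x, y) \quad \text{s.t.} \quad g(x, y) \le 0 \]
```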

Talk 2: A First Order Primal-Dual Method for Solving Constrained Variational Inequalities
Speaker: Tatjana Chavdarova
Abstract: We introduce an interior-point method to solve constrained variational inequality (cVI) problems. By combining primal-dual and interior-point techniques, we develop a family of first-order methods called the ADMM-based interior-point method for constrained VIs (ACVI). This approach enables the solution of cVIs with complex constraints. We establish convergence guarantees for ACVI under the assumption that the operator is monotone (not necessarily L-Lipschitz) and that its subproblems can be solved exactly. Specifically, we achieve a convergence rate of O(1/sqrt(K)) for the gap function of the last iterate, matching the known lower bound. To address cases where subproblems cannot be solved analytically, we propose an inexact version of ACVI, which leverages a warm-starting technique for the subproblems, taking advantage of the fact that these subproblems change only slightly between iterations. When the operator is monotone, L-Lipschitz, and the errors in solving the subproblems decrease at appropriate rates, we demonstrate that the same convergence rate holds for the gap function. To the best of our knowledge, this is the first first-order interior-point method for general cVI problems with a global convergence guarantee.
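As a quick reference (standard VI notation assumed here, not spelled out in the abstract), the constrained VI problem and the gap function in which the O(1/sqrt(K)) rate is stated can be sketched as:

```latex
% Constrained variational inequality cVI(F, C): find x* in the constraint
% set C such that the operator F admits no feasible descent direction at x*.
\[ \langle F(x^{*}),\, x - x^{*} \rangle \;\ge\; 0 \qquad \forall\, x \in C \]

% Gap function measuring how far a candidate point x is from solving the cVI
% (the supremum is typically restricted to a bounded subset of C).
\[ \mathcal{G}(x) \;=\; \sup_{z \in C} \, \langle F(x),\, x - z \rangle \]
```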

Talk 3: The Computational Complexity of Finding Stationary Points in Non-convex Optimization
Speaker: Manolis Zampetakis
Abstract: Finding approximate stationary points, i.e., points where the gradient is approximately zero, of non-convex but smooth objective functions over unrestricted d-dimensional domains is one of the most fundamental problems in classical non-convex optimization. Nevertheless, the computational and query complexity of this problem are still not well understood when the dimension d of the problem is independent of the approximation error. In this paper, we show the following computational and query complexity results: 1. The problem of finding approximate stationary points over unrestricted domains is PLS-complete. 2. For d = 2, we provide a zero-order algorithm and a lower bound showing that this algorithm achieves the optimal query complexity in both the constrained and the unconstrained setting. 3. Combining our results with recent results in complexity theory, we show that finding approximate KKT points in constrained optimization is reducible to finding approximate stationary points in unconstrained optimization, but the converse is impossible.
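For context (standard definitions assumed here, not restated in the abstract), approximate stationarity and approximate KKT points can be sketched as follows, with f smooth and epsilon the approximation error:

```latex
% epsilon-approximate stationary point of smooth f over an unrestricted domain R^d:
\[ \| \nabla f(x) \| \;\le\; \varepsilon \]

% epsilon-approximate KKT point for min f(x) subject to g_i(x) <= 0:
% approximate Lagrangian stationarity, feasibility, and complementary slackness.
\[ \Big\| \nabla f(x) + \sum_{i} \lambda_i \nabla g_i(x) \Big\| \le \varepsilon, \qquad
   \lambda_i \ge 0, \quad g_i(x) \le \varepsilon, \quad \lambda_i \, |g_i(x)| \le \varepsilon \]
```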

Speakers

Gabriele Farina
Tatjana Chavdarova
Manolis Zampetakis
Tuesday July 22, 2025 4:15pm - 5:30pm PDT
Joseph Medicine Crow Center for International and Public Affairs (DMC), Room 256, 3518 Trousdale Pkwy, Los Angeles, CA 90089
