Session: Convex approaches to SDP
Chair: Richard Zhang
Cluster: Conic and Semidefinite Optimization
Talk 1: Semidefinite Relaxations for the Gromov-Wasserstein Problem
Speaker: Yong Sheng Soh
Abstract: The Gromov-Wasserstein (GW) problem is an extension of the classical optimal transport problem to settings where the source and target distributions reside in incomparable spaces, so that no cost function specifying the price of moving resources between them is available. The GW problem is a non-convex quadratic program and is intractable to solve in general. The sum-of-squares (SOS) hierarchy is a principled approach for deriving tractable semidefinite relaxations of generic polynomial optimization problems. In this talk, we describe a moment-SOS hierarchy for solving the GW problem. We discuss convergence rates as well as the sample complexity rates that arise from sampling. We show that these hierarchies define pseudo-metrics over the space of metric measure spaces. Lastly, we describe extensions of these relaxations to continuous measures. In particular, our work hints at applying semidefinite relaxations to general optimization problems over distributions that depend on the decision variable in a polynomial way.
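To make the non-convexity concrete, the sketch below evaluates the standard quadratic GW objective (sum of squared distortions between matched pairs, weighted by a coupling) on two toy two-point spaces. This is only an illustration of the objective the relaxations target, not the talk's moment-SOS construction; the example spaces and coupling are chosen for exposition.

```python
def gw_objective(dX, dY, pi):
    """Quadratic GW objective: for a coupling pi (a joint distribution over
    point pairs), sum the squared distortion (dX[i][k] - dY[j][l])^2 over
    all pairs of matches (i, j) and (k, l), weighted by pi."""
    m, n = len(dX), len(dY)
    total = 0.0
    for i in range(m):
        for j in range(n):
            for k in range(m):
                for l in range(n):
                    total += (dX[i][k] - dY[j][l]) ** 2 * pi[i][j] * pi[k][l]
    return total

# Two 2-point metric spaces; the second stretches the distance from 1 to 2.
dX = [[0.0, 1.0], [1.0, 0.0]]
dY = [[0.0, 2.0], [2.0, 0.0]]
pi = [[0.5, 0.0], [0.0, 0.5]]   # identity coupling, uniform marginals

print(gw_objective(dX, dX, pi))  # 0.0: isometric spaces incur no distortion
print(gw_objective(dX, dY, pi))  # 0.5: distortion (1 - 2)^2 on cross pairs
```

The objective is quadratic in the coupling `pi`, which is exactly what makes the problem a non-convex quadratic program once `pi` is the decision variable.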
Talk 2: Iterative methods for primal-dual scalings in conic optimization
Speaker: Lieven Vandenberghe
Abstract: A central element in the design of primal-dual interior-point methods for conic optimization is the definition of a suitable primal-dual scaling or metric. The talk will discuss simple iterative methods for computing primal-dual scalings. We will consider the Nesterov-Todd scaling for symmetric cones and extensions to sparse positive semidefinite matrix cones.
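For intuition about what a primal-dual scaling is, the sketch below computes the Nesterov-Todd scaling point in the simplest symmetric cone, the nonnegative orthant, where it has the closed form w_i = sqrt(x_i / s_i). The matrix-cone case the talk addresses replaces these componentwise operations with matrix square roots (which is what motivates iterative methods); this scalar example is only an assumption-free special case, not the talk's algorithm.

```python
import math

def nt_scaling_orthant(x, s):
    """Nesterov-Todd scaling point for the nonnegative orthant: the unique
    w > 0 satisfying w_i * s_i * w_i = x_i, i.e. w_i = sqrt(x_i / s_i)."""
    return [math.sqrt(xi / si) for xi, si in zip(x, s)]

x = [4.0, 1.0, 9.0]   # primal iterate
s = [1.0, 4.0, 1.0]   # dual iterate
w = nt_scaling_orthant(x, s)
print(w)  # [2.0, 0.5, 3.0]

# The scaling maps the primal and dual iterates to the same point:
# v = w^{-1} * x = w * s, which is what makes the scaled system symmetric.
v_primal = [xi / wi for xi, wi in zip(x, w)]
v_dual = [wi * si for wi, si in zip(w, s)]
print(v_primal == v_dual)  # True
```

In the positive semidefinite cone the analogous scaling W satisfies W S W = X, and forming it exactly requires matrix square roots of dense or sparse matrices, which is where iterative computation becomes attractive.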
Talk 3: Generalized Cuts and Grothendieck Covers: a Primal-Dual Approximation Framework Extending the Goemans-Williamson Algorithm
Speaker: Nathan Benedetto Proenca
Abstract: We provide a primal-dual framework for randomized approximation algorithms utilizing semidefinite programming (SDP) relaxations. Our framework pairs a continuum of APX-complete problems including MaxCut, Max2Sat, MaxDicut, and more generally, Max-Boolean Constraint Satisfaction and MaxQ (maximization of a positive semidefinite quadratic form over the hypercube) with new APX-complete problems which are stated as convex optimization problems with exponentially many variables. These new dual counterparts, based on what we call Grothendieck covers, range from fractional cut covering problems (for MaxCut) to tensor sign covering problems (for MaxQ). For each of these problem pairs, our framework transforms the randomized approximation algorithms with the best known approximation factors for the primal problems into randomized approximation algorithms for their dual counterparts with reciprocal approximation factors, which are tight with respect to the Unique Games Conjecture. For each APX-complete pair, our algorithms solve a single SDP relaxation and generate feasible solutions for both problems which also provide approximate optimality certificates for each other. Our work utilizes techniques from randomized approximation algorithms, convex optimization, and spectral sparsification, as well as Chernoff-type concentration results for random matrices.
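The Goemans-Williamson ingredient this framework extends is random-hyperplane rounding of an SDP solution. The sketch below runs that rounding step on the 5-cycle, a standard example whose SDP-optimal embedding is known in closed form (unit vectors in the plane with adjacent vertices 4*pi/5 apart); it is an illustration of the classical rounding step only, not of the paper's primal-dual framework or its covering duals.

```python
import math
import random

def cut_value(edges, signs):
    """Number of edges whose endpoints land on opposite sides."""
    return sum(1 for (u, v) in edges if signs[u] != signs[v])

# MaxCut SDP-optimal embedding for the 5-cycle: vertex k sits at angle
# 4*pi*k/5 on the unit circle, so adjacent vertices are 4*pi/5 apart.
n = 5
edges = [(k, (k + 1) % n) for k in range(n)]
vecs = [(math.cos(4 * math.pi * k / n), math.sin(4 * math.pi * k / n))
        for k in range(n)]

# SDP objective: sum over edges of (1 - <v_u, v_v>) / 2.
sdp_value = sum((1 - (vecs[u][0] * vecs[v][0] + vecs[u][1] * vecs[v][1])) / 2
                for (u, v) in edges)

# Goemans-Williamson rounding: cut by the sign pattern of a random hyperplane.
random.seed(0)
trials = 4000
total = 0
for _ in range(trials):
    phi = random.uniform(0, 2 * math.pi)        # random hyperplane normal
    r = (math.cos(phi), math.sin(phi))
    signs = [r[0] * vx + r[1] * vy >= 0 for (vx, vy) in vecs]
    total += cut_value(edges, signs)
avg = total / trials

print(round(sdp_value, 3))  # 4.523 = 5 * (1 - cos(4*pi/5)) / 2
print(avg)                  # 4.0: each edge is cut with probability 4/5
```

On this instance the realized ratio avg / sdp_value is about 0.884, comfortably above the Goemans-Williamson guarantee of roughly 0.878, which is the approximation factor whose reciprocal the dual covering algorithms achieve.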