Session: Modern Polynomial Optimization in Data Science I
Chair: Xindong Tang
Cluster: Conic and Semidefinite Optimization
Talk 1: Sparse matrix constrained polynomial optimization
Speaker: Xindong Tang
Abstract: We study sparse matrix Moment-SOS relaxations for solving sparse matrix constrained polynomial optimization. We prove a necessary and sufficient condition for the sparse matrix Moment-SOS relaxations to be tight, and we discuss how to certify tightness and how to extract minimizers. When the optimization problem is convex, we prove sufficient conditions for the sparse matrix Moment-SOS relaxations to be tight; in particular, we show that every sparse matrix Moment-SOS relaxation is tight when the problem is SOS-convex. Numerical experiments are provided to support the proposed method.
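For readers new to the machinery, the following is a minimal sketch of the plain scalar SOS lower bound that Moment-SOS relaxations build on; the toy polynomial f(x) = x^4 - 3x^2 + 2 and the use of CVXPY are illustrative assumptions, not the sparse matrix-constrained method of the talk.

```python
import cvxpy as cp

# Toy SOS lower bound for the hypothetical polynomial
# f(x) = x^4 - 3x^2 + 2: find the largest gamma such that
# f(x) - gamma = [1, x, x^2] Q [1, x, x^2]^T for some PSD Gram matrix Q.
gamma = cp.Variable()
Q = cp.Variable((3, 3), symmetric=True)

constraints = [
    Q >> 0,                       # Gram matrix is positive semidefinite
    Q[2, 2] == 1,                 # matches the x^4 coefficient
    2 * Q[1, 2] == 0,             # matches the x^3 coefficient
    2 * Q[0, 2] + Q[1, 1] == -3,  # matches the x^2 coefficient
    2 * Q[0, 1] == 0,             # matches the x coefficient
    Q[0, 0] == 2 - gamma,         # matches the constant term
]
cp.Problem(cp.Maximize(gamma), constraints).solve()
print(gamma.value)  # about -0.25, the global minimum of f
```

The SDP dual of this SOS program is the corresponding moment relaxation; the relaxation is called tight when the bound gamma equals the true minimum, which is the property studied in the talk.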
Talk 2: Symmetries in kernel learning
Speaker: Jack Kendrick
Abstract: This talk introduces symmetries in kernel learning, an area that combines polynomials and representation theory with kernel methods.
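One standard point of contact between representation theory and kernel methods is group-invariant kernels. The sketch below symmetrizes a base kernel by averaging over a finite group orbit; the cyclic-shift group and the RBF base kernel are assumptions made for illustration, not necessarily the construction of the talk.

```python
import numpy as np

# Symmetrize an RBF kernel over the cyclic group C_n acting on R^n
# by coordinate shifts (an illustrative choice of group and kernel).
def rbf(x, y, sigma=1.0):
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma**2))

def invariant_kernel(x, y):
    # Average one argument over the group orbit of y. Since the RBF
    # kernel is jointly invariant, k(g x, g y) = k(x, y) for any
    # permutation g, this single average equals the double average
    # over the group and therefore remains positive semidefinite.
    orbit = [np.roll(y, s) for s in range(len(y))]
    return sum(rbf(x, gy) for gy in orbit) / len(orbit)

x = np.array([1.0, 2.0, 3.0])
y = np.roll(x, 1)              # a cyclic shift of x
print(invariant_kernel(x, x))  # both calls agree: the symmetrized
print(invariant_kernel(x, y))  # kernel cannot distinguish x from g x
```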
Talk 3: Distributionally robust optimization with semi-algebraic structure
Speaker: Guoyin Li
Abstract: Real-world optimization often involves uncertain input data arising from prediction or measurement errors. Distributionally robust optimization (DRO) has emerged as an important tool for handling optimization under data uncertainty. In this talk, we show that a class of DRO problems whose loss functions enjoy suitable semi-algebraic structure can be equivalently reformulated as conic programming problems, under moment or Wasserstein ambiguity sets. If time permits, we will demonstrate the computational tractability and applicability of our reformulation results through numerical experiments on portfolio optimization and option pricing models.
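For context, the DRO problems in question take the following standard min-sup form, stated here schematically; the talk's semi-algebraic conic reformulation is not reproduced.

```latex
% Generic DRO problem with a Wasserstein ambiguity set (standard form):
\min_{x \in X} \; \sup_{\mathbb{P} \in \mathcal{A}}
    \mathbb{E}_{\xi \sim \mathbb{P}} \bigl[ \ell(x, \xi) \bigr],
\qquad
\mathcal{A} = \bigl\{ \mathbb{P} : W\bigl(\mathbb{P}, \widehat{\mathbb{P}}_N\bigr) \le \varepsilon \bigr\}
```

Here \ell is the loss, \widehat{\mathbb{P}}_N the empirical distribution of the data, W the Wasserstein distance, and \varepsilon the radius of the ambiguity set \mathcal{A}; a moment ambiguity set instead constrains moments of \mathbb{P}, such as its mean and covariance. The talk shows that, for suitable semi-algebraic \ell, the problem admits an exact conic programming reformulation.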