Tuesday July 22, 2025 10:30am - 11:45am PDT
Session: Moment-SOS Hierarchy: From Theory to Computation in the Real World (I)
Chair: Heng Yang
Cluster: Conic and Semidefinite Optimization

Talk 1: Polynomial Matrix Optimization, Matrix-Valued Moments, and Sparsity
Speaker: Jie Wang
Abstract: This talk will introduce the matrix version of the moment-SOS hierarchy for solving polynomial matrix optimization problems. Various types of sparsity will be discussed to improve the scalability of the approach.
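For the scalar case that the matrix hierarchy generalizes, the basic object is a Gram-matrix certificate: a polynomial is a sum of squares exactly when it admits a positive semidefinite Gram matrix in a monomial basis. A minimal numerical sketch (the polynomial and basis are illustrative, not taken from the talk):

```python
import numpy as np

# Certify that p(x) = x^4 + 2x^2 + 1 is a sum of squares by exhibiting
# a PSD Gram matrix Q with p(x) = z(x)^T Q z(x), where z(x) = [1, x, x^2].
Q = np.array([[1.0, 0.0, 1.0],
              [0.0, 0.0, 0.0],
              [1.0, 0.0, 1.0]])

# Recover the coefficients of z(x)^T Q z(x): the coefficient of x^k is
# the sum of Q[i, j] over index pairs with i + j == k.
coeffs = np.zeros(5)
for i in range(3):
    for j in range(3):
        coeffs[i + j] += Q[i, j]
print(coeffs)  # [1. 0. 2. 0. 1.], i.e. 1 + 2x^2 + x^4

# Q is PSD (eigenvalues 0, 0, 2), so p is SOS; indeed p = (x^2 + 1)^2.
print(np.linalg.eigvalsh(Q).min() >= -1e-12)
```

In the moment-SOS hierarchy, a semidefinite solver searches over such Gram matrices rather than checking a given one; the matrix version discussed in the talk works with matrix-valued moments instead of scalar ones.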

Talk 2: Practical non-symmetric cone programming algorithms for sums-of-squares optimization
Speaker: Dávid Papp
Abstract: Sums-of-squares relaxations of polynomial optimization problems are usually solved with off-the-shelf semidefinite programming (SDP) methods, which often runs into at least one of two problems when the relaxation order (that is, the degree of the SOS polynomials) is high: (1) the time and space complexity of SDP algorithms grows too fast to be practical; (2) the numerical conditioning of the SDPs makes it prohibitively hard to obtain accurate solutions. The talk focuses on how both can be mitigated by using polynomial interpolation and by replacing all-purpose semidefinite programming algorithms with non-symmetric cone programming methods that optimize directly over sums-of-squares cones. Generalizations to other nonnegativity certificates (SONC, SAGE) will also be briefly discussed.
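The conditioning point can be seen in miniature by comparing polynomial representations: the monomial-coefficient basis sampled at equispaced points is badly conditioned at high degree, while an interpolation representation at Chebyshev points is not. A hedged sketch (the degree and node choices are illustrative, not from the talk):

```python
import numpy as np

deg = 20

# Monomial basis sampled at equispaced points on [-1, 1]:
# the classic ill-conditioned Vandermonde matrix.
x_eq = np.linspace(-1.0, 1.0, deg + 1)
V_mono = np.vander(x_eq, increasing=True)

# Chebyshev basis sampled at the Chebyshev extrema: nearly orthogonal
# columns, so passing between values and coefficients is stable.
x_ch = np.cos(np.pi * np.arange(deg + 1) / deg)
V_cheb = np.polynomial.chebyshev.chebvander(x_ch, deg)

print(f"monomial / equispaced cond: {np.linalg.cond(V_mono):.2e}")
print(f"Chebyshev / Chebyshev cond: {np.linalg.cond(V_cheb):.2e}")
```

The gap of many orders of magnitude between the two condition numbers is one reason interpolation-based formulations of SOS programs behave better numerically at high degree.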

Talk 3: A squared smoothing Newton method for semidefinite programming
Speaker: Ling Liang
Abstract: This paper proposes a squared smoothing Newton method, based on the Huber smoothing function, for solving semidefinite programming problems (SDPs). We first study the fundamental properties of the matrix-valued mapping defined by the Huber function. Using these results together with existing ones in the literature, we conduct a rigorous convergence analysis for the proposed algorithm. In particular, we show that the method is well defined and globally convergent. Moreover, under suitable regularity conditions, namely primal and dual constraint nondegeneracy, the method attains a superlinear convergence rate. To evaluate practical performance, we conduct extensive numerical experiments on various classes of SDPs. Comparisons with state-of-the-art SDP solvers demonstrate that our method is also efficient at computing accurate solutions.
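The paper's matrix-valued Huber mapping is not reproduced here, but the general mechanism — smoothing a nonsmooth eigenvalue map so that Newton-type methods apply — can be sketched with a Huber-type smoothing of the PSD-cone projection (the exact smoothing function in the paper may differ):

```python
import numpy as np

def huber_plus(t, mu):
    """A C^1 Huber-type smoothing of max(0, t); converges to it as mu -> 0.
    (Illustrative form, not necessarily the one used in the paper.)"""
    return np.where(t <= 0.0, 0.0,
                    np.where(t >= mu, t - mu / 2.0, t * t / (2.0 * mu)))

def smoothed_psd_projection(X, mu):
    """Apply the smoothed plus-function to the eigenvalues of a symmetric
    matrix: a smooth surrogate for the projection onto the PSD cone."""
    w, V = np.linalg.eigh(X)
    return V @ np.diag(huber_plus(w, mu)) @ V.T

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
X = (A + A.T) / 2.0

# Exact Euclidean projection onto the PSD cone, for comparison.
w, V = np.linalg.eigh(X)
P_exact = V @ np.diag(np.maximum(w, 0.0)) @ V.T

# As mu -> 0 the smoothed map converges to the exact projection.
errs = [np.linalg.norm(smoothed_psd_projection(X, mu) - P_exact)
        for mu in (1.0, 0.1, 1e-3)]
print(errs)
```

Because the smoothed map is continuously differentiable for mu > 0, a Newton system can be formed at each iteration while mu is driven to zero, which is the broad template such smoothing Newton methods follow.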

Speakers

Heng Yang

Assistant Professor, Harvard University
Works on polynomial optimization and semidefinite programming.

Jie Wang


Dávid Papp


Ling Liang

Taper Hall (THH) 214, 3501 Trousdale Pkwy, Los Angeles, CA 90089
