Session: Techniques for Solving High-Dimensional and Nonlinear Optimization Problems
Chair: Wenqi Zhu
Cluster: Nonlinear Optimization

Talk 1: Solving exact and noisy rank-one tensor completion with semidefinite programming
Speaker: Zhuorui Li
Abstract: Consider the problem of retrieving a d-th order rank-one tensor given (exact or noisy) values of some of its entries. We address this problem via semidefinite programming (SDP). Concretely, we propose SDP relaxations involving an N-by-N matrix variable, where N is the number of entries of the tensor. We show that our SDPs solve the exact and noisy rank-one tensor completion problems when the observations satisfy a deterministic combinatorial condition, which we call square restricted propagation. We prove that the square restricted propagation condition holds with high probability when the number of uniformly random observations exceeds a certain threshold. Moreover, for matrix completion, square restricted propagation holds precisely when the completion problem admits a unique solution. Preliminary computational experiments show that our methods solve tensor completion problems using significantly fewer observations than alternative methods.
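
For readers unfamiliar with SDP-based completion, the sketch below shows the classical nuclear-norm relaxation for the matrix (d = 2) case in cvxpy. It is only a generic baseline, not the N-by-N relaxation proposed in the talk; the problem size and sampling rate are illustrative assumptions.

```python
# Generic baseline only: nuclear-norm matrix completion, an SDP-representable
# surrogate for low rank. The talk's relaxation instead uses an N-by-N matrix
# variable indexed by tensor entries and the square restricted propagation analysis.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
m, n = 8, 8
T = np.outer(rng.standard_normal(m), rng.standard_normal(n))  # rank-one ground truth
mask = rng.random((m, n)) < 0.5                                # observed entry pattern

X = cp.Variable((m, n))
constraints = [X[i, j] == T[i, j] for i, j in zip(*np.nonzero(mask))]
cp.Problem(cp.Minimize(cp.normNuc(X)), constraints).solve()

print("relative error:", np.linalg.norm(X.value - T) / np.linalg.norm(T))
```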

Talk 2: Numerical Solution for Nonlinear 4D Variational Data Assimilation (4D-Var) via ADMM
Speaker: Bowen Li
Abstract: Four-dimensional variational data assimilation (4D-Var) has emerged as an important methodology, widely used in numerical weather prediction, oceanographic modeling, and climate forecasting. Classical unconstrained gradient-based algorithms often struggle with local minima, making their numerical performance highly sensitive to the initial guess. In this study, we exploit the separable structure of the 4D-Var problem to propose a practical variant of the alternating direction method of multipliers (ADMM), referred to as the linearized multi-block ADMM with regularization. Unlike classical first-order optimization methods that primarily focus on the initial condition, our approach derives the Euler-Lagrange equation for the entire dynamical system, enabling more comprehensive and effective utilization of observational data. When the initial condition is poorly chosen, the argmin operation steers the iteration toward the observational data, thereby reducing sensitivity to the initial guess. The quadratic subproblems further simplify the solution process, while the parallel structure enhances computational efficiency, especially on modern hardware. To validate our approach, we demonstrate its superior performance on the Lorenz system, even in the presence of noisy observational data. Furthermore, we showcase the effectiveness of the linearized multi-block ADMM with regularization in solving 4D-Var problems for the viscous Burgers’ equation across various numerical schemes, including finite difference, finite element, and spectral methods. Finally, we illustrate the recovery of dynamics under noisy observational data in a 2D turbulence scenario, focusing in particular on vorticity concentration, highlighting the robustness of our algorithm in handling complex physical phenomena.
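
As a rough illustration of the splitting idea, the following sketch applies a plain two-block scaled ADMM to a toy 4D-Var instance under simplifying assumptions (linear dynamics, identity observation operator, synthetic data). It is not the speaker's linearized multi-block ADMM with regularization; it only shows the alternation between fitting observations and projecting onto model-consistent trajectories.

```python
# Toy 4D-Var via two-block scaled ADMM (assumptions: linear model M, identity
# observation operator). x is the free trajectory, z the model-consistent one.
import numpy as np

rng = np.random.default_rng(0)
n, T = 3, 20                                           # state dimension, time steps
M = np.eye(n) + 0.05 * rng.standard_normal((n, n))     # assumed linear model operator
truth = [rng.standard_normal(n)]
for _ in range(T - 1):
    truth.append(M @ truth[-1])
truth = np.array(truth)
y = truth + 0.05 * rng.standard_normal(truth.shape)    # noisy observations of the state

# z is model-consistent iff z_k = M^k z_0; stack the propagators once.
P = np.vstack([np.linalg.matrix_power(M, k) for k in range(T)])   # shape (T*n, n)

rho = 1.0
x = np.zeros((T, n)); z = np.zeros((T, n)); u = np.zeros((T, n))
for _ in range(200):
    # x-update: argmin ||x - y||^2 + (rho/2)||x - z + u||^2, closed form per entry
    x = (2 * y + rho * (z - u)) / (2 + rho)
    # z-update: project x + u onto the subspace of model-consistent trajectories
    z0, *_ = np.linalg.lstsq(P, (x + u).reshape(-1), rcond=None)
    z = (P @ z0).reshape(T, n)
    # dual update (scaled form)
    u += x - z

print("initial-condition error:", np.linalg.norm(z[0] - truth[0]))
```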

Talk 3: Generalized Nash equilibrium problems with quasi-linear constraints
Speaker: Jiyoung Choi
Abstract: We discuss generalized Nash equilibrium problems (GNEPs) given by polynomial functions. We consider the case in which each player's constraints are linear in their own strategy vector. For GNEPs of this kind, the KKT set can be conveniently represented as a union of simpler sets by using partial Lagrange multiplier expressions. This representation produces a set of branch polynomial optimization problems, each of which can be conveniently solved by Moment-SOS relaxations. We give a method to compute all generalized Nash equilibria under some genericity assumptions. Extensive numerical experiments demonstrate the efficiency of the proposed method.
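
To make the KKT-set idea concrete, here is a toy two-player GNEP whose constraints are linear in each player's own variable. Instead of the Moment-SOS relaxations used in the talk, this sketch simply solves the polynomial KKT system symbolically with sympy and filters candidates by feasibility and multiplier signs; the objectives, constraints, and variable names are illustrative assumptions.

```python
# Toy GNEP (assumed instance): player 1 minimizes (x-1)^2 over x with x <= y^2
# (linear in x), player 2 minimizes (y-x)^2 over y with 0 <= y <= 1. Since each
# player's problem is convex in its own variable, KKT points with feasibility and
# nonnegative multipliers are exactly the generalized Nash equilibria here.
import sympy as sp

x, y, lam, mu_lo, mu_hi = sp.symbols('x y lam mu_lo mu_hi', real=True)

f1 = (x - 1)**2          # player 1's objective
f2 = (y - x)**2          # player 2's objective

kkt = [
    sp.diff(f1, x) + lam,                 # stationarity, player 1 (constraint x - y^2 <= 0)
    sp.diff(f2, y) - mu_lo + mu_hi,       # stationarity, player 2 (constraints -y <= 0, y - 1 <= 0)
    lam * (x - y**2),                     # complementarity for x <= y^2
    mu_lo * y,                            # complementarity for y >= 0
    mu_hi * (y - 1),                      # complementarity for y <= 1
]

candidates = sp.solve(kkt, [x, y, lam, mu_lo, mu_hi], dict=True)

equilibria = []
for c in candidates:
    feasible = (c[x] - c[y]**2 <= 0) and (0 <= c[y] <= 1)
    dual_ok = all(c[m] >= 0 for m in (lam, mu_lo, mu_hi))
    if feasible and dual_ok:
        equilibria.append((c[x], c[y]))

print(sorted(set(equilibria)))   # both equilibria of this toy instance: (0, 0) and (1, 1)
```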

Speakers

Wenqi Zhu


Zhuorui Li


Bowen Li

Name: Bowen Li Title: PhD Candidate Affiliation: Academy of Mathematics and Systems Science, Chinese Academy of Sciences Bio: Bowen Li is currently working toward the Ph.D. degree at the Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing, China. His research interests…

Jiyoung Choi

Wednesday July 23, 2025 1:15pm - 2:30pm PDT
Joseph Medicine Crow Center for International and Public Affairs (DMC) 157 3518 Trousdale Pkwy, 157, Los Angeles, CA 90089
