Monday July 21, 2025 10:30am - 11:45am PDT
Session: Applications of polynomial optimization to data analysis I
Chair: Victor Magron
Cluster: Conic and Semidefinite Optimization

Talk 1: Learning polynomial Lyapunov functions from fixed data
Speaker: Oumayma Khattabi
Abstract: Stability analysis, a cornerstone of control theory, usually relies on an explicit system model. Recently, a surge of data-driven control methods has occurred, driven by data abundance, high computational power, and model shortcomings (inaccuracies, model complexity, etc.). This has given rise to new frameworks relying less on modelling and more on data. In particular, optimization-based frameworks that minimize generalization errors play a key role in this area of research; see e.g. Kernel Predictive Control (KPC) and Data-enabled Predictive Control (DeePC). More specifically, polynomial optimization and sum-of-squares (SoS) programming have found an important application in this field, in a recent contribution by T. Martin and F. Allgöwer, which relies on polynomial representations of model-free nonlinear systems through a set-membership characterization of Taylor polynomials derived from a noisy dataset. For better accuracy, multiple Taylor polynomials are combined into a piecewise polynomial representation, which enhances system property inference and allows for the verification of dissipativity properties. In this context, we propose a novel data-driven methodology to compute certified inner approximations of the region of attraction of an equilibrium point of a dynamical system with unknown model. The approach requires an input-output dataset and information on the variations of the dynamics (e.g. Lipschitz bounds), and returns a Lyapunov function valid for any dynamical system that matches the given dataset and bounds. To perform this task, the (compact) admissible state space is first partitioned into an appropriate tessellation, after which a polynomial Lyapunov candidate is assigned to each of the resulting cells. The Lyapunov condition is enforced on each cell and complemented with boundary conditions enforcing continuity of the resulting global, piecewise polynomial Lyapunov candidate. A key contribution is that the Lyapunov condition is split into learning subproblems, following the observation that the more datapoints there are, the more difficult it is to analyze the uncertainty set for the ground truth. The whole learning problem can be recast as an SoS programming problem, resulting in semidefinite programming problems (SDPs) of increasing size. Interestingly, thanks to our data-wise splitting, the special case of degree one, i.e. piecewise-affine Lyapunov candidates, can be relaxed into a second-order cone programming problem (SOCP) while maintaining convergence guarantees, resulting in much faster computations than the higher-degree SDP formulations. Another key contribution of this work, for higher polynomial degrees, is the inclusion of Lagrangian duality, which has not appeared in previous works on data-driven SoS programming for dynamical systems. This approach opens the door to a probabilistic interpretation of the methodology.
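
How such Lyapunov conditions become convex programs can be seen in the simplest, classical case: for a known linear system x' = Ax, a quadratic candidate V(x) = xᵀPx is a Lyapunov function if P ≻ 0 and AᵀP + PA ≺ 0, a pair of linear matrix inequalities. The sketch below solves this toy instance with cvxpy; the system matrix A, the margin eps, and the solver choice are illustrative assumptions, and this is not the talk's data-driven, piecewise construction.

```python
import cvxpy as cp
import numpy as np

# Toy known linear system x' = A x. The talk's setting has unknown
# dynamics seen only through data; here we only illustrate the convex
# Lyapunov certificate that SoS/SDP formulations build on.
A = np.array([[-1.0,  2.0],
              [ 0.0, -3.0]])
n = A.shape[0]

P = cp.Variable((n, n), symmetric=True)
eps = 1e-6  # margin making the semidefinite inequalities strict

constraints = [
    P >> eps * np.eye(n),                 # V(x) = x^T P x positive definite
    A.T @ P + P @ A << -eps * np.eye(n),  # dV/dt < 0 along trajectories
]
cp.Problem(cp.Minimize(0), constraints).solve(solver=cp.SCS)
print("Lyapunov matrix P:\n", P.value)
```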

Talk 2: A sparsified Christoffel function for high-dimensional inference
Speaker: Lucas Slot
Abstract: Christoffel polynomials are classical tools from approximation theory. They can be used to estimate the (compact) support of a measure based on its low-degree moments. Recently, they have been applied to problems in data science, including outlier detection and support inference. A major downside of Christoffel polynomials in such applications is the fact that, in order to compute their coefficients, one must invert a matrix whose size grows rapidly with the dimension. In this talk, we propose a modification of the Christoffel polynomial which is significantly cheaper to compute, but retains many of its desirable properties. Our approach relies on sparsity of the underlying measure, described by a graphical model. The complexity of our modification depends on the treewidth of this model. Based on joint work with Jean-Bernard Lasserre.
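
For context, the classical (dense) empirical Christoffel function is simple to compute from data, and doing so exposes the bottleneck mentioned above: the moment matrix has one row per monomial of degree at most d in n variables, i.e. size C(n+d, d), which grows rapidly with n. Below is a minimal sketch with a hypothetical Gaussian dataset and illustrative degree and regularization choices; it is the non-sparsified version, not the talk's modification.

```python
import numpy as np
from itertools import combinations_with_replacement

# Empirical Christoffel function for 2-D data with a degree-d monomial basis.
# Hypothetical data; in practice X would be the dataset under study.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
d = 2  # polynomial degree

def monomials(x, d):
    # All monomials in x of total degree <= d (including the constant 1).
    feats = [1.0]
    for k in range(1, d + 1):
        for idx in combinations_with_replacement(range(len(x)), k):
            feats.append(np.prod(x[list(idx)]))
    return np.array(feats)

V = np.array([monomials(x, d) for x in X])           # N x C(n+d, d) design matrix
M = V.T @ V / len(X)                                 # empirical moment matrix
Minv = np.linalg.inv(M + 1e-8 * np.eye(M.shape[0]))  # regularized inverse

def christoffel(x):
    v = monomials(x, d)
    return 1.0 / (v @ Minv @ v)  # small value => x lies outside the support

print(christoffel(np.zeros(2)), christoffel(np.array([5.0, 5.0])))
```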

Talk 3: Verifying Properties of Binary Neural Networks Using Sparse Polynomial Optimization
Speaker: Srećko Ðurašinović
Abstract: In this talk, we explore methods for verifying properties of Binary Neural Networks (BNNs), focusing on robustness against adversarial attacks. Despite their lower computational and memory needs, BNNs, like their full-precision counterparts, are sensitive to input perturbations. Established methods for solving this problem are predominantly based on Satisfiability Modulo Theories and Mixed-Integer Linear Programming techniques, which often face scalability issues. We introduce an alternative approach using Semidefinite Programming relaxations derived from sparse Polynomial Optimization. Our approach, compatible with continuous input spaces, not only mitigates numerical issues associated with floating-point calculations but also enhances verification scalability through the strategic use of tighter first-order semidefinite relaxations. We demonstrate the effectiveness of our method in verifying robustness against both infinity-norm and L2-norm based adversarial attacks.
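
The relaxation pattern at the core of such verifiers can be illustrated on its simplest building block: binary activations z ∈ {-1,1}ⁿ turn verification into a binary quadratic program, and the first-order (Shor) relaxation replaces the rank-one matrix zzᵀ with a PSD variable of unit diagonal. The sketch below applies this to a random quadratic objective; the matrix Q is a hypothetical placeholder, whereas a real verifier would derive the objective and constraints from the network weights and the attack model, and would exploit sparsity as in the talk.

```python
import cvxpy as cp
import numpy as np

# Shor-type SDP relaxation of a binary quadratic program, the basic
# building block behind SDP-based BNN verification: binary activations
# z in {-1,1}^n become a PSD matrix Z with unit diagonal.
rng = np.random.default_rng(1)
n = 5
Q = rng.normal(size=(n, n))
Q = (Q + Q.T) / 2  # symmetric objective matrix (placeholder data)

Z = cp.Variable((n, n), symmetric=True)
constraints = [Z >> 0, cp.diag(Z) == 1]  # relaxes Z = z z^T with z in {-1,1}^n

prob = cp.Problem(cp.Maximize(cp.trace(Q @ Z)), constraints)
prob.solve(solver=cp.SCS)
print("Upper bound on max_z z'Qz:", prob.value)
```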

Speakers

Oumayma Khattabi

PhD student, Paris-Saclay University
I am currently working on stability analysis of dynamic systems using data.

Lucas Slot

Postdoc, ETH Zurich

Srećko Ðurašinović

PhD Student, Nanyang Technological University, Singapore
Areas of interest: Sparse Polynomial Optimization, Neural Network Verification, Christoffel-Darboux Kernels
Taper Hall (THH) 119, 3501 Trousdale Pkwy, Los Angeles, CA 90089
