Session: Recent advances in mixed-integer programming
Chair: Linchuan Wei
Cluster: Interplay Between Continuous and Discrete Optimization

Talk 1: Serial and Parallel Two-Column Probing for Mixed-Integer Programming
Speaker: Yongzheng Dai
Abstract: Probing in mixed-integer programming (MIP) is a technique of temporarily fixing variables to discover implications that are useful to branch-and-cut solvers. Such fixing is typically performed one variable at a time; this paper instead develops a two-column probing scheme that fixes a pair of variables per iteration. Although the scheme involves more work per iteration than the one-column approach, the stronger implied bounds and additional conflicts it identifies can compensate. Indeed, our prototype implementation was awarded first prize at the MIP Workshop 2024 Computational Competition on novel presolving approaches. This paper presents the aforementioned (serial) prototype and additionally develops an efficient parallelization, leveraging hardware acceleration to further improve overall solve times. Compared to serial two-column probing, our parallel version sacrifices some strength per pair probed in exchange for greatly increasing the total number of pairs probed; computational experiments demonstrate its promise.
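The core idea of pairwise probing can be illustrated with a toy sketch (not the authors' implementation): for a pair of binary variables, enumerate all four joint fixings and use simple minimum-activity bound propagation to flag assignments that are provably infeasible. The constraint representation and helper names below are illustrative assumptions.

```python
from itertools import product

def min_activity(coeffs, fixed):
    # Minimum possible value of sum(coeffs[k] * x[k]) over binary x,
    # given a partial fixing of some variables.
    total = 0.0
    for k, a in coeffs.items():
        if k in fixed:
            total += a * fixed[k]
        elif a < 0:
            total += a  # setting x[k] = 1 minimizes the activity
    return total

def two_column_probe(constraints, i, j):
    """constraints: list of (coeffs_dict, rhs), each meaning sum <= rhs.
    Returns the set of (x_i, x_j) assignments proven infeasible."""
    conflicts = set()
    for vi, vj in product((0, 1), repeat=2):
        fixed = {i: vi, j: vj}
        if any(min_activity(c, fixed) > rhs for c, rhs in constraints):
            conflicts.add((vi, vj))
    return conflicts

# Toy instance: x0 + x1 <= 1 and x0 - x1 <= 0.
cons = [({0: 1, 1: 1}, 1), ({0: 1, 1: -1}, 0)]
# Both (1, 0) and (1, 1) are conflicts, implying the fixing x0 = 0 --
# an implication that probing x0 and x1 separately would also need
# two passes to combine.
print(two_column_probe(cons, 0, 1))
```

A real presolver would additionally collect implied bounds on continuous variables from each feasible fixing; this sketch only records conflicts.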

Talk 2: Integer Programming for Learning Directed Acyclic Graphs from Non-identifiable Gaussian Models
Speaker: Tong Xu
Abstract: We study the problem of learning directed acyclic graphs from continuous observational data, generated according to a linear Gaussian structural equation model. State-of-the-art structure learning methods for this setting have at least one of the following shortcomings: i) they cannot provide optimality guarantees and can suffer from learning sub-optimal models; ii) they rely on the stringent assumption that the noise is homoscedastic, and hence the underlying model is fully identifiable. We overcome these shortcomings and develop a computationally efficient mixed-integer programming framework for learning medium-sized problems that accounts for arbitrary heteroscedastic noise. We present an early stopping criterion under which we can terminate the branch-and-bound procedure to achieve an asymptotically optimal solution and establish the consistency of this approximate solution. In addition, we show via numerical experiments that our method outperforms state-of-the-art algorithms and is robust to noise heteroscedasticity, whereas the performance of some competing methods deteriorates under strong violations of the identifiability assumption. The software implementation of our method is available as the Python package micodag.
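A key ingredient in MIP approaches to DAG learning is encoding acyclicity with linear constraints. One standard device (not necessarily the exact formulation of this paper) pairs edge indicators with topological-order variables: an edge u -> v is allowed only if order[u] < order[v]. The sketch below verifies, by brute force on all 3-node digraphs, that this order encoding is feasible exactly for the acyclic graphs.

```python
from itertools import combinations, permutations

def is_acyclic(n, edges):
    # DFS-based cycle check on a directed graph with nodes 0..n-1.
    adj = {u: [v for (a, v) in edges if a == u] for u in range(n)}
    state = [0] * n  # 0 = unvisited, 1 = on stack, 2 = done
    def dfs(u):
        state[u] = 1
        for v in adj[u]:
            if state[v] == 1 or (state[v] == 0 and dfs(v)):
                return True  # back edge found: cycle
        state[u] = 2
        return False
    return not any(state[u] == 0 and dfs(u) for u in range(n))

def order_encoding_feasible(n, edges):
    # The MIP constraint "order[u] < order[v] for every selected edge"
    # is feasible iff some topological order exists.
    return any(all(order[u] < order[v] for u, v in edges)
               for order in permutations(range(n)))

n = 3
all_edges = [(u, v) for u in range(n) for v in range(n) if u != v]
for r in range(len(all_edges) + 1):
    for edges in combinations(all_edges, r):
        assert is_acyclic(n, list(edges)) == order_encoding_feasible(n, edges)
```

In an actual solver, the order variables are integers bounded by n - 1 and the implication is linearized big-M style; here permutation enumeration stands in for the solver.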

Talk 3: Disjunctive Sum of Squares
Speaker: Yixuan Hua
Abstract: We introduce the concept of disjunctive sum of squares for certifying nonnegativity of polynomials. Unlike the popular sum of squares approach, where nonnegativity is certified by a single algebraic identity, the disjunctive sum of squares approach certifies nonnegativity with multiple algebraic identities. Our main result is a disjunctive Positivstellensatz proving that we can keep the degree of each algebraic identity as low as the degree of the polynomial whose nonnegativity is in question. Based on this result, we construct a semidefinite-programming-based converging hierarchy of lower bounds for the problem of minimizing a polynomial over a compact basic semialgebraic set, where the size of the largest semidefinite constraint is fixed throughout the hierarchy. We further prove a second disjunctive Positivstellensatz which leads to an optimization-free hierarchy for polynomial optimization. We specialize this result to the problem of proving copositivity of matrices. Finally, we describe how the disjunctive sum of squares approach can be combined with a branch-and-bound algorithm, and we present numerical experiments on polynomial, copositive, and combinatorial optimization problems.
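The single algebraic identity that the abstract contrasts with is the classical sum-of-squares certificate: p(x) = z(x)^T Q z(x) for a monomial basis z and a positive semidefinite Gram matrix Q. A minimal sketch, for the hand-picked example p(x) = x^4 + 2x^2 + 1 = (x^2 + 1)^2 (the choice of p and Q is illustrative, not from the talk):

```python
import numpy as np

# Monomial basis z(x) = [1, x, x^2]; candidate Gram matrix Q with
# p(x) = z(x)^T Q z(x) for p(x) = x^4 + 2x^2 + 1.
Q = np.array([[1.0, 0.0, 1.0],
              [0.0, 0.0, 0.0],
              [1.0, 0.0, 1.0]])

# Verify the algebraic identity: expand z^T Q z and compare coefficients
# of 1, x, x^2, x^3, x^4 (entry (i, j) contributes to the x^(i+j) term).
coeffs = np.zeros(5)
for i in range(3):
    for j in range(3):
        coeffs[i + j] += Q[i, j]
assert np.allclose(coeffs, [1, 0, 2, 0, 1])

# Verify positive semidefiniteness: all eigenvalues >= 0 (up to tolerance).
eigs = np.linalg.eigvalsh(Q)
print(bool(np.all(eigs >= -1e-9)))  # True: Q certifies p >= 0 on all of R
```

The disjunctive approach replaces this single (Q, identity) pair with several lower-degree identities, each valid on a piece of a cover of the domain, which is what keeps the largest semidefinite block fixed along the hierarchy.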

Speakers
Yongzheng Dai


Tong Xu


Yixuan Hua

Princeton University
I am a third-year PhD student in the Operations Research and Financial Engineering (ORFE) department at Princeton University, co-advised by Prof. Amir Ali Ahmadi and Prof. Bartolomeo Stellato. My research focuses on mathematical optimization, with various applications in machine learning…
Wednesday July 23, 2025 1:15pm - 2:30pm PDT
Joseph Medicine Crow Center for International and Public Affairs (DMC) 155 3518 Trousdale Pkwy, 155, Los Angeles, CA 90089

