Monday July 21, 2025 1:15pm - 2:30pm PDT
Session: Advances in Domain-Specific Languages for Optimization
Chair: Steven Diamond
Cluster: Computational Software

Talk 1: Ten Years of CVXPY
Speaker: William Zhang
Abstract: The CVXPY project has been under development for over ten years. The initial motivation was to reproduce in Python the functionality of the CVX MATLAB package. However, early CVXPY design decisions diverged from CVX, with a greater emphasis on modular problem construction, expression trees, and parameterized problems. We explore how these initial design decisions led to later research on object-oriented optimization, matrix-free optimization, and differentiable optimization. We conclude with an overview of upcoming CVXPY features.
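
For illustration, here is a minimal sketch of the modular, parameterized problem construction the abstract describes, using CVXPY's public API; the data A and b and the parameter sweep values are placeholders chosen for this example:

    import cvxpy as cp
    import numpy as np

    # Placeholder problem data, for illustration only.
    np.random.seed(0)
    A, b = np.random.randn(20, 5), np.random.randn(20)

    x = cp.Variable(5)                 # decision variable
    gamma = cp.Parameter(nonneg=True)  # parameter: re-solve without rebuilding

    # The objective is an expression tree built from atomic functions.
    objective = cp.Minimize(cp.sum_squares(A @ x - b) + gamma * cp.norm1(x))
    problem = cp.Problem(objective, [x >= -1, x <= 1])

    for g in [0.1, 1.0, 10.0]:         # parameter sweep reuses the same problem
        gamma.value = g
        problem.solve()
        print(g, problem.value)

The same problem object is constructed once and re-solved for each parameter value, which is the parameterized-problem design the abstract highlights.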

Talk 2: CvxLean, a convex optimization modeling framework based on the Lean 4 proof assistant
Speaker: Paul Jackson
Abstract: A proof assistant is a computer environment in which mathematical definitions, theorems, and proofs can be expressed, developed, and checked in a completely formal way. At heart, CvxLean is a realization of the disciplined convex programming (DCP) paradigm in the proof assistant Lean 4. As with other DCP frameworks, optimization problems can initially be expressed in a language of atomic functions. Rules are implemented for automatically inferring that problems are convex, and automatic transformations are supported for reducing problems to conic form. Currently CvxLean uses Mosek to solve these reduced problems. Unlike in other frameworks, in CvxLean the convexity rules and transformation rewrites must all be formally proven correct, giving the user extra confidence in the reduction process. Also, initial problems do not have to be expressed directly in terms of atomic functions, nor do they have to satisfy the rules for inferring convexity. Instead, users can draw on a range of standard definitions available in Lean's mathematical library. If this is done, standard Lean machinery can be used to formally manipulate problems into recognizably convex forms built from the atomic functions. These manipulations can be tedious to guide, and recent work has explored using e-graph rewriting to discover them automatically.
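
As a rough illustration of the workflow described above, the sketch below shows the general shape of a small problem stated in CvxLean; the exact surface syntax (the optimization / minimize / subject to notation and the solve command) is an assumption based on the project's documented style, not a verbatim excerpt:

    import CvxLean

    -- A small linear program stated directly in Lean 4.  CvxLean's verified
    -- DCP machinery infers convexity and reduces the problem to conic form,
    -- which is then handed to Mosek.
    def p :=
      optimization (x y : ℝ)
        minimize 40 * x + 30 * y
        subject to
          c₁ : 12 ≤ x + y
          c₂ : 16 ≤ x + 2 * y
          c₃ : 0 ≤ x
          c₄ : 0 ≤ y

    solve p  -- each rewrite in the reduction carries a formal correctness proof

Problems that are not already in this atomic form can first be massaged into it with ordinary Lean proofs drawing on the mathematical library, or, per the recent work mentioned above, with e-graph-driven rewriting.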

Talk 3: Disciplined Saddle Programming
Speaker: Philipp Schiele
Abstract: We consider convex-concave saddle point problems and, more generally, convex optimization problems we refer to as saddle problems, which include the partial supremum or infimum of convex-concave saddle functions. Saddle problems arise in a wide range of applications, including game theory, machine learning, and finance. It is well known that a saddle problem can be reduced to a single convex optimization problem by dualizing either the convex (min) or concave (max) objectives, converting a min-max problem into a min-min (or max-max) problem. Carrying out this conversion by hand can be tedious and error-prone. In this work we introduce disciplined saddle programming (DSP), a domain-specific language (DSL) for specifying saddle problems, for which the dualizing trick can be automated. The language and methods are based on recent work by Juditsky and Nemirovski, who developed the idea of conic-representable saddle point programs and showed how to carry out the required dualization automatically using conic duality. Juditsky and Nemirovski's conic representation of saddle problems extends Nesterov and Nemirovski's earlier development of conic-representable convex problems; DSP can be thought of as extending disciplined convex programming (DCP) to saddle problems. Just as DCP makes it easy for users to formulate and solve complex convex problems, DSP allows users to easily formulate and solve saddle problems. Our method is implemented in an open-source package, also called DSP.
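
As a concrete sketch of how such a saddle problem might be specified, the example below uses the open-source DSP package together with CVXPY to state a small matrix game; the names dsp.inner, dsp.MinimizeMaximize, and dsp.SaddlePointProblem follow the package's documentation as best recalled here and should be treated as assumptions rather than a verified excerpt:

    import numpy as np
    import cvxpy as cp
    import dsp  # the open-source DSP package described in the abstract

    # Matrix game: minimize over x, maximize over y, payoff x^T C y,
    # with x and y each constrained to the probability simplex.
    C = np.array([[1.0, 2.0], [3.0, 1.0]])
    x = cp.Variable(2, nonneg=True)
    y = cp.Variable(2, nonneg=True)

    payoff = dsp.inner(x, C @ y)           # convex in x, concave in y
    objective = dsp.MinimizeMaximize(payoff)
    constraints = [cp.sum(x) == 1, cp.sum(y) == 1]

    problem = dsp.SaddlePointProblem(objective, constraints)
    problem.solve()                        # dualization of the inner problem is automated
    print(problem.value, x.value, y.value)

Done by hand, the same reduction would require dualizing the inner maximization over y, which is exactly the tedious, error-prone step the abstract says DSP automates.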

Speakers
Steven Diamond
Paul Jackson
Philipp Schiele
Location: Taper Hall (THH) 108, 3501 Trousdale Pkwy, Los Angeles, CA 90089
