Wednesday July 23, 2025 10:30am - 11:45am PDT
Session: Duality in Optimization for Data Science
Chairs: Ahmet Alacaoglu; Yura Malitsky; Stephen J. Wright
Cluster: Optimization For Data Science

Talk 1: Addressing Misspecification in Contextual Optimization
Speaker: Jiawei Zhang
Abstract: We study a linear contextual optimization problem in which a decision maker has access to historical data and contextual features and learns a cost prediction model aimed at minimizing decision error. We adopt the predict-then-optimize framework for this analysis. Since perfect alignment between the model and reality is often unrealistic in practice, we focus on scenarios where the chosen hypothesis set is misspecified. In this setting, it remains unclear whether current contextual optimization approaches can effectively handle model misspecification. In this paper, we present a novel integrated learning and optimization approach designed to tackle model misspecification in contextual optimization. The approach offers theoretical generalizability, tractability, and optimality guarantees, along with strong practical performance. Our method minimizes a tractable surrogate loss that aligns with the decision performance induced by the cost-vector predictions, whether or not the model is misspecified, and can be optimized in reasonable time. To our knowledge, no previous work has provided an approach with such guarantees under model misspecification.
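As background, the predict-then-optimize pipeline that this abstract builds on can be sketched as follows. This is a generic illustration of decision error (regret) for a downstream linear program, not the speaker's surrogate loss; the toy instance and all names are illustrative:

```python
import numpy as np
from scipy.optimize import linprog

def decide(c, A_ub, b_ub):
    """Solve the downstream linear program: min_x c^T x, s.t. A x <= b, x >= 0."""
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
    return res.x

def decision_error(c_true, c_pred, A_ub, b_ub):
    """Regret of acting on predicted costs instead of the true ones."""
    x_pred = decide(c_pred, A_ub, b_ub)  # decision induced by the prediction
    x_true = decide(c_true, A_ub, b_ub)  # oracle decision under true costs
    return c_true @ x_pred - c_true @ x_true  # >= 0 by optimality of x_true

# Toy instance: choose between two goods under one capacity constraint.
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
c_true = np.array([-1.0, -2.0])  # true costs (maximize value = minimize negatives)
c_pred = np.array([-2.0, -1.0])  # a misspecified prediction flips the ranking
err = decision_error(c_true, c_pred, A, b)  # positive regret
```

The point of the surrogate-loss idea is that training the predictor on a plain regression loss over cost vectors need not minimize this regret, which is what ultimately matters.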

Talk 2: Density Estimation from Moments
Speaker: Michael Friedlander
Abstract: We present a maximum entropy method for estimating probability densities from a limited set of moment measurements, with applications to x-ray Thomson scattering in high-energy physics. A stable dual formulation using indirect linear algebra operations yields robust density estimates.
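As a rough illustration of the idea (not the speaker's method), a maximum-entropy density matching a few moments can be recovered by minimizing the smooth convex dual built from the log-partition function. The sketch below assumes a polynomial moment basis and a fixed grid; every name in it is illustrative:

```python
import numpy as np
from scipy.optimize import minimize

def maxent_from_moments(moments, grid):
    """Fit p(x) proportional to exp(sum_k lam_k x^k) on a fixed grid,
    matching the given moments, by minimizing the convex dual
    D(lam) = log Z(lam) - <lam, m>."""
    K = len(moments)
    m_vec = np.asarray(moments, dtype=float)
    powers = np.vstack([grid**k for k in range(1, K + 1)])  # shape (K, n)
    dx = grid[1] - grid[0]

    def dual(lam):
        logits = lam @ powers
        shift = logits.max()                      # numerical stability
        Z = np.sum(np.exp(logits - shift)) * dx   # Z(lam) = e^shift * sum
        return np.log(Z) + shift - lam @ m_vec

    res = minimize(dual, np.zeros(K), method="BFGS")
    logits = res.x @ powers
    p = np.exp(logits - logits.max())
    p /= np.sum(p) * dx                           # normalize to a density
    return p

# Usage: density on [-3, 3] with mean 0 and second moment 1
# (close to a truncated Gaussian).
grid = np.linspace(-3.0, 3.0, 400)
p = maxent_from_moments([0.0, 1.0], grid)
```

The dual's gradient is exactly the moment mismatch, so stationarity enforces the moment constraints; the talk's "indirect linear algebra operations" refer to the stable solution of this dual, not to this toy grid discretization.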

Talk 3: A Dual-Certificate Analysis for Neural Network Optimization Problems
Speaker: Rahul Parhi
Abstract: We consider the problem of optimizing neural networks with common regularization schemes such as weight decay (which corresponds to standard Tikhonov regularization). In the case of shallow neural networks, it turns out that this non-convex optimization problem can be lifted to a convex optimization problem posed over a space of Radon measures. This enables us to bring tools from convex analysis to study properties of solutions to these problems. Via a novel dual-certificate analysis of the lifted problem for multivariate and vector-valued neural networks, we recover several recent observations on the structure of global minima of the nonconvex optimization problem that were previously made only in the univariate case. Furthermore, this result also sheds light on the challenges that arise in the study of high-dimensional data-fitting problems compared to the simplified univariate setting. (This talk is based on joint work with Greg Ongie.)
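A schematic form of the lifted problem, as it appears in the convex neural network literature (the notation here is illustrative, not necessarily the speaker's): for a shallow network f(x) = Σᵢ vᵢ σ(wᵢᵀx + bᵢ) trained with weight decay, the nonconvex training problem is equivalent to a convex one over a Radon measure μ on the hidden-unit parameter space Θ,

```latex
\[
  \min_{\mu \in \mathcal{M}(\Theta)} \;
  \sum_{j=1}^{n} \ell\!\left( \int_{\Theta} \sigma(w^{\top} x_j + b)\,
  d\mu(w, b),\; y_j \right) \;+\; \lambda\, \|\mu\|_{\mathrm{TV}},
\]
```

where the total-variation norm of μ plays the role of the weight-decay penalty. A dual certificate is then a continuous function on Θ whose extremal values certify the support of an optimal measure, which is how structural statements about global minima are extracted.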

Speakers

Stephen J. Wright

UW-Madison
Stephen J. Wright is the George B. Dantzig Professor of Computer Sciences, Sheldon Lubar Chair of Computer Sciences, and Hilldale Professor at the University of Wisconsin-Madison. He also serves as Chair of the Computer Sciences Department. His research is in computational optimization...
Jiawei Zhang
Michael Friedlander
Ahmet Alacaoglu
Rahul Parhi
Yura Malitsky
Taper Hall (THH) 208 3501 Trousdale Pkwy, 208, Los Angeles, CA 90089
