Wednesday July 23, 2025 10:30am - 11:45am PDT
Session: Duality in Optimization for Data Science
Chair: Ahmet Alacaoglu
Cluster: Optimization For Data Science

Talk 1: Addressing Misspecification in Contextual Optimization
Speaker: Jiawei Zhang
Abstract: We study a linear contextual optimization problem where a decision maker has access to historical data and contextual features to learn a cost prediction model aimed at minimizing decision error. We adopt the predict-then-optimize framework for this analysis. Since perfect model alignment with reality is often unrealistic in practice, we focus on scenarios where the chosen hypothesis set is misspecified. In this setting, it remains unclear whether current contextual optimization approaches can effectively handle such model misspecification. In this paper, we present a novel integrated learning and optimization approach designed to tackle model misspecification in contextual optimization. The approach offers theoretical generalizability, tractability, and optimality guarantees, along with strong practical performance. Our method minimizes a tractable surrogate loss that aligns with the decision performance induced by the predicted cost vectors, regardless of whether the model is misspecified, and can be optimized in reasonable time. To our knowledge, no previous work has provided an approach with such guarantees in the context of model misspecification.
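The predict-then-optimize setup the abstract builds on can be made concrete with the standard SPO+ surrogate loss of Elmachtoub and Grigas — a previously known surrogate shown here only for illustration, not the talk's new surrogate. The decision set below (the probability simplex, so the optimal decision is the cheapest vertex) is an illustrative assumption:

```python
import numpy as np

def decide(c):
    """Oracle for min over the simplex of c.w: pick the cheapest vertex."""
    w = np.zeros_like(c)
    w[np.argmin(c)] = 1.0
    return w

def spo_plus_loss(c_hat, c):
    """SPO+ surrogate loss for prediction c_hat against true cost c.

    For the simplex decision set, max_{w} (c - 2*c_hat).w reduces to a
    coordinate-wise max. The loss is zero for a perfect prediction.
    """
    w_star = decide(c)            # optimal decision under the true cost
    z_star = c @ w_star           # optimal objective value
    return np.max(c - 2.0 * c_hat) + 2.0 * c_hat @ w_star - z_star

c = np.array([3.0, 1.0, 2.0])
print(spo_plus_loss(c, c))        # perfect prediction -> 0
print(spo_plus_loss(np.array([1.0, 3.0, 2.0]), c))  # bad ranking -> positive
```

The surrogate upper-bounds the decision regret while remaining convex in the prediction, which is what makes it trainable in reasonable time.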

Talk 2: Density Estimation from Moments
Speaker: Michael Friedlander
Abstract: We present a maximum entropy method for estimating probability densities from a limited set of moment measurements, with applications to x-ray Thomson scattering in high-energy physics. A stable dual formulation using indirect linear algebra operations yields robust density estimates.
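The dual formulation mentioned in the abstract can be sketched on a grid: maximizing entropy subject to moment constraints is equivalent to minimizing the convex dual log Z(λ) − λ·μ, whose gradient is the moment mismatch. A minimal numpy sketch with the first two moments of a standard normal as targets (plain gradient descent on the dual; the stable indirect linear-algebra solver from the talk is of course more sophisticated):

```python
import numpy as np

# Discretize densities on [-5, 5]; target the moments of a standard normal.
x = np.linspace(-5.0, 5.0, 401)
dx = x[1] - x[0]
phi = np.vstack([x, x**2])          # moment functions phi_k(x) = x, x^2
mu = np.array([0.0, 1.0])           # target moments E[x], E[x^2]

# The maximum-entropy density has the form p(x) ~ exp(lam . phi(x)); the
# dual objective log Z(lam) - lam . mu is convex with gradient E_p[phi] - mu,
# so gradient descent drives the recovered moments to mu.
lam = np.zeros(2)
for _ in range(5000):
    w = np.exp(lam @ phi)
    p = w / (w.sum() * dx)          # normalized density on the grid
    grad = phi @ p * dx - mu        # moment mismatch
    lam -= 0.01 * grad

moments = phi @ p * dx
print(moments)                      # close to [0, 1]
```

At the dual optimum the recovered density is (up to discretization) the truncated standard normal, and its grid moments match the targets.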

Talk 3: A Dual-Certificate Analysis for Neural Network Optimization Problems
Speaker: Rahul Parhi
Abstract: We consider the problem of optimizing neural networks with common regularization schemes such as weight decay (which corresponds to standard Tikhonov regularization). In the case of shallow neural networks, it turns out that this non-convex optimization problem can be lifted to a convex optimization problem posed over a space of Radon measures. This enables us to bring tools from convex analysis to study properties of solutions to these problems. Via a novel dual-certificate analysis of the lifted problem for multivariate and vector-valued neural networks, we prove that solutions to the original non-convex problem are always unique. These unique neural network solutions also have widths (numbers of neurons) bounded by the square of the number of training data points, regardless of the level of overparameterization. This result recovers recent observations in the literature that were made only in the univariate case. Furthermore, it sheds light on the "critical level" of overparameterization necessary for neural networks.
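The lifting to measures hinges on the 1-homogeneity of the ReLU: scaling a neuron's inner weights up and its outer weight down leaves the network function unchanged, so minimizing weight decay over such rescalings turns the quadratic penalty (v² + ‖w‖²)/2 into the ℓ¹-type cost |v|·‖w‖ per neuron. A small numpy check of that identity (illustrative only, not code from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))
w = rng.standard_normal(3) * 2.0          # inner weights of one ReLU neuron
v = 0.3                                   # outer weight

relu = lambda z: np.maximum(z, 0.0)
f = v * relu(X @ w)                       # neuron output on the data

# Rebalance: v * relu(w.x) == (v/a) * relu(a*w.x) for any a > 0, since the
# ReLU is 1-homogeneous. The weight-decay cost (v^2 + ||w||^2)/2 is
# minimized over a at a^2 = |v| / ||w||, where it equals |v| * ||w||.
a = np.sqrt(abs(v) / np.linalg.norm(w))
v2, w2 = v / a, a * w
f_rebalanced = v2 * relu(X @ w2)

cost = 0.5 * (v2**2 + np.linalg.norm(w2) ** 2)
print(np.allclose(f, f_rebalanced))       # the function is unchanged
print(np.isclose(cost, abs(v) * np.linalg.norm(w)))
```

This per-neuron ℓ¹ cost is what makes the lifted problem a convex program over measures, where dual certificates can be brought to bear.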

Speakers
Ahmet Alacaoglu
Jiawei Zhang
Michael Friedlander
Rahul Parhi
Taper Hall (THH) 208, 3501 Trousdale Pkwy, Los Angeles, CA 90089
