Monday July 21, 2025 4:15pm - 5:30pm PDT
Session: Recent Advances in Large-scale Optimization I
Chair: Salar Fattahi
Cluster: Nonlinear Optimization

Talk 1: Algorithms for nonconvex nonsmooth optimization
Speaker: Jong-Shi Pang
Abstract: This talk focuses on algorithms for nonconvex, nonsmooth optimization.

Talk 2: Discrete optimization methods for compressing foundation models
Speaker: Rahul Mazumder
Abstract: Foundation models have achieved remarkable performance across various domains, but their large sizes lead to high computational costs (storage, inference latency, memory, etc.). Neural network pruning, roughly categorized as unstructured or structured, aims to reduce these costs by removing less important parameters while retaining as much model utility as possible. Depending on the available hardware, different types of pruning approaches are useful. In this talk, I will discuss discrete optimization methods for such problems. At a high level, these are related to cardinality-constrained least squares problems involving billions of variables, and they require large-scale algorithms that can run on GPUs.
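
For orientation, the sketch below illustrates the cardinality-constrained least squares problem class the abstract mentions, using iterative hard thresholding, a generic textbook method rather than the talk's algorithm; the function name and parameters are illustrative, and the example runs at toy scale rather than the billions of variables discussed in the talk.

```python
import numpy as np

def iht_least_squares(X, y, k, n_iters=200):
    """Iterative hard thresholding for: min ||Xw - y||^2  s.t.  ||w||_0 <= k.

    Toy-scale illustration only; the pruning problems in the talk involve
    billions of variables and GPU-resident solvers.
    """
    n, d = X.shape
    step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1/L with L = ||X||_2^2, the gradient's Lipschitz constant
    w = np.zeros(d)
    for _ in range(n_iters):
        w -= step * (X.T @ (X @ w - y))      # gradient step on 0.5 * ||Xw - y||^2
        # hard-threshold: zero out all but the k largest-magnitude coordinates
        if k < d:
            smallest = np.argpartition(np.abs(w), d - k)[: d - k]
            w[smallest] = 0.0
    return w

# Example: recover a 5-sparse weight vector from noiseless measurements.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 50))
w_true = np.zeros(50)
w_true[:5] = rng.standard_normal(5)
w_hat = iht_least_squares(X, X @ w_true, k=5)
```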

Talk 3: A Parametric Approach for Solving Convex Quadratic Optimization with Indicators Over Trees
Speaker: Salar Fattahi
Abstract: In this talk, we discuss convex quadratic optimization problems with indicator variables, each associated with a continuous variable, focusing on the case where the matrix defining the quadratic term is positive definite and its sparsity pattern is the adjacency matrix of a tree. We introduce a graph-based dynamic programming algorithm that solves this problem in quadratic time and memory. Central to our algorithm is a precise parametric characterization of the cost function across the nodes of the graph corresponding to distinct variables. Computational experiments on both synthetic and real-world datasets demonstrate the superior performance of our algorithm compared to existing algorithms and state-of-the-art mixed-integer optimization solvers. An important application of our algorithm is the real-time inference of Gaussian hidden Markov models from data affected by outlier noise. Using a real on-body accelerometer dataset, we solve instances of this problem with over 30,000 variables in under a minute, and its online variant within milliseconds, on a standard computer.
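
As a point of reference, quadratic optimization with indicators is commonly written as the mixed-integer program below; the complementarity constraint and the sparsity penalty weight λ are one standard formulation of this problem class, not necessarily the talk's exact one.

```latex
\min_{x \in \mathbb{R}^n,\; z \in \{0,1\}^n}
  \tfrac{1}{2}\, x^{\top} Q x + c^{\top} x + \lambda \sum_{i=1}^{n} z_i
\quad \text{s.t.} \quad x_i (1 - z_i) = 0, \;\; i = 1, \dots, n,
```

where Q is positive definite and Q_ij is nonzero only when (i, j) is an edge of the underlying tree; the constraint forces x_i = 0 whenever its indicator z_i = 0.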

Speakers
Jong-Shi Pang

Rahul Mazumder

Salar Fattahi

Taper Hall (THH) 116, 3501 Trousdale Pkwy, Los Angeles, CA 90089
