Session: Relaxations of Optimization Problems and Extreme Point Results in Infinite Dimensions
Chair: Rahul Parhi
Cluster: Nonsmooth Optimization

Talk 1: On Extremal Points for Some Vectorial Total Variation Seminorms
Speaker: Daniel Walter
Abstract: We consider the set of extremal points of the generalized unit ball induced by gradient total variation seminorms for vector-valued functions on bounded Euclidean domains. These extremal points are central to the understanding of sparse solutions and sparse optimization algorithms for variational regularization problems posed among such functions. For cases that are not fully vectorial, in which either the domain or the target is one-dimensional or the sum of the total variations of the components is used, we prove that these extremals are characterized exactly as in the scalar-valued case: they consist of piecewise constant functions with two regions. For definitions involving more involved matrix norms, in particular spectral norms, which are of interest in image processing, we produce families of examples showing that the resulting set of extremal points is larger and includes piecewise constant functions with more than two regions. We also consider the total deformation induced by the symmetrized gradient, whose minimization under linear constraints appears in the determination of limit loads in a number of continuum-mechanical models involving plasticity, lending relevance to the corresponding extremal points. For this case, we show that piecewise infinitesimally rigid functions with two pieces are extremal under mild assumptions. Finally, as an example of an extremal that is not piecewise constant, we prove that unit radial vector fields are extremal for the Frobenius total variation in the plane.
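For orientation, the seminorms in question can be written in the standard dual (distributional) form below. This is a hedged sketch of the usual definition, not taken from the talk; the choice of matrix norm on the test fields (a componentwise sum, Frobenius, or spectral norm) is what distinguishes the variants the abstract compares.

```latex
% Distributional definition of the vectorial total variation of
% u\colon \Omega \subset \mathbb{R}^d \to \mathbb{R}^m (standard form; the
% pointwise dual matrix norm |\cdot|_* on the test fields selects the variant):
\mathrm{TV}(u) \;=\; \sup\left\{ \int_\Omega u \cdot \operatorname{div}\varphi \,\mathrm{d}x
  \;:\; \varphi \in C_c^1(\Omega;\mathbb{R}^{m\times d}),\ \sup_{x\in\Omega} |\varphi(x)|_* \le 1 \right\}.
```

In the scalar case (m = 1), the extremal points of the unit ball, taken modulo constants, are classically the normalized indicator functions \(\chi_E / P(E)\) of simple sets E of finite perimeter; this is the two-region characterization the abstract generalizes.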

Talk 2: Exact Sparse Representation Recovery for Convex Optimization Problems
Speaker: Marcello Carioni
Abstract: We investigate the recovery of the sparse representation of data in general infinite-dimensional optimization problems regularized by convex functionals. We show that it is possible to define a suitable non-degeneracy condition on the minimal-norm dual certificate, extending the well-established non-degeneracy source condition (NDSC) associated with total variation regularized problems in the space of measures, as introduced in (Duval and Peyré, FoCM, 15:1315-1355, 2015). In our general setting, we study how the dual certificate acts, through the duality pairing, on the set of extreme points of the unit ball of the regularizer, viewed as a metric space. This justifies the name Metric Non-Degenerate Source Condition (MNDSC). More precisely, we impose a second-order condition on the dual certificate, evaluated on curves with values in small neighbourhoods of a given collection of n extreme points. Assuming the validity of the MNDSC, together with the linear independence of the measurements on these extreme points, we establish that, for a suitable choice of regularization parameters and noise levels, the regularized problem has a unique minimizer, which is uniquely represented as a linear combination of n extreme points. The paper concludes by obtaining explicit formulations of the MNDSC for three problems of interest. First, we examine total variation regularized deconvolution problems, showing that the classical NDSC implies our MNDSC, and recovering a result similar to (Duval and Peyré, FoCM, 15:1315-1355, 2015). Then, we consider one-dimensional BV functions regularized with their BV-seminorm and pairs of measures regularized with their mutual 1-Wasserstein distance. In each case, we provide explicit versions of the MNDSC and formulate specific sparse representation recovery results.
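For context, the classical non-degeneracy source condition that the MNDSC extends can be sketched as follows. This is a paraphrase of the standard Duval and Peyré formulation for sparse measures on the line, not a statement from the talk:

```latex
% NDSC for TV-regularized measures with true support \{x_1,\dots,x_n\}:
% the minimal-norm dual certificate \eta_0 = \Phi^* p_0 saturates only at
% the support points and is second-order non-degenerate there:
|\eta_0(x)| < 1 \quad \text{for } x \notin \{x_1,\dots,x_n\}, \qquad
|\eta_0(x_i)| = 1, \quad \eta_0''(x_i) \neq 0, \quad i = 1,\dots,n.
```

In the metric setting described above, the pointwise second-order condition on \(\eta_0\) is replaced by a second-order condition evaluated along curves through neighbourhoods of the chosen extreme points.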

Talk 3: Extensions of Optimization Problems and Representer Theorems
Speaker: Thibaut Horel
Abstract: We conduct a general investigation of extensions of convex optimization problems in infinite-dimensional spaces, including, for example, regularized empirical risk minimization and signal reconstruction problems. It turns out that many such problems (such as those minimizing L^1-type norms) do not admit minimizers over their primal space, but do admit a minimizer over a minimally extended space (for L^1-type spaces, the extension could be a space of Radon measures). With this observation in hand, we provide a systematic treatment of extensions of optimization problems in the sense of Ioffe and Tikhomirov (Tr. Mosk. Mat. Obs., 1968). In particular, we show how to extend, in a principled manner, a convex optimization problem to its bidual space in a way that preserves the optimal value and such that the extended problem admits a minimizer. The objective function of the extended problem is derived by taking the biconjugate of the original function. Under mild regularity conditions, biconjugation commutes with addition and linear operators, allowing the extended problem to retain the structure of the original. This allows us to extend the scope of recently proposed abstract representer theorems to problems that did not admit a minimizer in their primal space by considering their bidual extension. As a byproduct, the interplay between different extensions provides a fresh perspective on previously studied sparse representation recovery problems. (Joint work with Rahul Parhi)
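Schematically, and under regularity conditions of the kind the abstract mentions, the bidual extension replaces an infimum that may not be attained with an attained minimum of the biconjugate objective. The following is a sketch of this pattern, not the talk's precise statement:

```latex
% Bidual extension: X a normed space, f\colon X \to \mathbb{R}\cup\{+\infty\}
% proper and convex, f^* its conjugate on X^*, and f^{**} = (f^*)^* the
% biconjugate on the bidual X^{**}. Under suitable regularity, the optimal
% value is preserved and the extended problem attains its minimum:
\inf_{x \in X} f(x) \;=\; \min_{x^{**} \in X^{**}} f^{**}(x^{**}).
```

The L^1 example mentioned above fits this pattern: minimizing sequences in an L^1-type space may fail to converge in that space, and a minimizer materializes only in a larger space such as a space of Radon measures.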

Speakers

Rahul Parhi

Daniel Walter

Marcello Carioni

Thibaut Horel

Wednesday July 23, 2025 4:15pm - 5:30pm PDT
Joseph Medicine Crow Center for International and Public Affairs (DMC) 158 3518 Trousdale Pkwy, 158, Los Angeles, CA 90089
