Session: Bilevel Optimization for Inverse Problems Part 2
Chair: Juan Carlos de los Reyes
Cluster: PDE-constrained Optimization

Talk 1: Linesearch-enhanced inexact forward–backward methods for bilevel optimization
Speaker: Marco Prato
Abstract: Bilevel optimization problems arise in various real-world applications and are often characterized by the impossibility of having the exact objective function and its gradient available. Developing mathematically sound optimization methods that effectively handle inexact information is crucial for ensuring reliable and efficient solutions. In this talk we propose a line-search-based algorithm for solving a bilevel optimization problem, in which the approximate gradient and function evaluations obey an adaptive tolerance rule. Our method is based on implicit differentiation under standard assumptions, and its main novelty with respect to similar approaches is a well-posed, inexact line-search procedure that uses only approximate function values together with adaptive accuracy control. This work is partially supported by the PRIN project 20225STXSB, under the National Recovery and Resilience Plan (NRRP) funded by the European Union - NextGenerationEU.
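The abstract's core idea, a hypergradient from implicit differentiation combined with an Armijo-type line search on approximate function values relaxed by an adaptive tolerance, can be sketched on a toy problem. Everything below (the ridge-regression lower level, the tolerance `eps`, the accuracy schedule `n_iter`) is an illustrative assumption, not the speaker's implementation:

```python
import numpy as np

# Toy bilevel problem (hypothetical, for illustration only):
#   lower level: x*(lam) = argmin_x 0.5*||A x - b||^2 + 0.5*lam*||x||^2
#   upper level: F(lam)  = 0.5*||x*(lam) - x_ref||^2
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 5))
b = rng.standard_normal(8)
x_ref = np.zeros(5)

def lower_solve(lam, n_iter):
    """Inexact lower-level solve by gradient descent (accuracy grows with n_iter)."""
    H = A.T @ A + lam * np.eye(5)
    step = 1.0 / np.linalg.norm(H, 2)
    g_lin = A.T @ b
    x = np.zeros(5)
    for _ in range(n_iter):
        x -= step * (H @ x - g_lin)   # gradient of the lower-level objective
    return x

def upper_val_grad(lam, n_iter):
    """Approximate F(lam) and its hypergradient via implicit differentiation:
       dx*/dlam = -(A^T A + lam I)^{-1} x*,  dF/dlam = (x* - x_ref)^T dx*/dlam."""
    x = lower_solve(lam, n_iter)
    H = A.T @ A + lam * np.eye(5)
    dx = -np.linalg.solve(H, x)
    return 0.5 * np.sum((x - x_ref) ** 2), (x - x_ref) @ dx

lam, n_iter, eps = 0.1, 5, 1e-3    # eps: tolerance allowed in the Armijo test
for _ in range(20):
    F, g = upper_val_grad(lam, n_iter)
    t = 1.0
    # Armijo backtracking on *approximate* values, relaxed by the tolerance eps;
    # the relaxation keeps the test well posed even though F is inexact.
    while True:
        lam_new = max(lam - t * g, 1e-8)          # keep regularization positive
        F_new, _ = upper_val_grad(lam_new, n_iter)
        if F_new <= F - 1e-4 * t * g * g + eps:
            break
        t *= 0.5
    lam = lam_new
    n_iter += 2    # adaptively tighten the lower-level accuracy
F_final, _ = upper_val_grad(lam, 50)
```

The relaxed sufficient-decrease test terminates for any `eps > 0`, which is one simple way to make a line search on inexact values well defined; the actual adaptive tolerance rule of the talk is of course more refined.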

Talk 2: Tensor train solution to uncertain optimization problems with shared sparsity penalty
Speaker: Akwum Onwunta
Abstract: We develop first- and second-order numerical optimization methods to solve non-smooth optimization problems featuring a shared sparsity penalty, constrained by differential equations with uncertainty. To alleviate the curse of dimensionality, we use tensor product approximations. To handle the non-smoothness of the objective function, we introduce a smoothed version of the shared sparsity objective. We consider both a benchmark elliptic PDE constraint and a more realistic topology optimization problem. We demonstrate that the error converges linearly in the number of iterations and in the smoothing parameter, and faster than algebraically in the number of degrees of freedom, which consists of the number of quadrature points in one variable and the tensor ranks. Moreover, in the topology optimization problem, the smoothed shared sparsity penalty reduces the tensor ranks compared to the unpenalized solution. This enables us to find a sparse high-resolution design under high-dimensional uncertainty.
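A shared sparsity penalty couples all random realizations of the control at each spatial point, and the smoothing replaces the non-smooth square root with a differentiable surrogate. The sketch below illustrates that idea on a plain matrix discretization; the penalty form, the uniform weights, and the function names are assumptions for illustration, not the authors' code:

```python
import numpy as np

# Hypothetical discretization: U[i, j] ~ u(x_i, omega_j).
# A shared-sparsity penalty of the form
#   P(u) = sum_i sqrt( mean_j U[i, j]^2 )
# acts like a group-L1 norm over the random variable at each spatial
# point x_i, so entire rows (all realizations at x_i) vanish together.
# Smoothing replaces sqrt(.) by sqrt(. + eps^2), which is differentiable.

def smoothed_shared_sparsity(U, eps):
    row_ms = np.mean(U ** 2, axis=1)           # E_omega[u^2] at each x_i
    return np.sum(np.sqrt(row_ms + eps ** 2))

def smoothed_shared_sparsity_grad(U, eps):
    row_ms = np.mean(U ** 2, axis=1)
    scale = 1.0 / np.sqrt(row_ms + eps ** 2)   # well defined for eps > 0
    return (U / U.shape[1]) * scale[:, None]

rng = np.random.default_rng(1)
U = rng.standard_normal((4, 6))
eps = 1e-2
P = smoothed_shared_sparsity(U, eps)

# Finite-difference check of one gradient entry.
h = 1e-6
V = U.copy()
V[2, 3] += h
fd = (smoothed_shared_sparsity(V, eps) - P) / h
g = smoothed_shared_sparsity_grad(U, eps)[2, 3]
```

The finite-difference check confirms the smoothed penalty is differentiable everywhere, which is what allows first- and second-order methods to be applied before driving the smoothing parameter `eps` toward zero.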

Talk 3: TBD
Speaker: Juan Carlos de los Reyes
Abstract: TBD

Speakers

Marco Prato

Associate Professor, Università di Modena e Reggio Emilia, Italy
Name: Dr. Marco Prato
Title: Associate Professor in Numerical Analysis
Affiliation: Department of Physics, Informatics and Mathematics, University of Modena and Reggio Emilia, Italy
Bio: Dr. Marco Prato was born in Varazze, on the Italian Riviera, in 1980. He received the MSc Degree and...

Juan Carlos de los Reyes

Name: Dr. Slothington "Slow Convergence" McNapface
Title: Distinguished Professor of Continuous Optimization & Energy Minimization
Affiliation: The Lush Canopy Institute of Sluggish Algorithms
Bio: Dr. Slothington McNapface is a leading expert in continuous optimization, specializing...
Tuesday July 22, 2025 10:30am - 11:45am PDT
Joseph Medicine Crow Center for International and Public Affairs (DMC) 157 3518 Trousdale Pkwy, 157, Los Angeles, CA 90089
