Session: Nonsmooth PDE Constrained Optimization: Algorithms, Analysis and Applications Part 1
Chair: Denis Ridzal
Cluster: PDE-constrained Optimization

Talk 1: Digital Twins and Optimization Under Uncertainty 
Speaker: Harbir Antil
Abstract: This talk begins by studying the role of risk measures, such as Conditional Value at Risk (CVaR), in identifying weaknesses in Structural Digital Twins. CVaR is shown to outperform the classical expectation (risk-neutral setting) for such problems. Nevertheless, this framework assumes knowledge of the underlying distribution. To overcome this requirement, we introduce the notion of Rockafellian relaxation, which can handle realistic distributional ambiguities. Both risk-neutral and risk-averse formulations are discussed. Applications to real-life digital twins of bridges, dams, and wind turbines are considered. Time permitting, both the static and dynamic problems arising in civil and mechanical engineering will be presented.
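For orientation (background notation, not taken from the talk): in the standard Rockafellar–Uryasev form, for a random cost X and confidence level \beta \in (0,1),

    CVaR_\beta(X) = \min_{t \in \mathbb{R}} \Big\{ t + \tfrac{1}{1-\beta}\, \mathbb{E}\big[(X - t)_+\big] \Big\},

where (\cdot)_+ = \max\{\cdot, 0\}. Unlike the risk-neutral expectation \mathbb{E}[X], CVaR_\beta averages only over the worst (1-\beta) fraction of outcomes, which is why it is better suited to exposing weaknesses than the plain expectation.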

Talk 2: Infinite-horizon optimal control of operator equations with random inputs
Speaker: Olena Melnikov
Abstract: We investigate infinite-horizon discounted optimal control problems governed by operator equations with random inputs. Our framework includes parameterized evolution equations, such as those arising from ordinary and partial differential equations. The objective function is risk-neutral, aiming to optimize the expected discounted cost over an infinite time horizon. We establish the existence of optimal solutions. Furthermore, we discuss the convergence of sample-based approximations, demonstrating their effectiveness in approximating the true problem.
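A minimal sketch of the problem class, with notation assumed for illustration (the abstract does not fix symbols):

    \min_{u} \; \mathbb{E}\Big[ \int_0^\infty e^{-r t}\, \ell\big(y(t;\xi), u(t)\big)\, dt \Big] \quad \text{s.t.} \quad e\big(y(\cdot;\xi), u, \xi\big) = 0,

where \xi is the random input, e(y,u,\xi) = 0 is the governing operator (e.g., evolution) equation, \ell is a running cost, and r > 0 is the discount rate. A sample-based approximation replaces the expectation by an average over draws \xi_1, \dots, \xi_N of the random input.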

Talk 3: Nonuniform derivative-based random weight initialization for neural network optimization
Speaker: Konstantin Pieper
Abstract: Neural networks can alleviate the curse of dimensionality by detecting subspaces in the input data corresponding to large output variability. In order to exploit this, the nonlinear input weights of the network have to align with these directions during network training. As a step toward identifying these patterns before nonlinear optimization-based neural network regression, we propose nonuniform data-driven parameter distributions for weight initialization. These parameter distributions are developed in the context of non-parametric regression models based on shallow neural networks and employ derivative data of the function to be approximated. We use recent results on the harmonic analysis and sparse representation of fully trained (optimal) neural networks to obtain densities that concentrate in appropriate regions of the input weight space. Then, we suggest simplifications of these exact densities based on approximate derivative data at the input points that allow for very efficient sampling. This leads to random feature models whose performance is close to that of optimal networks in several scenarios and compares favorably to conventional uniform random feature models.
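As an illustration only (a minimal sketch, not the construction proposed in the talk; the function names, sampling distribution, and ReLU feature choice below are assumptions), the basic idea can be prototyped in Python: sample input weight directions in proportion to the magnitude of approximate derivative data at the input points, then fit the outer linear layer as a random feature model.

import numpy as np

def derivative_based_init(X, grads, n_features, rng):
    """Sample input weights/biases for a shallow network from approximate
    derivative data at the input points (illustrative sketch only)."""
    norms = np.linalg.norm(grads, axis=1)
    probs = norms / norms.sum()                      # favor points with large derivatives
    idx = rng.choice(len(X), size=n_features, p=probs)
    W = grads[idx] / norms[idx, None]                # directions aligned with gradients
    W = rng.exponential(1.0, size=(n_features, 1)) * W   # random magnitudes (assumed law)
    b = -np.einsum("ij,ij->i", W, X[idx])            # center ridges at the sampled points
    return W, b

def fit_random_feature_model(X, y, W, b, reg=1e-8):
    """Least-squares fit of the linear outer layer for fixed random features."""
    Phi = np.maximum(X @ W.T + b, 0.0)               # ReLU features
    A = Phi.T @ Phi + reg * np.eye(Phi.shape[1])
    return np.linalg.solve(A, Phi.T @ y)

# Toy usage: the target varies mostly along one input direction.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(500, 5))
f = lambda x: np.tanh(3 * x[:, 0] - x[:, 1])
y = f(X)
eps = 1e-4
grads = np.stack([(f(X + eps * np.eye(5)[j]) - y) / eps for j in range(5)], axis=1)
W, b = derivative_based_init(X, grads, n_features=100, rng=rng)
c = fit_random_feature_model(X, y, W, b)
print("train RMSE:", np.sqrt(np.mean((np.maximum(X @ W.T + b, 0.0) @ c - y) ** 2)))

In this toy problem the sampled weights concentrate along the direction of large output variability, which is the alignment effect the abstract describes; the exact densities in the talk are more refined than this sketch.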

Speakers

Denis Ridzal

Harbir Antil

Olena Melnikov

Konstantin Pieper

Wednesday July 23, 2025 10:30am - 11:45am PDT
Joseph Medicine Crow Center for International and Public Affairs (DMC) 157 3518 Trousdale Pkwy, 157, Los Angeles, CA 90089
