Wednesday July 23, 2025 10:30am - 11:45am PDT
Session: Advances in Nonlinear Optimization Methods and Applications
Chair: Wenqi Zhu
Cluster: Nonlinear Optimization

Talk 1: A simple and reliable adaptive trust-region method
Speaker: Fadi Hamad
Abstract: We present an adaptive trust-region method for unconstrained optimization that allows inexact solutions to the trust-region subproblems. Our method is a simple variant of the classical trust-region method of Sorensen (1982). It achieves the best possible convergence bound, up to an additive log factor, for finding an $\epsilon$-approximate stationary point: $O(\Delta_f L^{1/2} \epsilon^{-3/2}) + \tilde{O}(1)$ iterations, where $L$ is the Lipschitz constant of the Hessian, $\Delta_f$ is the optimality gap, and $\epsilon$ is the termination tolerance on the gradient norm. This improves over existing trust-region methods in its dependence on $L$. We compare our performance with the state-of-the-art trust-region (TRU) and cubic regularization (ARC) methods from the GALAHAD library on CUTEst benchmark problems with more than 100 variables. Our method uses fewer function, gradient, and Hessian evaluations than these methods; for instance, the shifted geometric mean of function evaluations improves over TRU and ARC by factors of $1.4$ and $2$, respectively. Compared to the conference version of this paper (Hamad and Hinder, 2022), our revised method includes practical enhancements and a refined subproblem termination criterion. These modifications dramatically improved performance, including an order-of-magnitude reduction in the shifted geometric mean of wall-clock times.
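For readers unfamiliar with the framework the abstract builds on, a classical trust-region iteration can be sketched as follows. This is a generic textbook variant (subproblem solved by shifting the Hessian, Moré–Sorensen style), not the speaker's adaptive method; all function and parameter names are illustrative.

```python
import numpy as np

def trust_region(f, grad, hess, x0, delta=1.0, eta=0.1, tol=1e-8, max_iter=200):
    """Classical trust-region method (textbook sketch, not the talk's variant).

    The subproblem  min_p g@p + 0.5*p@H@p  s.t. ||p|| <= delta  is solved by
    shifting H until the shifted Newton step fits inside the region."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g, H = grad(x), hess(x)
        if np.linalg.norm(g) < tol:
            break
        lam = max(0.0, -np.linalg.eigvalsh(H)[0]) + 1e-10  # make H + lam*I PD
        while True:
            p = np.linalg.solve(H + lam * np.eye(len(x)), -g)
            if np.linalg.norm(p) <= delta:
                break
            lam = 10.0 * lam + 1e-4          # grow the shift until the step fits
        pred = -(g @ p + 0.5 * p @ H @ p)             # model (predicted) decrease
        rho = (f(x) - f(x + p)) / max(pred, 1e-16)    # actual vs. predicted
        if rho > eta:
            x = x + p                                  # accept the step
        delta = 2.0 * delta if rho > 0.75 else (0.5 * delta if rho < 0.25 else delta)
    return x
```

The acceptance ratio `rho` and the radius update are the standard mechanism the abstract's adaptive scheme refines; on a convex quadratic the first accepted step is already the Newton step.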

Talk 2: An Infeasible Primal-Dual Interior-Point Method for Nonconvex Conic Optimization
Speaker: Chuwen Zhang
Abstract: We propose an infeasible primal-dual interior-point method (iPDIPM) for linearly constrained nonconvex conic optimization problems. Existing nonconvex interior-point methods are often purely "primal" and rely on a two-stage strategy that first finds a feasible start. Our approach utilizes regularized augmented systems that align closely with practical interior-point solvers such as IPOPT, KNITRO, and MadNLP. In iPDIPM, primal and dual residuals are carefully balanced by controlling the regularization and the perturbed primal-dual residuals. In spirit, this method extends "the surface of analytic centers" of Mizuno, Todd, and Ye (1995) for linear programming. Using a Lyapunov-type analysis for self-concordant barriers, we show that iPDIPM has an $O(\mu^{-3/2})$ worst-case complexity for finding $\mu$-approximate KKT points, which matches pure primal methods that require strictly feasible initial points. Extensions to inequality-constrained problems are also discussed.
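The central-path machinery underlying interior-point methods can be illustrated with a plain log-barrier scheme on a toy bound-constrained QP. This is a generic sketch of the barrier/central-path idea only, not iPDIPM; the problem form and all parameter names are made up for illustration.

```python
import numpy as np

def log_barrier_qp(Q, c, mu=1.0, shrink=0.2, tol=1e-8):
    """Minimize 0.5*x@Q@x + c@x subject to x >= 0 via a log barrier.

    Follows the central path by taking Newton steps on
    0.5*x@Q@x + c@x - mu*sum(log(x)) while shrinking mu toward zero."""
    x = np.ones(len(c))                         # strictly feasible start
    while mu > tol:
        for _ in range(50):                     # inner Newton loop
            g = Q @ x + c - mu / x              # barrier gradient
            if np.linalg.norm(g) < mu:
                break                           # close enough to the central point
            H = Q + np.diag(mu / x**2)          # barrier Hessian (positive definite)
            dx = np.linalg.solve(H, -g)
            # fraction-to-boundary rule keeps the iterate strictly positive
            neg = dx < 0
            t = 1.0 if not neg.any() else min(1.0, 0.95 * np.min(-x[neg] / dx[neg]))
            x = x + t * dx
        mu *= shrink
    return x
```

The fraction-to-boundary step rule and the gradual reduction of $\mu$ are the two ingredients that practical primal-dual solvers such as those named in the abstract elaborate on, with dual variables and regularization handled explicitly rather than implicitly as here.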

Talk 3: Quasi-Newton methods for solving general nonlinear equations
Speaker: Chengchang Liu
Abstract: This talk focuses on quasi-Newton methods for solving general nonlinear equations. Unlike existing theory for quasi-Newton methods, which mostly treats the case where the number of nonlinear equations $n$ equals the dimension of the variable $d$, we provide an analysis that allows them to differ. For $n \geq d$, we establish a fast local linear rate, which can be independent of the condition number. For $n \leq d$, we establish local superlinear rates, which match the results for strongly convex optimization. We further show how to strengthen these results by incorporating block-type quasi-Newton updates.
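As a baseline for the square case ($n = d$) that the talk generalizes from, Broyden's "good" method can be sketched as follows. This is the standard textbook scheme, not the talk's rectangular or block variants; names are illustrative.

```python
import numpy as np

def broyden(F, x0, max_iter=100, tol=1e-10):
    """Broyden's 'good' method for F(x) = 0 in the square case n = d.

    Maintains a Jacobian approximation B updated by a rank-one secant
    correction so that B_new @ s = y after each step."""
    x = np.asarray(x0, dtype=float)
    B = np.eye(len(x))                    # initial Jacobian approximation
    Fx = F(x)
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        s = np.linalg.solve(B, -Fx)       # quasi-Newton step
        x_new = x + s
        F_new = F(x_new)
        y = F_new - Fx
        B += np.outer(y - B @ s, s) / (s @ s)   # secant (rank-one) update
        x, Fx = x_new, F_new
    return x
```

For rectangular systems ($n \neq d$), the inverse in the step must be replaced by a least-squares or minimum-norm solve, which is precisely the regime the talk's analysis addresses.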

Speakers
Wenqi Zhu

Fadi Hamad

Chuwen Zhang
The University of Chicago

Chengchang Liu
Taper Hall (THH) 106 3501 Trousdale Pkwy, 106, Los Angeles, CA 90089
