Session: Numerical Methods
Chair: Masaru Ito

Talk 1: Proximal gradient-type method with generalized distances for nonconvex composite optimization
Speaker: Masaru Ito
Abstract: In this talk, we consider a composite optimization problem minimizing the sum of two functions f and g. Typical proximal gradient methods rely on the descent lemma and the convexity of g, so the choice of distance-like functions defining the proximal subproblems is constrained by the structure of both f and g. We propose a proximal gradient-type method for the case where f has a locally Lipschitz gradient and g is nonconvex, and we discuss conditions on the distance-like functions that allow a broader choice while ensuring convergence results.
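
The abstract stays at a high level; to make the template concrete, here is a minimal sketch of a proximal gradient iteration with a generalized (Bregman) distance. This is our own illustrative reconstruction, not the talk's method: the kernel h, the step size, and the l1-regularized quadratic instance are assumptions for the example, and `prox_g` is assumed to solve the kernel-dependent proximal subproblem.

```python
import numpy as np

def bregman_prox_grad(grad_f, prox_g, grad_h, x0, step, iters=200):
    """Proximal gradient iteration with a generalized (Bregman) distance.

    Each step solves  x+ = argmin_u g(u) + <grad_f(x), u> + D_h(u, x)/step,
    which the user-supplied `prox_g` is assumed to handle given the mirrored
    point grad_h(x) - step * grad_f(x).  A sketch, not the talk's algorithm.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        z = grad_h(x) - step * grad_f(x)   # step in the mirror (dual) space
        x = prox_g(z, step)                # map back through the prox of g
    return x

# Illustrative instance with the Euclidean kernel h(x) = ||x||^2 / 2
# (so grad_h is the identity), f a convex quadratic, and g = 0.1 * ||x||_1,
# whose prox is soft-thresholding; this recovers the classical method.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
grad_f = lambda x: A @ x - b
soft = lambda z, t: np.sign(z) * np.maximum(np.abs(z) - 0.1 * t, 0.0)
x_star = bregman_prox_grad(grad_f, soft, lambda x: x, np.zeros(2), step=0.2)
print(x_star)
```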

Talk 2: A fixed-point algorithm with matrix splitting for nonlinear second-order cone complementarity problems
Speaker: Shunsuke Hayashi
Abstract: The second-order cone complementarity problem (SOCCP) is a broad class of problems that contains the nonlinear complementarity problem (NCP) and the second-order cone programming problem (SOCP). Recently, Li et al. reformulated the linear SOCCP as a fixed-point problem by using matrix splitting and constructed an Anderson-accelerated preconditioned modulus approach. In this study, we extend their approach to nonlinear SOCCPs, combining a matrix splitting with a fixed-point algorithm, and we add Anderson acceleration to enhance the convergence performance. We establish convergence under appropriate assumptions and report numerical results evaluating the effectiveness of the algorithm.
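
As background on the modulus/fixed-point idea that the talk extends, here is a hedged sketch for the simplest member of the family: the linear complementarity problem, i.e. the second-order cone replaced by the nonnegative orthant. The identity weighting matrix, the trivial splitting, and the absence of Anderson acceleration are our own simplifications for illustration, not the speakers' algorithm.

```python
import numpy as np

def modulus_fixed_point(A, q, iters=500, tol=1e-12):
    """Modulus-based fixed-point iteration for the linear complementarity
    problem LCP(A, q): find z >= 0 with w = A z + q >= 0 and z'w = 0.

    With z = (|x| + x)/2 and w = (|x| - x)/2, the LCP is equivalent to the
    fixed-point equation  (I + A) x = (I - A)|x| - 2 q,  which we iterate.
    Anderson acceleration would wrap this map x -> T(x); it is omitted here.
    """
    n = len(q)
    x = np.zeros(n)
    I = np.eye(n)
    for _ in range(iters):
        x_new = np.linalg.solve(I + A, (I - A) @ np.abs(x) - 2.0 * q)
        done = np.linalg.norm(x_new - x) <= tol
        x = x_new
        if done:
            break
    z = np.maximum(x, 0.0)    # z = (|x| + x)/2 >= 0
    w = np.maximum(-x, 0.0)   # w = (|x| - x)/2 >= 0, with z'w = 0 by design
    return z, w

# Small illustrative instance (A symmetric positive definite).
A = np.array([[4.0, -1.0], [-1.0, 4.0]])
q = np.array([-1.0, 2.0])
z, w = modulus_fixed_point(A, q)
print("complementarity residual:", np.linalg.norm(A @ z + q - w))
```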

Talk 3: A Simple yet Highly Accurate Prediction-Correction Algorithm for Time-Varying Optimization
Speaker: Tomoya Kamijima
Abstract: Time-varying optimization problems arise in various applications such as robotics, signal processing, and electronics. We propose SHARP, a simple yet highly accurate prediction-correction algorithm for unconstrained time-varying problems. The prediction step is based on Lagrange interpolation of past solutions, allowing for low computational cost without requiring Hessian matrices or gradients. To enhance stability, especially in non-convex settings, an acceptance condition is introduced to reject excessively large updates. We provide theoretical guarantees for a small tracking error and demonstrate the superior performance of SHARP through numerical experiments.
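
The prediction and correction steps described in the abstract can be illustrated with a small tracking loop. A sketch under our own assumptions: the quadratic Lagrange-extrapolation stencil, the gradient-descent corrector, and the acceptance threshold `max_jump` are illustrative choices, not the actual SHARP parameters.

```python
import numpy as np

def prediction_correction(grad, t_grid, x0, step=0.5, corr_steps=2, max_jump=1.0):
    """Prediction-correction tracking in the spirit of the talk: predict the
    next minimizer by Lagrange (polynomial) extrapolation of past solutions,
    reject excessively large predicted jumps, then correct with a few
    gradient steps.  Assumes an equispaced time grid."""
    hist = [np.asarray(x0, dtype=float)]
    out = []
    for t in t_grid:
        if len(hist) >= 3:
            # Quadratic Lagrange extrapolation one step ahead:
            # p(t_{k+1}) = 3 x_k - 3 x_{k-1} + x_{k-2}.
            pred = 3 * hist[-1] - 3 * hist[-2] + hist[-3]
        else:
            pred = hist[-1]
        # Acceptance condition: reject an excessively large update.
        if np.linalg.norm(pred - hist[-1]) > max_jump:
            pred = hist[-1]
        x = pred
        for _ in range(corr_steps):       # correction: plain gradient steps
            x = x - step * grad(x, t)
        hist.append(x)
        out.append(x)
    return np.array(out)

# Track the moving minimizer of f(x, t) = 0.5 * ||x - r(t)||^2, whose exact
# solution is r(t) = (cos t, sin t).
r = lambda t: np.array([np.cos(t), np.sin(t)])
grad = lambda x, t: x - r(t)
ts = np.linspace(0.0, 6.0, 61)
xs = prediction_correction(grad, ts, x0=r(0.0))
print("final tracking error:", np.linalg.norm(xs[-1] - r(ts[-1])))
```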

Speakers

Masaru Ito

Associate Professor, Nihon University

Shunsuke Hayashi

Tomoya Kamijima

Doctoral student, The University of Tokyo
I completed a master's degree at The University of Tokyo in 2025 and am now pursuing a Ph.D. I am interested in continuous optimization, especially time-varying optimization, online optimization, and the ODE approach.
Thursday July 24, 2025 4:15pm - 5:30pm PDT
Joseph Medicine Crow Center for International and Public Affairs (DMC), Room 258, 3518 Trousdale Pkwy, Los Angeles, CA 90089
