Session: Optimization on Manifolds
Chair: Maurício Silva Louzeiro
Cluster: Optimization on Manifolds
Talk 1: Variational Problems and Duality on Manifolds
Speaker: Anton Schiela
Abstract: Variational problems play a fundamental role in many areas of applied mathematics, and form the basis of both theoretical results and numerical algorithms. Problems of this class also arise in the context of manifolds and have important applications there. However, to formulate them in an adequate way, refined concepts of differential geometry, in particular a well-developed duality theory, are required. In this talk we will give an introduction to these concepts and elaborate on how they can be applied to variational problems, using the example of harmonic mappings. We will also describe some implications of these concepts for the design of numerical solution algorithms.
Talk 2: Newton's method for nonlinear mappings into vector bundles
Speaker: Laura Weigl
Abstract: We consider Newton's method for finding zeros of nonlinear mappings from a manifold $\mathcal X$ into a vector bundle $\mathcal E$. In this setting a connection on $\mathcal E$ is required to render the Newton equation well-defined, and a retraction on $\mathcal X$ is needed to compute a Newton update. As applications we will discuss the solution of variational problems involving mappings between manifolds and, in particular, the numerical computation of geodesics under force fields.
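The abstract above does not fix a concrete example, but the general scheme (connection to define the Newton equation, retraction to map the update back to the manifold) can be illustrated in the simplest setting: finding zeros of the Riemannian gradient of $f(x) = x^\top A x$ on the unit sphere, with the orthogonal-projection connection and the normalization retraction. This is a minimal sketch for intuition, not the speakers' method; the function names and the choice of problem are ours.

```python
import numpy as np

def sphere_newton(A, x0, iters=50, tol=1e-12):
    """Riemannian Newton sketch: find zeros of grad f for f(x) = x^T A x
    on the unit sphere S^{n-1}. Uses the orthogonal-projection connection
    (Hessian below) and the normalization retraction. Illustrative only."""
    x = x0 / np.linalg.norm(x0)
    n = len(x)
    for _ in range(iters):
        P = np.eye(n) - np.outer(x, x)        # projector onto T_x S^{n-1}
        g = P @ (2 * A @ x)                   # Riemannian gradient
        if np.linalg.norm(g) < tol:
            break
        lam = 2 * (x @ A @ x)
        # Riemannian Hessian on the tangent space; the rank-one term x x^T
        # pins the normal direction so the linear system is invertible.
        H = P @ (2 * A) @ P - lam * P + np.outer(x, x)
        v = np.linalg.solve(H, -g)            # Newton step, v in T_x S^{n-1}
        x = (x + v) / np.linalg.norm(x + v)   # retraction: renormalize
    return x

rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2                             # symmetric test matrix
x = sphere_newton(A, rng.standard_normal(5))
# Critical points of f on the sphere are eigenvectors of A.
```

Here the vector bundle is simply the tangent bundle of the sphere; mappings into general bundles, as in the talk, require the extra structure the abstract describes.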
Talk 3: An Adaptive Cubic Regularization quasi-Newton Method on Riemannian Manifolds
Speaker: Maurício Silva Louzeiro
Abstract: A quasi-Newton method with cubic regularization is designed for solving Riemannian unconstrained nonconvex optimization problems. The proposed algorithm is fully adaptive, requiring at most $\mathcal O(\epsilon_g^{-3/2})$ iterations to achieve a gradient norm smaller than a given tolerance $\epsilon_g$, and at most $\mathcal O(\max\{ \epsilon_g^{-3/2}, \epsilon_H^{-3} \})$ iterations to reach a second-order stationary point with Hessian tolerance $\epsilon_H$. Notably, the proposed algorithm remains applicable even when the gradient and Hessian of the objective function are unavailable. Numerical experiments, in which the gradient and Hessian are approximated by forward finite differences, illustrate the theoretical results and provide numerical comparisons.
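The derivative-free variant mentioned at the end relies on forward finite differences. As a hedged sketch of that ingredient alone (not the speaker's algorithm), one can approximate the Euclidean gradient componentwise and then project onto the tangent space, here again on the unit sphere; the step size `h` and the test function are our choices.

```python
import numpy as np

def fd_riemannian_grad(f, x, h=1e-6):
    """Forward finite-difference Riemannian gradient on the unit sphere:
    approximate the Euclidean gradient entrywise with forward differences,
    then project onto T_x S^{n-1}. Illustrative sketch only."""
    n = len(x)
    eg = np.empty(n)
    fx = f(x)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        eg[i] = (f(x + e) - fx) / h          # forward difference, O(h) error
    return eg - (x @ eg) * x                 # project: (I - x x^T) eg

A = np.diag([3.0, 1.0, -2.0])
f = lambda x: x @ A @ x
x = np.array([1.0, 2.0, 2.0]) / 3.0          # a unit vector
g_fd = fd_riemannian_grad(f, x)
g_exact = 2 * A @ x - (x @ (2 * A @ x)) * x  # analytic Riemannian gradient
```

The Hessian can be approximated analogously by differencing such gradient approximations, at the cost of a larger truncation error, which is where the adaptivity of the proposed method matters.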