Session: Manifolds, samples, and learning II
Chair: Ralf Zimmermann
Cluster: Optimization on Manifolds
Talk 1: A Riemannian Douglas-Rachford Algorithm for Low-Rank and Row-Sparse Matrix Recovery
Speaker: Lukas Klingbiel
Abstract: In this work, we consider a matrix recovery problem under low-rank and row-sparse constraints using a Riemannian Douglas-Rachford algorithm. We introduce a retraction-based Riemannian Douglas-Rachford method, building on the Riemannian Douglas-Rachford algorithm for symmetric Hadamard manifolds. We show local convergence of this method for nonexpansive reflections and manifolds with locally bounded sectional curvature. We give an explicit form of the algorithm on the fixed-rank manifold to solve the matrix recovery problem. In particular, numerical experiments suggest that the method requires the minimal number of measurements for the recovery of a low-rank and row-sparse matrix.
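As a point of reference for the splitting structure only (not the speaker's Riemannian, retraction-based algorithm), the sketch below runs a classical Euclidean Douglas-Rachford iteration for the toy feasibility problem of finding a matrix that is simultaneously low-rank and row-sparse; the projections, test data, and parameter names are our own illustrative assumptions.

import numpy as np

def proj_rank(X, r):
    # Metric projection onto matrices of rank at most r (truncated SVD).
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def proj_row_sparse(X, k):
    # Keep the k rows with largest Euclidean norm, zero out the rest.
    Y = np.zeros_like(X)
    idx = np.argsort(np.linalg.norm(X, axis=1))[-k:]
    Y[idx] = X[idx]
    return Y

def douglas_rachford(Z, r, k, iters=500):
    # Euclidean Douglas-Rachford for the feasibility problem
    # "rank(X) <= r" intersected with "at most k nonzero rows":
    #   Z <- Z + P_B(2 P_A(Z) - Z) - P_A(Z);  the candidate solution is P_A(Z).
    Z = Z.copy()
    for _ in range(iters):
        XA = proj_rank(Z, r)
        XB = proj_row_sparse(2 * XA - Z, k)
        Z = Z + XB - XA
    return proj_rank(Z, r)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Ground truth: a rank-2 matrix supported on 5 of 30 rows, observed with noise.
    B = np.zeros((30, 20))
    B[:5] = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 20))
    X = douglas_rachford(B + 0.01 * rng.standard_normal(B.shape), r=2, k=5)
    print("relative recovery error:", np.linalg.norm(X - B) / np.linalg.norm(B))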
Talk 2: Approximating maps into manifolds
Speaker: Simon Jacobsson
Abstract: Many interesting functions arising in applications map into Riemannian manifolds. The distance between two points on a manifold depends on the intrinsic geometry of that manifold. When approximating functions that map into manifolds, it is therefore natural to measure the error on the manifold rather than in some ambient space in which the manifold might be embedded. In particular, when the dimension of the ambient space is much larger than the dimension of the manifold, as for low-rank tensors, working in the ambient space becomes infeasible. In this presentation, we describe a scheme to approximate maps into manifolds by first pulling the problem back to the tangent space and then applying a scheme for approximating maps into vector spaces. Our main result is a theorem that bounds the approximation error on the manifold in terms of an error bound on the tangent space and a lower bound on the manifold's sectional curvature. Example applications to Krylov subspaces and low-rank approximation are discussed as well. This is joint work with Raf Vandebril (KU Leuven), Joeri Van der Veken (KU Leuven), and Nick Vannieuwenhoven (KU Leuven).
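As a small illustration of the pull-back idea (our own toy example on the unit sphere, where the exponential and logarithm maps have closed forms; the function names, base point, and curve are assumptions, not the talk's setting), one can lift samples of a manifold-valued function to a single tangent space, approximate there with an ordinary vector-space scheme, and map back, measuring the error intrinsically by geodesic distance.

import numpy as np

def sphere_exp(p, v):
    # Exponential map on the unit sphere at p, applied to a tangent vector v.
    nv = np.linalg.norm(v)
    if nv < 1e-15:
        return p
    return np.cos(nv) * p + np.sin(nv) * (v / nv)

def sphere_log(p, q):
    # Logarithm map on the unit sphere: the tangent vector at p pointing to q.
    w = q - np.dot(p, q) * p
    nw = np.linalg.norm(w)
    if nw < 1e-15:
        return np.zeros_like(p)
    return np.arccos(np.clip(np.dot(p, q), -1.0, 1.0)) * (w / nw)

def approx_into_sphere(f, ts, p, degree=5):
    # Pull samples of f back to the tangent space at p, fit a polynomial to
    # each tangent coordinate, and map the vector-space approximant back
    # through the exponential map.
    V = np.array([sphere_log(p, f(t)) for t in ts])
    coeffs = [np.polyfit(ts, V[:, i], degree) for i in range(V.shape[1])]
    return lambda t: sphere_exp(p, np.array([np.polyval(c, t) for c in coeffs]))

if __name__ == "__main__":
    def f(t):
        # A smooth curve on the sphere S^2.
        x = np.array([np.cos(t), np.sin(t), 0.3 * np.sin(2.0 * t)])
        return x / np.linalg.norm(x)

    ts = np.linspace(0.0, 1.0, 12)
    fhat = approx_into_sphere(f, ts, p=f(0.5))
    # Error measured intrinsically, i.e., by geodesic distance on the sphere.
    errs = [np.linalg.norm(sphere_log(f(t), fhat(t))) for t in np.linspace(0.0, 1.0, 50)]
    print("max geodesic error:", max(errs))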
Talk 3: Second-order differential operators, stochastic differential equations and Brownian motions on embedded manifolds
Speaker: Du Nguyen
Abstract: We provide a framework to simulate Riemannian Brownian motions and Riemannian Langevin equations on embedded manifolds in global coordinates. We specify the conditions under which a manifold M embedded in an inner product space E is an invariant manifold of a stochastic differential equation (SDE) on E, linking this with the notion of second-order differential operators on M. When M is given a Riemannian metric, we derive a simple formula for the Laplace-Beltrami operator in terms of the gradient and Hessian on E and construct Riemannian Brownian motions on M as solutions of conservative Stratonovich and Ito SDEs on E. We derive explicitly the SDEs for Brownian motions on several manifolds important in applications, including left-invariant matrix Lie groups in embedded coordinates and the Stiefel, Grassmann, and symmetric positive definite (SPD) manifolds. Numerically, we propose three simulation schemes to solve SDEs on manifolds. In addition to the stochastic projection method, to simulate Riemannian Brownian motions we construct a second-order tangent retraction of the Levi-Civita connection using a given E-tubular retraction. We also propose the retractive Euler-Maruyama method to solve an SDE, taking into account the second-order term of a tangent retraction. We verify numerically that, on several Riemannian manifolds, our approach can sample a given distribution in global coordinates as a long-term limit of Riemannian Brownian or Riemannian Langevin equations. This is joint work with Stefan Sommer (Technical University of Denmark).
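To make the projection idea concrete (a minimal sketch on the unit sphere under our own naming, not the speakers' retractive Euler-Maruyama scheme or its second-order retraction term), one can project a Euclidean Gaussian increment onto the tangent space, step in the ambient space, and retract back onto the manifold by normalization; long runs approach the uniform distribution on the sphere.

import numpy as np

def brownian_on_sphere(x0, T=1.0, n_steps=1000, rng=None):
    # Projection-type Euler scheme for Brownian motion on the unit sphere:
    # project a Euclidean Gaussian increment onto the tangent space at x,
    # take the step in the ambient space, then retract back onto the sphere
    # by normalization (the metric projection retraction).
    rng = np.random.default_rng() if rng is None else rng
    dt = T / n_steps
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        dW = np.sqrt(dt) * rng.standard_normal(x.shape)
        v = dW - np.dot(x, dW) * x      # tangential part of the increment
        x = x + v
        x = x / np.linalg.norm(x)       # retract onto the sphere
    return x

if __name__ == "__main__":
    # Long-time samples should be close to the uniform distribution on S^2,
    # so the empirical mean of the endpoints should be near the origin.
    rng = np.random.default_rng(1)
    ends = np.array([brownian_on_sphere([1.0, 0.0, 0.0], T=20.0, rng=rng)
                     for _ in range(200)])
    print("mean endpoint:", np.round(ends.mean(axis=0), 3))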