Session: Matrix optimization and geometry of curved spaces
Chair: Florentin Goyens
Cluster: Optimization on Manifolds
Talk 1: Wrapped Gaussian distributions on SPD matrices and their estimation
Speaker: Florian Yger
Abstract: A common assumption for Euclidean data is that the underlying distribution is Gaussian, as the Gaussian distribution is both well studied and amenable to efficient parameter estimation from samples. In this work, we extend Gaussian distributions to the Riemannian manifold of Symmetric Positive Definite (SPD) matrices, with a focus on generalizing non-isotropic Gaussian distributions for complex statistical modeling in this space. We propose a wrapped Gaussian model, constructed by mapping a Euclidean Gaussian defined in a tangent space onto the SPD manifold via the exponential map. After defining this wrapped Gaussian distribution, we address the non-identifiability of the model by establishing an equivalence relation among parameter sets that yield the same distribution. We then show that the parameters of a wrapped Gaussian can be estimated from sample data with a maximum likelihood estimator optimized on a product manifold. Additionally, we reinterpret existing classifiers on the SPD manifold through a probabilistic framework and introduce new probabilistic classifiers based on wrapped Gaussian models. Finally, we present experimental results, on both synthetic and real data, to evaluate the parameter estimation accuracy and classification performance of our wrapped classifiers.
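The wrapping construction described above can be sketched numerically. The following is a minimal illustration, not the authors' implementation: it assumes the affine-invariant metric on SPD matrices, whose exponential map at M is Exp_M(V) = M^{1/2} expm(M^{-1/2} V M^{-1/2}) M^{1/2}, and parametrizes symmetric tangent matrices by their upper-triangular entries; the function name and parameter choices are hypothetical.

```python
import numpy as np
from scipy.linalg import expm, sqrtm

def sample_wrapped_gaussian(base, cov, rng, n_samples=1):
    """Draw samples from a wrapped Gaussian on the SPD manifold.

    Tangent vectors are drawn from a Euclidean Gaussian N(0, cov) in
    the tangent space at `base` (symmetric matrices, parametrized by
    their d(d+1)/2 upper-triangular entries), then pushed onto the
    manifold by the affine-invariant exponential map
        Exp_M(V) = M^{1/2} expm(M^{-1/2} V M^{-1/2}) M^{1/2}.
    """
    d = base.shape[0]
    iu = np.triu_indices(d)               # upper-triangular indices
    m_sqrt = np.real(sqrtm(base))
    m_isqrt = np.linalg.inv(m_sqrt)
    out = []
    for _ in range(n_samples):
        v_flat = rng.multivariate_normal(np.zeros(len(iu[0])), cov)
        V = np.zeros((d, d))
        V[iu] = v_flat
        V = V + V.T - np.diag(np.diag(V))  # symmetrize
        out.append(m_sqrt @ expm(m_isqrt @ V @ m_isqrt) @ m_sqrt)
    return out

# Example: samples around the identity remain symmetric positive definite.
rng = np.random.default_rng(0)
samples = sample_wrapped_gaussian(np.eye(3), 0.1 * np.eye(6), rng, n_samples=5)
```

Because the exponential map lands on the manifold by construction, every sample is an SPD matrix regardless of the tangent covariance.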
Talk 2: The ultimate upper bound on the injectivity radius of the Stiefel manifold
Speaker: Simon Mataigne
Abstract: We exhibit conjugate points on the Stiefel manifold endowed with any member of the family of Riemannian metrics introduced by Hüper et al. (2021). This family contains the well-known canonical and Euclidean metrics. For each metric in this family, an upper bound on the injectivity radius of the Stiefel manifold is then obtained as the minimum of two quantities: the length of the geodesic along which the points are conjugate and the length of certain geodesic loops. Numerical experiments support the conjecture that the obtained upper bound is in fact equal to the injectivity radius. Authors: Pierre-Antoine Absil, Simon Mataigne
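For context, geodesics under the canonical metric (one member of the family considered here) admit a closed form due to Edelman, Arias, and Smith (1998), which is what makes geodesic lengths on the Stiefel manifold computable in practice. A minimal sketch, with a hypothetical function name, not taken from the talk:

```python
import numpy as np
from scipy.linalg import expm

def stiefel_canonical_geodesic(X, D, t):
    """Point at time t on the canonical-metric geodesic through X
    with initial velocity D (a tangent vector, i.e. X.T @ D is skew).

    Uses the closed form of Edelman, Arias & Smith (1998):
        X(t) = [X, Q] expm(t [[A, -R.T], [R, 0]]) [[I], [0]],
    where A = X.T @ D and Q R = (I - X X.T) D is a thin QR factorization.
    """
    p = X.shape[1]
    A = X.T @ D                           # skew-symmetric component
    Q, R = np.linalg.qr(D - X @ A)        # normal component
    block = np.block([[A, -R.T], [R, np.zeros((p, p))]])
    E = expm(t * block)
    return X @ E[:p, :p] + Q @ E[p:, :p]

# Example: project a random matrix onto the tangent space at X,
# then check that the geodesic stays on the Stiefel manifold.
rng = np.random.default_rng(1)
X, _ = np.linalg.qr(rng.standard_normal((6, 2)))
W = rng.standard_normal((6, 2))
M = X.T @ W
D = W - X @ ((M + M.T) / 2)               # tangent projection
Y = stiefel_canonical_geodesic(X, D, 0.7)
```

The orthogonality constraint Y.T @ Y = I is preserved along the curve, so lengths of geodesic segments and loops, the quantities entering the injectivity-radius bound, can be evaluated directly.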
Talk 3: Deterministic and Randomized Direct Search on Riemannian Manifolds with Complexity Guarantees
Speaker: Florentin Goyens
Abstract: In this work, we investigate the problem of minimizing a nonconvex objective function defined on a Riemannian manifold, where derivative information is unavailable or impractical to compute. To address this, we consider the direct search methodology, a class of derivative-free optimization algorithms, in the Riemannian setting. We present a Riemannian adaptation of the deterministic direct search method and analyze its performance, deriving a global complexity bound on the number of function evaluations required to reach an approximate critical point. Building on this, we introduce a randomized variant of the Riemannian direct search algorithm, which operates by generating search directions in a random lower-dimensional subspace of the tangent space. Finally, we present numerical experiments to illustrate the effectiveness and computational advantages of both the deterministic and randomized Riemannian direct search methods. Co-authors: Bastien Cavarretta, Clément Royer, and Florian Yger.
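The deterministic scheme can be illustrated on a toy manifold. The sketch below is an illustration, not the authors' algorithm: it polls a positive spanning set of the tangent space of the unit sphere (the projected signed coordinate axes), accepts a step only under a sufficient-decrease test, and halves the step size after an unsuccessful poll; the function names and the forcing constant 1e-4 are choices made here.

```python
import numpy as np

def retract_sphere(x, v):
    """Retraction on the unit sphere: step in the tangent space, renormalize."""
    y = x + v
    return y / np.linalg.norm(y)

def direct_search_sphere(f, x0, step=1.0, tol=1e-6, max_iter=2000):
    """Sketch of deterministic direct search on the unit sphere.

    At each iteration, polls a positive spanning set of the tangent
    space at x (projections of the +/- coordinate axes); on a successful
    poll the iterate moves, otherwise the step size is halved.
    """
    x = x0 / np.linalg.norm(x0)
    d = x.size
    for _ in range(max_iter):
        if step <= tol:
            break
        P = np.eye(d) - np.outer(x, x)        # projector onto T_x
        improved = False
        for row in np.vstack([P, -P]):
            n = np.linalg.norm(row)
            if n < 1e-12:
                continue                       # degenerate direction
            y = retract_sphere(x, step * row / n)
            if f(y) < f(x) - 1e-4 * step**2:   # sufficient decrease
                x, improved = y, True
                break
        if not improved:
            step /= 2                          # unsuccessful poll
    return x

# Example: minimizing the Rayleigh quotient x^T A x over the sphere
# approximates the eigenvector of the smallest eigenvalue of A.
A = np.diag([3.0, 2.0, 1.0])
x = direct_search_sphere(lambda v: v @ A @ v, np.array([1.0, 1.0, 1.0]))
```

No gradients of f are evaluated anywhere; progress relies only on function comparisons, which is what the complexity analysis counts.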