Session: Recent progress in derivative-free optimization I
Chair: Raghu Pasupathy
Cluster: Derivative-free Optimization
Talk 1: Derivative-free Frank-Wolfe recursion for optimization over probability spaces
Speaker: Raghu Pasupathy
Abstract: It has recently been observed that a number of important problem classes, e.g., statistical experimental design, continuous linear programming, and distributionally robust optimization (DRO), are usefully seen as constrained optimization problems over the space of probability measures. Various aspects, including the fact that the space of probability measures is not a linear space, make a primal recursion such as Frank-Wolfe (suitably adapted to operate on probability spaces) a natural choice for solution. In this talk, we present a framework that combines interior point ideas with a Frank-Wolfe recursion to minimize linearly constrained convex functionals over probability spaces. We emphasize a derivative-free version of this framework that estimates the von Mises derivative through an analogue of finite differencing in the Euclidean context. We will illustrate the framework through several examples and numerical experiments.
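For intuition only, the following is a minimal sketch of a derivative-free Frank-Wolfe iteration over the probability simplex (finitely supported measures), with the von Mises derivative estimated by finite differencing toward Dirac measures. The restriction to the simplex, the step-size rule, and all names are illustrative assumptions; the speakers' framework additionally handles general linear constraints via interior-point ideas, which are omitted here.

```python
import numpy as np

def df_frank_wolfe(F, n_atoms, n_iters=100, h=1e-3):
    """Sketch: derivative-free Frank-Wolfe over the probability simplex.
    F maps a probability vector to a real value.  Illustrative only."""
    p = np.full(n_atoms, 1.0 / n_atoms)          # start at the uniform measure
    for k in range(n_iters):
        Fp = F(p)
        # Finite-difference estimate of the von Mises (influence-function)
        # derivative at each atom: perturb the measure toward the Dirac e_i.
        g = np.empty(n_atoms)
        for i in range(n_atoms):
            e = np.zeros(n_atoms); e[i] = 1.0
            g[i] = (F((1.0 - h) * p + h * e) - Fp) / h
        # Frank-Wolfe subproblem over the simplex: the linear minimizer is
        # the vertex (Dirac measure) with the smallest estimated derivative.
        q = np.zeros(n_atoms); q[np.argmin(g)] = 1.0
        gamma = 2.0 / (k + 2.0)                  # classical FW step size
        p = (1.0 - gamma) * p + gamma * q        # mixture update stays feasible
    return p
```

For example, F = lambda p: p @ A @ p with a positive semidefinite A gives a convex quadratic functional on which this recursion can be tested.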
Talk 2: Exploring complexity bounds of model based trust region derivative free methods
Speaker: Katya Scheinberg
Abstract: Model-based trust-region derivative-free methods, pioneered by Powell, rely on interpolation models to approximate the objective function in a trust region. The quality of this approximation dictates algorithmic progress and is, in turn, dictated by the geometry of the sample set. The methods are designed to trade off carefully between "exploration" and "exploitation", i.e., between seeking progress and improving sample-set geometry. While these methods have been very successful in practice and have been shown to converge, their complexity analysis has been incomplete, especially in terms of dependence on the dimension. We will present an improved complexity analysis for different variants of these methods.
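As a hedged illustration of the mechanics the abstract refers to, here is one iteration of a simplified model-based trust-region step. The random sampling, the linear (rather than quadratic interpolation) model, and the acceptance thresholds are simplifying assumptions; in particular, the geometry-improvement ("exploration") step on which the complexity analysis hinges is omitted.

```python
import numpy as np

def tr_dfo_iteration(f, x, delta, rng=np.random.default_rng()):
    """One iteration of a simplified model-based trust-region DFO method:
    fit a linear model to sampled points, step to its minimizer on the
    trust region, then accept or shrink via the actual/predicted ratio."""
    n = x.size
    fx = f(x)
    # Sample 2n interpolation points inside the trust region.  In Powell-style
    # methods these points are reused and their geometry carefully maintained.
    Y = x + delta * rng.uniform(-1.0, 1.0, size=(2 * n, n))
    fY = np.array([f(y) for y in Y])
    # Least-squares linear model  m(x + s) = fx + g @ s.
    g, *_ = np.linalg.lstsq(Y - x, fY - fx, rcond=None)
    s = -delta * g / (np.linalg.norm(g) + 1e-12)   # model minimizer on the ball
    predicted = -g @ s                             # model decrease (>= 0)
    rho = (fx - f(x + s)) / max(predicted, 1e-12)
    if rho >= 0.1:                                 # sufficient decrease: accept
        return x + s, 2.0 * delta                  # and expand the radius
    return x, 0.5 * delta                          # reject: shrink the radius
```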
Talk 3: Revisiting the convergence analysis of derivative-free trust region and direct search
Speaker: Cunxin Huang
Abstract: Derivative-free trust region and direct search are two popular classes of derivative-free optimization methods. In this talk, we propose a unified new perspective for the convergence analysis of these two classes of methods. Specifically, we show that the behavior of a series determined by the algorithm decides asymptotic convergence, generalizing existing results in both deterministic and randomized settings.
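The abstract does not state the series in question; purely as an illustration of how a series can drive asymptotic convergence in this literature, the classical direct-search argument runs as follows (standard material, not necessarily the talk's generalization):

```latex
% Classical series-based convergence argument for direct search
% (illustrative; the talk's unified, algorithm-determined series
%  generalizes this style of reasoning).
\begin{align*}
  &\text{successful iterations } k \in \mathcal{S}:\quad
    f(x_k) - f(x_{k+1}) \ge c\,\alpha_k^2
    \;\Longrightarrow\;
    \sum_{k \in \mathcal{S}} \alpha_k^2
      \le \frac{f(x_0) - \inf f}{c} < \infty, \\
  &\text{unsuccessful iterations:}\quad
    \|\nabla f(x_k)\| \le C\,\alpha_k
    \quad\text{(positive-spanning poll directions, Lipschitz } \nabla f\text{)}, \\
  &\text{hence } \alpha_k \to 0 \text{ and }
    \liminf_{k \to \infty} \|\nabla f(x_k)\| = 0.
\end{align*}
```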