Session: Derivative-free optimization for special classes of problems II
Chair: Ana Luísa Custódio
Cluster: Derivative-free Optimization
Talk 1: A direct multisearch approach for many-objective derivative-free optimization
Speaker: Ana Luísa Custódio
Abstract: From a theoretical point of view, Direct Multisearch (DMS) was developed for continuous constrained multiobjective derivative-free optimization with a general number of objective function components. However, its algorithmic performance was never assessed on problems with more than three objectives. In this work, we propose DMS-Reduction, a variant of DMS based on reduction approaches that use correlation and sketching techniques. The approach is an attempt to tackle larger problems, both in the number of components of the objective function and in the number of variables. At each iteration, reducing the number of objective function components to be optimized has a possible additional benefit: a reduction in the number of variables to be considered, since not all variables may be related to the selected objective function components. We will detail the proposed algorithmic structure and report promising numerical results on many-objective optimization problems.
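To make the correlation-based reduction idea concrete, here is a minimal Python sketch (a hypothetical illustration, not the DMS-Reduction implementation; the correlation threshold and the greedy selection rule are assumptions). The toy example also shows the variable-reduction side effect mentioned above: once a redundant component is dropped, one variable no longer appears in any selected component.

```python
import numpy as np

def select_objectives_by_correlation(F, threshold=0.9):
    """Greedily keep one representative per group of highly correlated
    objective components.

    F : (n_samples, m) array of objective values at sampled points.
    A component is dropped when its absolute sample correlation with an
    already selected component exceeds `threshold`.
    """
    corr = np.corrcoef(F, rowvar=False)          # m-by-m correlation matrix
    selected = []
    for j in range(F.shape[1]):
        if all(abs(corr[j, k]) < threshold for k in selected):
            selected.append(j)
    return selected

# Toy problem: three objective components in three variables.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 3))
F = np.column_stack([
    X[:, 0] ** 2,                    # f0 depends on x0 only
    np.sin(X[:, 1]),                 # f1 depends on x1 only
    X[:, 0] ** 2 + 0.05 * X[:, 2],   # f2 nearly duplicates f0
])
print(select_objectives_by_correlation(F))
# -> [0, 1]: f2 is dropped, and with it variable x2 leaves the problem
```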
Talk 2: Optimal zeroth-order methods for bilevel optimization
Speaker: Saeed Ghadimi
Abstract: In this talk, we present fully zeroth-order stochastic approximation algorithms for solving stochastic bilevel optimization problems, assuming that neither the upper- and lower-level objective functions nor their unbiased gradient estimates are available. To do so, we first generalize the Gaussian convolution technique to functions with two block variables and establish the corresponding relationships between such functions and their smoothed Gaussian approximations. Using these results, we estimate the first- and second-order derivatives of the objective functions and provide a fully zeroth-order double-loop algorithm whose sample complexity is optimal in its dependence on the target accuracy, while polynomially dependent on the problem dimension. Furthermore, building on recent developments in designing fully first-order methods for bilevel optimization, we provide a second fully zeroth-order bilevel optimization algorithm whose sample complexity is optimal in terms of both the target accuracy and the problem dimension.
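The basic building block, a Gaussian-smoothing gradient estimator applied block by block, can be sketched as follows. This is a hypothetical illustration of the zeroth-order estimation idea only (the test function, the smoothing radius mu, and the forward-difference form are assumptions; the talk's algorithms additionally estimate second-order derivatives and embed such estimators in double-loop and fully first-order-style schemes).

```python
import numpy as np

def zo_block_gradients(f, x, y, mu=1e-3, n_samples=64, rng=None):
    """Zeroth-order estimates of the partial gradients of f(x, y),
    one block variable at a time.

    Each sample perturbs a single block with a standard Gaussian
    direction u and averages the forward differences
        (f(x + mu*u, y) - f(x, y)) / mu * u,
    an unbiased estimator of the gradient of the Gaussian-smoothed
    surrogate of f in that block.
    """
    rng = rng or np.random.default_rng()
    f0 = f(x, y)
    gx, gy = np.zeros_like(x), np.zeros_like(y)
    for _ in range(n_samples):
        u = rng.standard_normal(x.shape)
        gx += (f(x + mu * u, y) - f0) / mu * u
        v = rng.standard_normal(y.shape)
        gy += (f(x, y + mu * v) - f0) / mu * v
    return gx / n_samples, gy / n_samples

# Quadratic check: f(x, y) = ||x||^2 + x.y, so grad_x = 2x + y, grad_y = x.
f = lambda x, y: x @ x + x @ y
x, y = np.ones(3), np.full(3, 0.5)
gx, gy = zo_block_gradients(f, x, y, n_samples=20000,
                            rng=np.random.default_rng(1))
print(gx, gy)  # approx [2.5, 2.5, 2.5] and [1, 1, 1]
```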
Talk 3: Derivative-free Frank-Wolfe recursion for optimization over probability spaces
Speaker: Raghu Pasupathy
Abstract: It has recently been observed that a number of important problem classes, e.g., statistical experimental design, continuous linear programming, and distributionally robust optimization (DRO), are usefully viewed as constrained optimization problems over the space of probability measures. Various aspects, including the fact that the space of probability measures is not a linear space, make a primal recursion such as Frank-Wolfe (suitably adapted to operate on probability spaces) a natural choice of solution method. In this talk, we present a framework that combines interior point ideas with a Frank-Wolfe recursion to minimize linearly constrained convex functionals over probability spaces. We emphasize a derivative-free version of this framework that estimates the von Mises derivative through an analogue of finite differencing in the Euclidean context. We will illustrate the framework through several examples and numerical experiments.
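As a minimal illustration of the derivative-free idea on a discretized probability space, consider the following Python sketch (the functional G, the mixture-path finite difference, and the classical step size are assumptions; the interior point component and general linear constraints of the framework are omitted, leaving only the probability simplex itself).

```python
import numpy as np

def dfo_frank_wolfe_simplex(G, n, iters=2000, t=1e-4):
    """Derivative-free Frank-Wolfe for min G(p) over the probability
    simplex, a finite-support stand-in for a probability space.

    The first variation (von Mises derivative) of G at p toward the
    point mass delta_i is estimated by finite differencing along the
    mixture path (1-t)*p + t*delta_i; the Frank-Wolfe vertex is the
    atom with the smallest estimate.
    """
    p = np.full(n, 1.0 / n)            # start at the uniform measure
    for k in range(iters):
        Gp = G(p)
        deriv = np.empty(n)
        for i in range(n):
            e = np.zeros(n)
            e[i] = 1.0
            deriv[i] = (G((1 - t) * p + t * e) - Gp) / t
        i_star = np.argmin(deriv)      # linear minimization oracle
        gamma = 2.0 / (k + 2)          # classical Frank-Wolfe step size
        p = (1 - gamma) * p
        p[i_star] += gamma
    return p

# Example: minimize the chi-square divergence to a target measure q;
# the minimizer is q itself.
q = np.array([0.1, 0.2, 0.3, 0.4])
G = lambda p: np.sum((p - q) ** 2 / q)
print(dfo_frank_wolfe_simplex(G, 4))   # -> close to q
```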