Session: Derivative-free optimization for special classes of problems I
Chair: Clément Royer
Cluster: Derivative-free Optimization
Talk 1: A derivative-free algorithm for continuous submodular optimization
Speaker: Clément Royer
Abstract: Submodular functions are a classical concept in discrete optimization that also extends to the continuous setting. In particular, the class of continuous submodular functions encompasses some nonconvex functions arising in natural language processing, which partly explains the renewed interest in this topic in recent years. In this talk, we propose a derivative-free algorithm for submodular optimization over compact sets, adapted from a classical framework for bound-constrained derivative-free optimization. By leveraging properties of submodular functions, we obtain complexity guarantees for this method that represent a significant improvement over those available in the general nonconvex setting. We then investigate the practical behavior of our method on our problems of interest.
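The abstract does not specify which bound-constrained framework is adapted; purely as a point of reference, here is a minimal sketch of classical coordinate (direct) search on a box. The function name `coordinate_search` and all parameters are illustrative, and none of the submodular-specific ingredients from the talk are shown:

```python
import numpy as np

def coordinate_search(f, x0, lb, ub, alpha=0.5, tol=1e-6, max_iter=1000):
    """Direct search over the 2n coordinate directions on the box
    [lb, ub]: accept the first polled point that decreases f, and
    halve the step size after an unsuccessful poll."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    fx = f(x)
    directions = np.vstack([np.eye(n), -np.eye(n)])
    for _ in range(max_iter):
        improved = False
        for d in directions:
            y = np.clip(x + alpha * d, lb, ub)   # stay inside the bounds
            fy = f(y)
            if fy < fx:
                x, fx, improved = y, fy, True
                break
        if not improved:
            alpha *= 0.5                          # unsuccessful poll: shrink
            if alpha < tol:
                break
    return x, fx

# Illustrative run on a smooth quadratic over the box [0, 1]^2.
x, fx = coordinate_search(lambda z: float(np.sum((z - 0.3) ** 2)),
                          x0=[1.0, 1.0], lb=0.0, ub=1.0)
print(np.round(x, 4))   # close to [0.3, 0.3]
```

The opportunistic poll (accepting the first improving point) is one standard design choice; complexity analyses of such methods typically count the number of function evaluations until the step size falls below a threshold.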
Talk 2: The cosine measure of a function
Speaker: Gabriel Jarry-Bolduc
Abstract: The cosine measure of a set of vectors is a valuable tool in derivative-free optimization for judging the quality of that set: it quantifies how uniformly the vectors cover the space R^n. A set of vectors is a positive spanning set of R^n if and only if its cosine measure is greater than zero. An important property of positive spanning sets is that, whenever the gradient of a function at a point is well-defined and nonzero, the set contains at least one descent direction (and one ascent direction) of the function at that point. This is not necessarily true when the gradient is the zero vector or when the gradient does not exist. To characterize these two cases, the novel concept of the cosine measure of a function is introduced in this talk. It provides an infimum on the cosine measure of any set of vectors guaranteed to contain a descent direction of the function at the point of interest. It is shown how to compute the cosine measure of a function, in theory, for popular classes of nonsmooth functions.
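As a rough illustration of the definition recalled above (the cosine measure of a finite set D is the minimum, over all unit vectors v, of the largest cosine between v and an element of D), here is a Monte Carlo sketch; the function name and sample count are assumptions, and sampling can only overestimate the true minimum:

```python
import numpy as np

def cosine_measure(D, n_samples=100_000, seed=0):
    """Monte Carlo estimate of the cosine measure of the set of
    vectors given by the rows of D: the minimum, over unit vectors v,
    of the best cosine achieved between v and an element of D."""
    rng = np.random.default_rng(seed)
    U = np.asarray(D, dtype=float)
    U = U / np.linalg.norm(U, axis=1, keepdims=True)   # normalize the set
    V = rng.normal(size=(n_samples, U.shape[1]))       # random directions...
    V /= np.linalg.norm(V, axis=1, keepdims=True)      # ...made unit length
    return float((V @ U.T).max(axis=1).min())

# {e1, e2, -e1-e2} positively spans R^2, so its cosine measure is positive...
print(cosine_measure([[1, 0], [0, 1], [-1, -1]]) > 0)   # True

# ...while {e1, e2} leaves the third quadrant of R^2 uncovered.
print(cosine_measure([[1, 0], [0, 1]]) > 0)             # False
```

The sign of the estimate reflects the characterization in the abstract: a positive value certifies nothing by itself, but a negative value exhibits a sampled direction along which no vector of the set is a descent direction generator.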
Talk 3: Using resilient positive spanning sets to deal with stragglers
Speaker: Sébastien Kerleau
Abstract: Positive spanning sets (PSSs) are families of vectors that span a given linear space through non-negative linear combinations. Such sets are of particular interest for their use in derivative-free optimization (DFO) algorithms. In that context, evaluating the objective function at a given point can be particularly expensive, taking up to weeks for a single evaluation. Although time can partly be saved by running multiple function evaluations in parallel, the issue of dealing with stragglers (function evaluations that take significantly longer than the others) remains to be solved. With that goal in mind, this talk will study a subclass of PSSs whose properties allow for an improvement of classical DFO algorithms, making them resilient to the presence of stragglers. The talk will end with numerical experiments studying the efficiency of the resulting resilient algorithm.
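As a toy illustration of the resilience property at stake (a PSS that remains a PSS after the vectors whose evaluations straggle are discarded), here is a randomized sketch. This brute-force deletion check is not the subclass or algorithm from the talk, and all names are illustrative:

```python
import numpy as np
from itertools import combinations

def is_pss(D, n_samples=50_000, seed=0):
    """Randomized check that the rows of D positively span R^n:
    every sampled unit direction v must make an acute angle with
    at least one row d of D (i.e. v . d > 0 for some d)."""
    rng = np.random.default_rng(seed)
    D = np.asarray(D, dtype=float)
    V = rng.normal(size=(n_samples, D.shape[1]))
    V /= np.linalg.norm(V, axis=1, keepdims=True)
    return bool(((V @ D.T) > 0).any(axis=1).all())

def survives_stragglers(D, k):
    """True if D stays a PSS after deleting ANY k of its vectors,
    so a poll step can proceed while k evaluations are still running."""
    D = np.asarray(D, dtype=float)
    return all(is_pss(np.delete(D, list(idx), axis=0))
               for idx in combinations(range(D.shape[0]), k))

# The minimal PSS {e1, e2, -e1-e2} of R^2 cannot afford to lose a vector...
minimal = [[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]]
print(is_pss(minimal), survives_stragglers(minimal, 1))   # True False

# ...whereas six evenly spaced directions tolerate any single straggler.
hexagon = [[np.cos(t), np.sin(t)] for t in np.arange(6) * np.pi / 3]
print(is_pss(hexagon), survives_stragglers(hexagon, 1))   # True True
```

The trade-off the sketch exposes is the one motivating the talk: redundancy in the direction set costs extra evaluations per iteration but buys tolerance to slow or missing evaluations.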