Session: Distributionally Robust Optimization (DRO) II
Chair: Yiling Zhang
Talk 1: Distributionally robust standard quadratic optimization with Wasserstein ambiguity
Speaker: Daniel de Vicente
Abstract: The standard quadratic optimization problem (StQP) consists of minimizing a quadratic form over the standard simplex. If the quadratic form is neither convex nor concave, the StQP is NP-hard. This problem has many interesting applications ranging from portfolio optimization to machine learning. Sometimes the data matrix is uncertain, but some information about its distribution can be inferred, e.g. its first two moments or a reference distribution (typically, the empirical distribution after sampling). In distributionally robust optimization, the goal is to minimize the worst case over all distributions in an ambiguity set defined by such characteristics. We will explore two versions: the distributionally robust stochastic StQP, focusing on expectations, and the distributionally robust chance-constrained StQP, both with an ambiguity set based upon a maximal Wasserstein distance to the sampled distribution.
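For context, a minimal sketch of the formulations described above, in notation assumed here rather than taken from the abstract: the nominal StQP minimizes a quadratic form over the standard simplex, and the distributionally robust stochastic variant takes the worst case over a Wasserstein ball of radius ε around the empirical distribution:

```latex
% Nominal StQP over the standard simplex
\min_{x \in \Delta_n} \; x^\top Q x,
\qquad
\Delta_n = \{ x \in \mathbb{R}^n_+ : e^\top x = 1 \}.

% Distributionally robust stochastic StQP with Wasserstein ambiguity:
% worst case over all distributions within Wasserstein distance
% \varepsilon of the empirical distribution \hat{\mathbb{P}}_N
\min_{x \in \Delta_n} \;
\sup_{\mathbb{P} \,:\, W(\mathbb{P}, \hat{\mathbb{P}}_N) \le \varepsilon} \;
\mathbb{E}_{Q \sim \mathbb{P}} \bigl[ x^\top Q x \bigr].
```

In the chance-constrained variant, the expectation is replaced by a probabilistic constraint required to hold under every distribution in the same ball.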
Talk 2: A Primal Perspective on Distributionally Robust Two-Stage Problems with Integer Recourse
Speaker: Yiling Zhang
Abstract: In this talk, we introduce and study two-stage distributionally robust linear problems with integer recourse, where the objective coefficients are random. The random parameters follow the worst-case distribution belonging to a second-order conic representable ambiguity set of probability distributions. We show that the worst-case recourse objective, under various risk measures, can be formulated as a conic program from a primal perspective. This method also provides additional information on the probability of attaining an integer recourse solution, extending the concept of persistency studied in Bertsimas et al. (2006). Unlike the marginal moment sets used in Bertsimas et al. (2006), the second-order conic representable ambiguity sets in our method offer greater flexibility by incorporating more distributional information. Furthermore, this method enables column-and-constraint generation methods for solving two-stage problems with integer recourse.
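As a sketch of the problem class described above (symbols assumed for illustration, not taken from the talk): the first-stage decision x is chosen against the worst-case expected value of an integer recourse problem whose objective coefficients q(ξ) are random:

```latex
% Two-stage DRO: first-stage cost plus worst-case expected recourse
% over the ambiguity set \mathcal{P}
\min_{x \in X} \; c^\top x \;+\;
\sup_{\mathbb{P} \in \mathcal{P}} \;
\mathbb{E}_{\mathbb{P}} \bigl[ Q(x, \xi) \bigr],

% Integer recourse with random objective coefficients q(\xi)
Q(x, \xi) \;=\; \min_{y \in \mathbb{Z}^p_+} \;
\bigl\{ \, q(\xi)^\top y \;:\; W y \ge h - T x \, \bigr\}.
```

The primal perspective in the talk concerns reformulating the inner sup–E term as a conic program when the ambiguity set is second-order conic representable.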
Talk 3: Distributionally Robust Nonlinear Optimization
Speaker: Judith Brugman
Abstract: Distributionally robust optimization (DRO) provides a powerful framework for handling uncertainty when only partial information, such as mean, variance and support, is available. Instead of assuming full knowledge of the probability distribution, DRO seeks solutions that perform well under the worst-case distribution within an ambiguity set. While DRO problems can be reformulated as robust optimization (RO) problems, making them more tractable while maintaining theoretical guarantees, solving the resulting RO problem remains challenging. Wiesemann et al. (2014) address this problem for a very rich class of ambiguity sets, but their approach relies on a max-of-linear-functions assumption on the cost function, limiting its applicability. In our work, we extend this approach to a much broader class of cost functions, including all convex and twice continuously differentiable functions. By leveraging the Reformulation-Perspectification Technique with Branch and Bound (RPT-BB) for RO, which combines relaxation-based partitioning with branch-and-bound techniques, we show that DRO problems can be efficiently solved even for highly nonlinear functions. To demonstrate the practical relevance of this approach, I will focus on the appointment scheduling problem, where our method not only generalizes existing results but also improves computational efficiency. I will conclude with a discussion on broader applications of our framework in other domains.
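A minimal sketch of the general DRO setting described above, with a moment-and-support ambiguity set as one example of the partial information mentioned (notation assumed for illustration):

```latex
% DRO: minimize the worst-case expected cost over the ambiguity set
\min_{x \in X} \;
\sup_{\mathbb{P} \in \mathcal{P}} \;
\mathbb{E}_{\mathbb{P}} \bigl[ f(x, \xi) \bigr],

% Example ambiguity set fixing the mean, bounding second moments,
% and restricting the support to \Xi
\mathcal{P} = \Bigl\{ \mathbb{P} \;:\;
\mathbb{E}_{\mathbb{P}}[\xi] = \mu, \;\;
\mathbb{E}_{\mathbb{P}}\bigl[(\xi-\mu)(\xi-\mu)^\top\bigr] \preceq \Sigma, \;\;
\mathbb{P}(\xi \in \Xi) = 1 \Bigr\}.
```

The max-of-linear-functions assumption mentioned in the abstract restricts f(x, ·) to a pointwise maximum of affine pieces; the extension presented in the talk admits any convex, twice continuously differentiable f.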