Session: Advances in commercial solvers
Chair: Robert Luce
Cluster: Computational Software
Talk 1: A Whole New Look for CONOPT
Speaker: Steven Dirkse
Abstract: Following GAMS' recent acquisition of CONOPT from ARKI Consulting & Development A/S, this presentation delves into the continued evolution of this robust nonlinear optimization solver, emphasizing the significant advancements introduced in the latest release and the strategic implications of the new ownership. Recent updates have optimized the active set method at CONOPT’s core, leading to measurable performance improvements across a diverse set of test cases. These enhancements boost computational efficiency, stability, and accuracy. The latest iteration of CONOPT introduces new APIs for C++ and Python, enabling clean, efficient, and robust integration into software environments and projects that require nonlinear optimization. Finally, we will demonstrate how to provide derivatives to CONOPT, an important step that is often necessary to achieve the best possible performance.
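Example (illustrative sketch): the benefit of supplying derivatives applies to any gradient-based NLP solver. The short Python sketch below uses scipy.optimize as a generic stand-in, not CONOPT's new Python API; passing an analytical gradient via the jac argument replaces finite-difference approximations, which typically reduces the number of function evaluations and improves accuracy.

    # Generic stand-in (scipy.optimize), not CONOPT's API: shows the effect of
    # supplying an analytical gradient instead of relying on finite differences.
    import numpy as np
    from scipy.optimize import minimize

    def rosenbrock(x):
        # Classic nonconvex test function.
        return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

    def rosenbrock_grad(x):
        # Hand-coded analytical gradient of the Rosenbrock function.
        return np.array([
            -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
            200.0 * (x[1] - x[0] ** 2),
        ])

    x0 = np.array([-1.2, 1.0])

    # Without derivatives, the solver estimates gradients by finite differences.
    res_fd = minimize(rosenbrock, x0, method="L-BFGS-B")

    # With derivatives, the exact gradient is supplied through `jac`.
    res_exact = minimize(rosenbrock, x0, jac=rosenbrock_grad, method="L-BFGS-B")

    print("finite differences: ", res_fd.nfev, "function evaluations")
    print("analytical gradient:", res_exact.nfev, "function evaluations")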
Talk 2: Continuous Optimization in MATLAB
Speaker: Shengchao Lin
Abstract: This presentation highlights MATLAB's optimization capabilities. Key topics include the broad range of applications supported by the Optimization Toolbox and the Global Optimization Toolbox, the problem-based optimization setup for clearer modeling, and code generation and deployment of optimization algorithms. Together, these capabilities demonstrate MATLAB's evolving role as a powerful platform for optimization and modeling in engineering and science.
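Example (illustrative sketch): the "problem-based" setup means declaring decision variables and stating the objective and constraints symbolically before handing the assembled problem to a solver. The sketch below shows that modeling style in Python using cvxpy as a stand-in; it is not MATLAB Toolbox code.

    # Problem-based modeling style, illustrated with cvxpy (a stand-in, not MATLAB).
    import cvxpy as cp

    x = cp.Variable(2, nonneg=True)                     # decision variables
    objective = cp.Minimize(cp.sum_squares(x - [1.0, 2.0]))
    constraints = [x[0] + x[1] <= 2.0]                  # symbolic constraint
    problem = cp.Problem(objective, constraints)        # assemble the problem
    problem.solve()                                     # hand off to a solver
    print("optimal x:", x.value)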
Talk 3: The Latest Developments in the Knitro Optimization Solver
Speaker: Richard Waltz
Abstract: Knitro was originally developed in the 1990s as an interior-point algorithm for general nonlinear, non-convex optimization. Over the years, Knitro evolved into a more general optimization toolbox that includes an active-set LP-based solver, a Sequential Quadratic Programming (SQP) solver, specialized LP, QP, and SOCP solvers, a branch-and-bound solver for mixed-integer programming, and multi-start heuristics for global optimization. To add to this toolbox of algorithms, we have recently started developing first-order methods that do not require any matrix factorizations. The hope is that these may provide useful solutions for extremely large-scale models where the factorizations in interior-point methods become too expensive. In this talk, we will present some of this work, primarily based on augmented Lagrangian (AL) type methods.
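Example (illustrative sketch): a generic first-order augmented Lagrangian scheme for min f(x) subject to c(x) = 0 works with L_rho(x, lambda) = f(x) + lambda' c(x) + (rho/2) ||c(x)||^2, minimizes it approximately using gradient steps only (so no matrix factorizations are needed), and then updates the multipliers via lambda <- lambda + rho c(x). The Python sketch below is a textbook version of this idea, not Knitro's implementation.

    # Generic first-order augmented Lagrangian loop (not Knitro's algorithm).
    # Problem: minimize 0.5*||x||^2 subject to x[0] + x[1] = 1,
    # solved with gradients only, so no factorizations are required.
    import numpy as np

    def grad_f(x):                         # gradient of f(x) = 0.5*||x||^2
        return x

    def c(x):                              # equality constraint residual
        return np.array([x[0] + x[1] - 1.0])

    def jac_c(x):                          # constraint Jacobian (1 x 2)
        return np.array([[1.0, 1.0]])

    def grad_aug_lagrangian(x, lam, rho):
        # Gradient of L_rho(x, lam) = f(x) + lam'c(x) + (rho/2)*||c(x)||^2.
        return grad_f(x) + jac_c(x).T @ (lam + rho * c(x))

    x, lam, rho = np.zeros(2), np.zeros(1), 10.0
    for outer in range(20):
        for inner in range(200):           # approximate subproblem solve, first-order only
            x = x - 0.05 * grad_aug_lagrangian(x, lam, rho)
        lam = lam + rho * c(x)             # first-order multiplier update
        if np.linalg.norm(c(x)) < 1e-8:
            break

    print("x =", x, " constraint residual =", c(x))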