Adaptive Convexification of Second-Derivative SQP Methods
The class of sequential quadratic programming (SQP) methods solves nonlinear constrained optimization problems by solving a related sequence of simpler problems. Each SQP subproblem minimizes a quadratic model of the Lagrangian function subject to linearized constraints. In contrast to the quasi-Newton approach, which maintains a positive-definite Hessian approximation, second-derivative SQP methods use the exact Hessian of the Lagrangian. In this context, we will discuss an adaptive convexification strategy that makes minimal matrix modifications while ensuring that the subproblem iterates are bounded and that the solution defines a descent direction for the relevant Lagrangian. This talk will focus on adaptive convexification of stabilized SQP methods, as well as their connection with primal-dual interior methods.
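As background for the convexification idea: when the exact Hessian of the Lagrangian is indefinite, a common remedy is to add a multiple of the identity until a Cholesky factorization succeeds. The sketch below illustrates this generic diagonal-shift scheme only; the shift loop, its parameters, and the function name are illustrative assumptions, not the adaptive strategy presented in the talk.

```python
import numpy as np

def convexify(H, tau0=1e-4, growth=10.0, max_tries=50):
    """Return (H + tau*I, tau) for the smallest tried shift tau that
    makes the matrix positive definite, detected by a successful
    Cholesky factorization. Generic sketch; parameters are illustrative.
    """
    n = H.shape[0]
    tau = 0.0
    for _ in range(max_tries):
        try:
            np.linalg.cholesky(H + tau * np.eye(n))  # fails if not SPD
            return H + tau * np.eye(n), tau
        except np.linalg.LinAlgError:
            # Increase the shift geometrically and try again.
            tau = tau0 if tau == 0.0 else tau * growth
    raise RuntimeError("convexification failed within max_tries")

# Example: an indefinite Hessian with eigenvalues 2 and -1.
H = np.array([[2.0, 0.0],
              [0.0, -1.0]])
H_mod, tau = convexify(H)
```

A uniform shift like this can be far larger than necessary; the adaptive strategy discussed in the talk aims for more economical modifications while still guaranteeing descent.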
Tuesday, December 5, 2023
11:00 AM, AP&M 2402 and Zoom (Meeting ID: 915 4615 4399)
Center for Computational Mathematics, 9500 Gilman Dr. #0112, La Jolla, CA 92093-0112, Tel: (858) 534-9056