
Minimum-norm Perturbations and Regularization in Modified Newton Methods for Unconstrained Optimization

Jeb Runnoe
UCSD

Abstract:

Modified Newton methods are designed to extend the desirable properties of the classical Newton method to a wider class of optimization problems. If the Hessian of the objective function is singular at the solution, these methods tend to behave like gradient descent, and rapid local convergence is lost. An adaptive regularization technique is described that yields a modified Newton method retaining superlinear local convergence on non-convex problems without a nonsingularity assumption at the solution. The minimum-norm perturbation and symmetric indefinite factorization used to construct a sufficiently positive definite approximate Hessian are discussed, and numerical results comparing regularized and standard modified Newton methods will be presented. Lastly, a well-behaved pathological example will be used to illustrate an assumption required for superlinear convergence.
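
For illustration only (this sketch is not the speaker's implementation, and the names min_norm_perturbation and modified_newton are hypothetical): a modified Newton iteration in which an indefinite or singular Hessian H(x) is replaced by a sufficiently positive definite B(x) = H(x) + E(x). Here E(x) is taken to be the minimum-norm symmetric perturbation that raises every eigenvalue of H(x) to at least a threshold delta; the method described in the abstract builds its modification from a symmetric indefinite factorization instead, so the eigenvalue form below is meant only to make the minimum-norm idea concrete.

import numpy as np

def min_norm_perturbation(H, delta=1e-8):
    # E = V diag(max(delta - lambda_i, 0)) V^T is the smallest symmetric
    # perturbation (in the 2- and Frobenius norms) for which H + E has all
    # eigenvalues >= delta.
    lam, V = np.linalg.eigh(H)
    shift = np.maximum(delta - lam, 0.0)
    return (V * shift) @ V.T

def modified_newton(f, grad, hess, x0, tol=1e-10, max_iter=200):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        H = hess(x)
        B = H + min_norm_perturbation(H)      # sufficiently positive definite
        p = np.linalg.solve(B, -g)            # modified Newton direction
        alpha, fx = 1.0, f(x)
        # Backtracking (Armijo) line search for global convergence.
        while f(x + alpha * p) > fx + 1e-4 * alpha * (g @ p) and alpha > 1e-12:
            alpha *= 0.5
        x = x + alpha * p
    return x

# Example objective whose Hessian is singular at the minimizer (0, 0).
f    = lambda x: x[0]**4 + x[1]**2
grad = lambda x: np.array([4.0 * x[0]**3, 2.0 * x[1]])
hess = lambda x: np.array([[12.0 * x[0]**2, 0.0], [0.0, 2.0]])
print(modified_newton(f, grad, hess, [1.0, 1.0]))

The example objective x1^4 + x2^2 has a Hessian that is singular at the minimizer (0, 0), the situation in which an unregularized modified Newton method typically degrades to slow, gradient-descent-like local convergence.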

Tuesday, March 14, 2023
11:00 AM, AP&M 2402 and Zoom ID 994 0149 1091