Minimum-norm Perturbations and Regularization in Modified Newton Methods for Unconstrained Optimization
Modified Newton methods are designed to extend the desirable properties of the classical Newton method to a wider class of optimization problems. If the Hessian of the objective function is singular at the solution, these methods tend to behave like gradient descent and rapid local convergence is lost. An adaptive regularization technique is described that yields a modified Newton method retaining superlinear local convergence on nonconvex problems without any nonsingularity assumption at the solution. The minimum-norm perturbation and symmetric indefinite factorization used to construct a sufficiently positive-definite approximate Hessian are discussed, and numerical results comparing regularized and standard modified Newton methods are presented. Lastly, a well-behaved pathological example illustrates an assumption required for superlinear convergence.
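As a rough illustration of the idea, the sketch below builds a sufficiently positive-definite modification of an indefinite Hessian and uses it in a Newton-type step. This is only a minimal stand-in: the talk's method uses a symmetric indefinite factorization, whereas here an eigendecomposition is used for clarity, and the threshold `delta` and function names are illustrative, not taken from the talk. Lifting the eigenvalues below `delta` up to `delta` gives the minimum-norm (in both the 2- and Frobenius norms) symmetric perturbation achieving eigenvalues of at least `delta`.

```python
import numpy as np

def min_norm_pd_modification(H, delta=1e-8):
    """Return H + E, where E is the minimum-norm symmetric perturbation
    making all eigenvalues of the result at least delta.
    Illustrative stand-in: uses an eigendecomposition rather than the
    symmetric indefinite factorization described in the talk."""
    w, V = np.linalg.eigh(H)
    w_mod = np.maximum(w, delta)      # lift eigenvalues below delta
    return V @ np.diag(w_mod) @ V.T

def modified_newton_step(grad, H, delta=1e-8):
    """Solve H_hat p = -grad with the modified Hessian H_hat."""
    H_hat = min_norm_pd_modification(H, delta)
    return np.linalg.solve(H_hat, -grad)

# Example with an indefinite Hessian: the modified step is still a
# descent direction (grad @ p < 0), unlike the raw Newton step.
H = np.array([[2.0, 0.0], [0.0, -1.0]])
g = np.array([1.0, 1.0])
p = modified_newton_step(g, H)
```

Because the modified Hessian agrees with the true Hessian wherever the latter is already sufficiently positive definite, the step reduces to the exact Newton step near a well-conditioned minimizer.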
Tuesday, March 14, 2023
11:00 AM, AP&M 2402 and Zoom ID 994 0149 1091
Center for Computational Mathematics
9500 Gilman Dr. #0112
La Jolla, CA 92093-0112
Tel: (858) 534-9056