
A Limited-Memory Reduced Hessian Method for Bound-Constrained Optimization

by Michael W. Ferry, Philip E. Gill, Elizabeth Wong, Minxin Zhang

Abstract:

Quasi-Newton methods for unconstrained optimization accumulate approximate curvature information in a sequence of expanding subspaces. This allows an approximate Hessian to be represented by a smaller reduced Hessian matrix whose dimension increases at each iteration. When the number of variables is large, this feature may be used to define limited-memory reduced-Hessian (L-RH) methods in which the dimension of the reduced Hessian is limited to save storage. In this paper a limited-memory reduced-Hessian method is proposed for the solution of large-scale optimization problems with upper and lower bounds on the variables. The method uses a projected search to identify the variables that are on their bounds at a solution. Conventional projected-search methods are restricted to an Armijo-like line search. However, by modifying the line-search conditions, a new projected line search based on the Wolfe conditions is formulated that retains many of the benefits of a Wolfe line search in the unconstrained case. Numerical results are presented for the software package LRHB, which implements a limited-memory reduced-Hessian method based on the Broyden-Fletcher-Goldfarb-Shanno (BFGS) approximate Hessian. It is shown that LRHB is competitive with the code L-BFGS-B on the unconstrained and bound-constrained problems in the CUTEst test collection.
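To make the "conventional" baseline concrete, the following is a minimal sketch of an Armijo-like projected backtracking line search of the kind the abstract contrasts with the paper's Wolfe-based variant. All names (`project`, `projected_armijo_search`) and parameter values are illustrative assumptions, not the authors' implementation; the paper's method additionally enforces a curvature (Wolfe) condition along the projected path.

```python
import numpy as np

def project(x, lower, upper):
    """Project a point componentwise onto the box [lower, upper]."""
    return np.minimum(np.maximum(x, lower), upper)

def projected_armijo_search(f, grad, x, d, lower, upper,
                            c1=1e-4, shrink=0.5, max_iter=30):
    """Backtracking search along the projected path x(a) = P(x + a*d),
    accepting the first step that satisfies an Armijo-style
    sufficient-decrease test measured against the step actually taken.
    Illustrative sketch only."""
    f0 = f(x)
    g0 = grad(x)
    a = 1.0
    for _ in range(max_iter):
        x_new = project(x + a * d, lower, upper)
        # Sufficient decrease relative to the projected step x_new - x
        if f(x_new) <= f0 + c1 * np.dot(g0, x_new - x):
            return x_new, a
        a *= shrink
    return x_new, a
```

For example, minimizing f(x) = ||x||^2 from x = (2, 2) along the steepest-descent direction with lower bound 0 takes a unit projected step straight onto the bound-constrained minimizer at the origin, with both bounds identified as active in a single search.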

UCSD-CCoM-21-01.pdf   May 2021