Optimization Algorithms Beyond Smoothness and Convexity
University of Washington
Stochastic iterative methods lie at the core of large-scale optimization and its modern applications to data science. Though such algorithms are routinely and successfully used in practice on highly irregular problems (e.g., deep learning), few performance guarantees are available outside of smooth or convex settings. In this talk, I will describe a framework for designing and analyzing stochastic gradient-type methods on a large class of nonsmooth and nonconvex problems. The problem class subsumes such important tasks as matrix completion, robust PCA, and minimization of risk measures, while the methods include stochastic subgradient, Gauss-Newton, and proximal point iterations. I will describe a number of results, including finite-time efficiency estimates, avoidance of extraneous saddle points, and asymptotic normality of averaged iterates.
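To fix ideas, here is a minimal sketch of one of the methods named above, the stochastic subgradient iteration, applied to a simple nonsmooth problem (robust regression with the absolute loss). The data, step-size schedule, and problem setup are illustrative choices, not taken from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic robust-regression data: minimize f(x) = (1/n) * sum_i |<a_i, x> - b_i|,
# a nonsmooth objective (illustrative stand-in for the talk's problem class).
n, d = 500, 10
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true

x = np.zeros(d)
for t in range(2000):
    i = rng.integers(n)                  # sample one data point uniformly
    r = A[i] @ x - b[i]                  # residual of the sampled term
    g = np.sign(r) * A[i]                # a subgradient of |<a_i, x> - b_i|
    x -= g / np.sqrt(t + 1)              # diminishing step size

print(np.mean(np.abs(A @ x - b)))       # final mean absolute residual
```

The diminishing step size `1/sqrt(t+1)` is the classical choice for subgradient methods; the efficiency estimates and asymptotic normality results mentioned in the abstract concern precisely this kind of iteration (and its averaged variants) on far more general nonsmooth, nonconvex objectives.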
Tuesday, May 10, 2022
11:00 AM, Zoom ID 954 6624 3503
Center for Computational Mathematics, 9500 Gilman Dr. #0112, La Jolla, CA 92093-0112, Tel: (858) 534-9056