Department of Mathematics,
University of California San Diego

****************************

Postdoc Seminar

Dr. Rishabh Dixit

University of California San Diego

Accelerated gradient methods for nonconvex optimization: asymptotic dynamics and saddle escape

Abstract:

This talk focuses on understanding the behavior of a general class of accelerated gradient methods on smooth nonconvex functions. Motivated by recent works that have proposed effective algorithms, based on Polyak’s heavy ball method and Nesterov’s accelerated gradient method, for achieving convergence to a local minimum of a nonconvex function, we describe a broad class of Nesterov-type accelerated methods and put forth a rigorous asymptotic study of these methods encompassing escape from saddle points and convergence to local minima. In the asymptotic regime, we first answer an open question of whether Nesterov’s accelerated gradient method (NAG) with variable momentum parameters avoids strict saddle points almost surely with respect to a random initialization and a random step size. We then develop two metrics of asymptotic rates of convergence and divergence, and evaluate these metrics near strict saddle points for several popular accelerated methods, including NAG and Nesterov’s accelerated gradient with constant momentum (NCM). Theoretical results on the asymptotic behavior are demonstrated on the phase retrieval problem.
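To make the iteration concrete, the sketch below shows a textbook form of NAG with a variable momentum parameter, run on a simple nonconvex test function that has a strict saddle at the origin. The momentum schedule, step size, and test function here are illustrative assumptions only; they are not the specific variants, schedules, or problem instances analyzed in the talk.

```python
import numpy as np

def nag(grad, x0, step=1e-2, iters=500):
    """Generic NAG with a variable momentum parameter (illustrative sketch)."""
    x_prev = np.asarray(x0, dtype=float)
    y = x_prev.copy()
    for t in range(1, iters + 1):
        mu = t / (t + 3.0)              # one common variable-momentum schedule (assumed)
        x = y - step * grad(y)          # gradient step at the extrapolated point
        y = x + mu * (x - x_prev)       # momentum (extrapolation) step
        x_prev = x
    return x_prev

# Example: f(x, y) = (x^2 - 1)^2 + y^2 has a strict saddle at (0, 0)
# (Hessian eigenvalues -4 and 2) and local minima at (+/-1, 0).
grad_f = lambda v: np.array([4.0 * v[0] * (v[0] ** 2 - 1.0), 2.0 * v[1]])
print(nag(grad_f, x0=np.random.randn(2)))  # typically lands near (+/-1, 0)
```

With a random initialization, the iterates in this toy example typically move away from the strict saddle at the origin and settle near one of the local minima, which is the kind of saddle-avoidance behavior the talk studies rigorously in the asymptotic regime.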

June 13, 2025

3:00 PM

APM 6402

****************************