Department of Mathematics,
University of California San Diego
Math 278B: Mathematics of Information, Data, and Signals
Anil Kamber
UCSD
On the Loss-Landscape Geometry of Deep Matrix Factorization
Abstract:
Understanding the loss-landscape geometry near a minimum is key to explaining the implicit bias of gradient-based methods in non-convex optimization problems such as deep neural network training and deep matrix factorization. A central quantity characterizing this geometry is the maximum eigenvalue of the Hessian of the loss. Its precise role has remained obscure because no exact expressions for this sharpness measure are known in general settings. In this talk, I will present an analysis that derives a closed-form expression for the maximum eigenvalue of the Hessian of an overparameterized deep matrix factorization problem with squared-error loss, and show that this expression reveals fundamental properties of the loss landscape. For instance, in depth-2 matrix factorization, flat minima correspond to spectral-norm-balanced minima. I will then discuss the broader implications of this analysis, including how ℓ2 regularization reshapes the loss landscape and the set of minimizers of the overparameterized deep matrix factorization problem.
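The sharpness measure and the depth-2 balancedness phenomenon described in the abstract can be checked numerically. The sketch below (an illustration, not the speaker's analysis; the problem sizes and factorizations are chosen for the example) assembles the Hessian of the squared-error loss L(A, B) = ½‖AB − M‖²_F by finite differences of the analytic gradient and compares its top eigenvalue at a balanced versus an unbalanced global minimum of the same product:

```python
import numpy as np

def loss_grad(A, B, M):
    # Squared-error loss L(A, B) = 0.5 * ||A B - M||_F^2 and its gradient.
    R = A @ B - M
    return 0.5 * np.sum(R**2), (R @ B.T, A.T @ R)

def hessian_max_eig(A, B, M, eps=1e-5):
    # Sharpness: largest eigenvalue of the Hessian of L at (A, B),
    # assembled by central finite differences of the analytic gradient.
    p = np.concatenate([A.ravel(), B.ravel()])
    n = p.size
    nA = A.size

    def grad_vec(q):
        Aq, Bq = q[:nA].reshape(A.shape), q[nA:].reshape(B.shape)
        _, (gA, gB) = loss_grad(Aq, Bq, M)
        return np.concatenate([gA.ravel(), gB.ravel()])

    H = np.zeros((n, n))
    for i in range(n):
        e = np.zeros(n)
        e[i] = eps
        H[:, i] = (grad_vec(p + e) - grad_vec(p - e)) / (2 * eps)
    H = 0.5 * (H + H.T)  # symmetrize away finite-difference noise
    return np.linalg.eigvalsh(H)[-1]

M = np.eye(2)
# Two global minima of the same factorization problem M = A B:
balanced = hessian_max_eig(np.eye(2), np.eye(2), M)            # A = B = I
unbalanced = hessian_max_eig(2 * np.eye(2), 0.5 * np.eye(2), M)  # A = 2I, B = I/2
print(balanced, unbalanced)
```

At a zero-residual minimum the Hessian quadratic form reduces to ‖ΔA·B + A·ΔB‖²_F, so the balanced minimum (sharpness 2) is flatter than the unbalanced one (sharpness 4.25), matching the correspondence between flatness and spectral-norm balance.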
April 24, 2026
11:00 AM
APM 2402
Research Areas
Mathematics of Information, Data, and Signals

