Stability and the Role of Overparameterization in Low-Rank Matrix Recovery Problems

17.03.2022 14:50 - 15:35

Dominik Stöger (Katholische Universität Eichstätt-Ingolstadt)


Abstract: Low-rank matrix recovery from structured measurements has been a topic of intense study over the last decade, with important applications ranging from matrix completion in recommender systems to blind deconvolution in imaging and wireless communications. An important benchmark method for these problems is to minimize the nuclear norm, a convex proxy for the rank. In the first part of my talk, I will discuss how blind deconvolution, a problem arising in wireless communications, can be formulated within this framework. However, due to the highly structured nature of the measurement process, a rigorous analysis is challenging. I will discuss the noise robustness of the nuclear-norm minimization approach for this problem. In the second part of my talk, I will focus on the problem of learning a low-rank matrix from a few observations via a non-convex, overparameterized model.
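For readers unfamiliar with the benchmark method mentioned above, the standard convex formulation (not specific to this talk) reads as follows, where the measurements y are assumed to come from a linear operator applied to the unknown low-rank matrix:

```latex
% Nuclear-norm minimization for low-rank recovery (standard formulation):
% \mathcal{A} is the linear measurement operator and \|X\|_* denotes the
% nuclear norm, i.e. the sum of the singular values of X.
\min_{X \in \mathbb{R}^{n_1 \times n_2}} \; \|X\|_* \quad \text{subject to} \quad \mathcal{A}(X) = y .
```

Blind deconvolution fits this framework after "lifting": the measurements, which are bilinear in the unknown pair (h, x), become linear measurements of the rank-one matrix X = h x*, so recovering (h, x) reduces to recovering a low-rank matrix.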

I will show that gradient descent with small random initialization finds the underlying low-rank matrix despite the presence of many other global optima. Notably, this analysis is not in the “lazy training” regime; it rests on an intriguing phenomenon that reveals the critical role of small random initialization: the first few iterations of gradient descent behave akin to popular spectral methods. I will also discuss how this connects to the phenomenon of implicit regularization and generalization in overparameterized models in modern machine learning.
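As a toy illustration of the second part of the abstract (a minimal sketch in plain Python, assuming a fully observed rank-one ground truth rather than the talk's setting of few observations): gradient descent on an overparameterized factorization X = U Vᵀ, started from a small random initialization, converges to the low-rank matrix even though the factors could represent any full-rank matrix.

```python
import random

def matmul(A, B):
    # Multiply two matrices given as lists of rows.
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

def sub(A, B):
    return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def frob(A):
    # Frobenius norm.
    return sum(x * x for row in A for x in row) ** 0.5

# Rank-one ground truth M = a b^T (an illustrative example, not from the talk).
a = [1.0, 0.5, -0.5]
b = [1.0, -1.0, 0.5]
M = [[ai * bj for bj in b] for ai in a]

# Overparameterized factors: U, V are full 3x3, so X = U V^T could have rank 3,
# yet gradient descent from a *small* random initialization still finds the
# rank-one matrix M.
random.seed(0)
n = 3
U = [[1e-3 * random.gauss(0, 1) for _ in range(n)] for _ in range(n)]
V = [[1e-3 * random.gauss(0, 1) for _ in range(n)] for _ in range(n)]

lr = 0.1  # step size for gradient descent on f(U,V) = 0.5 * ||U V^T - M||_F^2
for _ in range(3000):
    R = sub(matmul(U, transpose(V)), M)   # residual  U V^T - M
    GU = matmul(R, V)                     # gradient w.r.t. U:  (U V^T - M) V
    GV = matmul(transpose(R), U)          # gradient w.r.t. V:  (U V^T - M)^T U
    U = [[U[i][j] - lr * GU[i][j] for j in range(n)] for i in range(n)]
    V = [[V[i][j] - lr * GV[i][j] for j in range(n)] for i in range(n)]

err = frob(sub(matmul(U, transpose(V)), M))
```

The small initialization scale (1e-3 here) is essential to the phenomenon described in the abstract: during the first iterations the iterates grow predominantly along the top singular direction of M, mimicking a spectral method, before converging to the low-rank solution.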


Organiser:

R. I. Boţ

Location:
HS 14, 2nd floor, OMP1