Projection-Efficient First-Order Methods for Convex Low-Rank Matrix Optimization

13.12.2021 15:30 - 13.12.2021 16:30

Dan Garber (Technion Haifa)

Abstract: Optimization problems in which the goal is to recover a low-rank matrix from given
measurements/data capture many important applications in data science and machine learning, such as the
celebrated matrix completion problem and robust principal component analysis. While these optimization
problems are in general NP-hard, convex relaxations based on trace-norm regularization have proven to be a
successful approach in many cases, both from the standpoint of statistical theory and empirically, and have
received significant research interest over the past decade. However, solving these relaxations at large scale is
often considered practically intractable, since most first-order methods cannot avoid storing high-rank matrices
and/or computing high-rank spectral decompositions (e.g., for projecting onto the trace-norm ball) during the
optimization process.
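
For intuition, here is a minimal NumPy sketch (the function names are mine, not from the talk) of the standard Euclidean projection onto the trace-norm (nuclear-norm) ball of radius tau: a full SVD followed by projection of the singular values onto the l1-ball. The full SVD is exactly the kind of expensive high-rank spectral decomposition the abstract refers to.

```python
import numpy as np

def project_singular_values(s, tau):
    # Euclidean projection of the (nonnegative) singular values s onto the
    # l1-ball of radius tau, via the sorting method of Duchi et al. (2008);
    # it amounts to soft-thresholding: s_i -> max(s_i - theta, 0).
    if s.sum() <= tau:
        return s
    u = np.sort(s)[::-1]
    css = np.cumsum(u)
    ks = np.arange(1, len(u) + 1)
    k = np.nonzero(u - (css - tau) / ks > 0)[0][-1]  # last valid index
    theta = (css[k] - tau) / (k + 1)
    return np.maximum(s - theta, 0.0)

def project_trace_norm_ball(X, tau):
    # The full SVD below is the large-scale bottleneck: it costs
    # O(m * n * min(m, n)) time and stores a high-rank factorization.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * project_singular_values(s, tau)) @ Vt
```

Note that the soft-thresholding zeroes out the small singular values, so if the projected point is known in advance to have rank at most r, only the top r+1 singular values/vectors of X are actually needed; guaranteeing this kind of low-rank structure along the iterates is precisely what makes a method projection-efficient.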
In this talk I will try to (partially) answer the question: when are first-order methods for such relaxations
guaranteed to store/manipulate only low-rank matrices? In particular, I will show that a (generalized) strict
complementarity condition implies that, at least locally, popular first-order methods indeed converge (often
with their original rate guarantees) while storing/manipulating only low-rank matrices.
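
For concreteness, the following is a hedged sketch (my formulation, not necessarily the talk's exact generalized condition) of what strict complementarity typically means for the trace-norm-ball problem:

```latex
% Problem: \min_{\|X\|_* \le \tau} f(X), with an optimal solution X^* of
% rank r lying on the boundary of the ball. KKT optimality with multiplier
% \lambda > 0 gives -\nabla f(X^*) \in \lambda \, \partial \|X^*\|_*, which
% forces the top r singular values of \nabla f(X^*) to equal \lambda.
% Strict complementarity requires a strict gap right after them:
\[
  \sigma_{r+1}\bigl(\nabla f(X^*)\bigr) \;<\;
  \sigma_{1}\bigl(\nabla f(X^*)\bigr) \;=\; \lambda .
\]
```

Roughly speaking, such a spectral gap in the gradient at the optimum is what allows iterates near X^* to retain rank-r structure, so that low-rank (rather than full) spectral computations suffice.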

Organisers:
R. I. Boț (U Wien), S. Sabach (Technion - Israel Institute of Technology Haifa), M. Staudigl (Maastricht U)
Location:
Zoom Meeting