Abstract: A core challenge in mathematical data science is to understand and leverage the intrinsic structures of data sets. With reference to my own research, I will describe how, over the last decade, the focus has shifted from explicit structural regularization in inverse problems and related fields to implicit regularization in massively overparametrized machine learning models. I will furthermore discuss the effect of coarse quantization, i.e., the representation of real numbers by a finite alphabet of small size, on established results. The parts of my work that will serve as illustration encompass compressed sensing from quantized measurements, covariance estimation from one-bit samples, and the implicit bias of gradient descent in matrix factorization.
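To illustrate the notion of coarse quantization mentioned in the abstract, the following is a minimal sketch (not taken from the talk; the Gaussian measurement matrix and the 5-sparse signal are illustrative assumptions) of one-bit measurements, where real-valued linear observations are reduced to their signs, i.e., a two-letter alphabet {-1, +1}:

```python
import numpy as np

rng = np.random.default_rng(0)

m, n = 200, 50                      # number of measurements, ambient dimension
A = rng.standard_normal((m, n))     # Gaussian measurement matrix (illustrative choice)
x = np.zeros(n)
x[:5] = rng.standard_normal(5)      # a 5-sparse signal (illustrative choice)

y_real = A @ x                      # unquantized linear measurements
y_onebit = np.sign(y_real)          # coarse (one-bit) quantization: keep only the signs

# Each measurement now carries a single bit of information about x.
print(y_onebit[:10])
```

Recovering the sparse signal (up to scale) from such sign measurements is the subject of one-bit compressed sensing; the sketch above only shows the quantization step itself.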
Explicit regularization, implicit bias, and the effect of quantization
11.03.2022 10:50 - 11:35
Organiser:
R. I. Boţ
Location: