Tighter Generalization Bounds on Digital Computers via Discrete Optimal Transport

12.06.2024 10:00 - 11:30

Martina Neuman (University of Vienna)

Abstract:

Machine learning models with inputs in a d-dimensional Euclidean space, when implemented on digital computers, generalize, and their generalization gap converges to 0 at a rate of c/N^{1/2} in the sample size N. However, the constant c>0 obtained through classical methods can be large in the ambient dimension d and the machine precision, which is problematic when N ranges from small to realistically large. In this paper, we derive a family of generalization bounds c_m/N^{1/(2\vee m)} tailored to learning models on digital computers, which adapts to both the sample size N and the so-called geometric representation dimension of the discrete learning problem.
Adjusting the parameter m to N yields significantly tighter generalization bounds for practical sample sizes N, while choosing m small preserves the optimal dimension-free worst-case rate of O(1/N^{1/2}).

Furthermore, our adaptive generalization bounds build on our new non-asymptotic result on concentration of measure in discrete optimal transport, which we establish via metric embedding arguments.
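
As a purely illustrative sketch (the constants below are hypothetical and not taken from the paper), the following Python snippet compares the classical bound c/N^{1/2} with the adaptive family c_m/N^{1/(2\vee m)}: when c_m shrinks quickly with m, a larger m can give a tighter bound at practical sample sizes even though its asymptotic rate is slower.

```python
# Illustrative sketch only: the constants c and c_m are hypothetical,
# chosen to mimic a classical constant that is large in d and in the
# machine precision versus adaptive constants that shrink with m.
import math

c = 1.0e6                              # hypothetical classical constant
c_m = {2: 1.0e6, 4: 1.0e2, 8: 5.0}     # hypothetical adaptive constants

def classical_bound(N):
    # classical rate c / N^{1/2}
    return c / math.sqrt(N)

def adaptive_bound(N, m):
    # adaptive rate c_m / N^{1/(2 v m)}, i.e. exponent 1/max(2, m)
    return c_m[m] / N ** (1 / max(2, m))

for N in (10**3, 10**6, 10**9, 10**12):
    best_m = min(c_m, key=lambda m: adaptive_bound(N, m))
    print(f"N={N:>13,}  classical={classical_bound(N):10.3g}  "
          f"best adaptive (m={best_m})={adaptive_bound(N, best_m):10.3g}")
```

With these hypothetical constants, the best choice of m shifts as N grows, which is the adaptation the abstract describes: larger m wins for small-to-moderate N, while small m recovers the O(1/N^{1/2}) worst-case rate asymptotically.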

Organisers:
M. Neuman, S. Schmutzhard-Hoefler
Location:
SR 12, Kolingasse 14