Circumventing the curse of dimensionality with deep neural networks

12.10.2022 18:00 - 19:00

Sophie Langer (University of Twente)

Abstract: Although the application of deep neural networks to real-world problems has become ubiquitous, the question of why they are so effective has not yet been satisfactorily answered. However, some progress has been made in establishing an underlying mathematical foundation. This talk surveys results on statistical risk bounds for deep neural networks. In particular, we focus on the question of when neural networks can circumvent the curse of dimensionality, discussing results for vanilla feedforward and convolutional neural networks in both regression and classification settings.
This talk is based on several joint works with Alina Braun, Michael Kohler, Adam Krzyzak, Johannes Schmidt-Hieber and Harro Walk.
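For orientation, the contrast alluded to in the abstract is often phrased via classical nonparametric regression rates; the display below is an illustrative sketch in standard notation (p denotes smoothness, d the ambient input dimension, and p*, d* the smoothness and effective dimension under a hierarchical composition assumption), not a statement of the talk's specific results. For (p,C)-smooth regression functions on R^d the minimax L2 rate is

\[
  n^{-\frac{2p}{2p+d}},
\]

which deteriorates rapidly as d grows (the curse of dimensionality), whereas under a hierarchical composition assumption neural network estimates can attain rates of the form

\[
  n^{-\frac{2p^{*}}{2p^{*}+d^{*}}}, \qquad d^{*} \ll d,
\]

so that the rate is governed by the typically small effective dimension of the component functions rather than by the ambient input dimension.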

Zoom Link: see website.

Organiser: Philipp Petersen (Universität Wien)

Location: Zoom Meeting