Universal sparsity of deep ReLU neural networks

09.05.2019 15:00 - 16:30

Dennis Elbrächter (Univ. Wien)

Abstract:

We introduce (or assimilate) a number of key concepts which allow us to compare neural networks to classical representation systems (e.g. wavelets, shearlets, and Gabor systems, or more generally any system generated from a mother function through translations, dilations, and modulations). This enables us to establish that any function class is (asymptotically) at least as sparse with respect to ReLU neural networks as it is in any 'reasonable' classical representation system.
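One way to make the comparison precise (the notation below is our illustration of the standard notion of best M-term approximation, not part of the announcement): given a dictionary $\mathcal{D} = (\varphi_i)_{i \in \mathbb{N}}$ in $L^2$, the best $M$-term approximation error of $f$ is

$\Gamma_M^{\mathcal{D}}(f) := \inf_{I \subset \mathbb{N},\, |I| = M,\ (c_i)_{i \in I} \subset \mathbb{R}} \Bigl\| f - \sum_{i \in I} c_i \varphi_i \Bigr\|_{L^2},$

and a function class $\mathcal{C}$ has sparsity rate $\gamma^*(\mathcal{C}, \mathcal{D}) := \sup\bigl\{ \gamma > 0 : \sup_{f \in \mathcal{C}} \Gamma_M^{\mathcal{D}}(f) = O(M^{-\gamma}),\ M \to \infty \bigr\}$. With ReLU networks of $M$ nonzero weights playing the role of $M$-term expansions, the abstract's claim can then be read as: the network approximation rate of $\mathcal{C}$ is at least $\gamma^*(\mathcal{C}, \mathcal{D})$ for every 'reasonable' system $\mathcal{D}$.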

Organiser:

KH. Gröchenig

Location:

SR 10, 2nd floor, OMP 1