Deep Learning in High Dimension: Neural Network Approximation of Analytic Maps of Gaussians

01.12.2021 18:00 - 19:00

Christoph Schwab (ETH Zürich)

Abstract: For artificial deep neural networks with ReLU activation, we prove new expression rate bounds for parametric, analytic functions whose parameter dimension may be infinite.
Approximation rates are in mean square on the unbounded parameter range with respect to a product Gaussian measure. The approximation rate bounds are free from the curse of dimensionality (CoD) and are determined by the summability of the Wiener-Hermite polynomial chaos (PC) expansion coefficients. Sufficient conditions for summability are quantified holomorphy on products of strips in the complex domain. Applications comprise DNN expression rate bounds for response surfaces of elliptic PDEs with log-Gaussian random field inputs, and for the posterior densities of the corresponding Bayesian inverse problems.
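
For orientation, a schematic form of the expansion behind these rates (a standard formulation, not quoted from the paper): with y = (y_j)_{j >= 1} the Gaussian parameter, \gamma the product Gaussian measure, (H_k) the normalized probabilists' Hermite polynomials, and \mathcal{F} the set of finitely supported multi-indices,

  u(y) = \sum_{\nu \in \mathcal{F}} u_\nu H_\nu(y), \qquad
  H_\nu(y) = \prod_{j \ge 1} H_{\nu_j}(y_j), \qquad
  \|u\|_{L^2(\gamma)}^2 = \sum_{\nu \in \mathcal{F}} |u_\nu|^2.

If (|u_\nu|)_{\nu \in \mathcal{F}} \in \ell^p(\mathcal{F}) for some 0 < p < 1, Stechkin's lemma gives a dimension-independent best N-term truncation rate,

  \inf_{\Lambda \subset \mathcal{F},\, |\Lambda| = N}
  \Big\| u - \sum_{\nu \in \Lambda} u_\nu H_\nu \Big\|_{L^2(\gamma)}
  \le C \, N^{-(1/p - 1/2)},

which is the sense in which summability of the PC coefficients governs CoD-free rates; the precise DNN statements are in the reference below.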
Constructive variants of the proofs are outlined.
(joint work with Jakob Zech, University of Heidelberg, Germany, and with Dinh Dung and Nguyen Van Kien, Hanoi, Vietnam)
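
As a purely illustrative complement (not the construction from the talk): a minimal PyTorch sketch that fits a ReLU network to an analytic map of a finite-dimensional Gaussian parameter and estimates the mean-square error under the product Gaussian measure by Monte Carlo. The target map, network widths, and training schedule are assumptions chosen for the demo.

  # Minimal sketch, assuming a finite parameter dimension d and an
  # illustrative analytic target (stand-in for a PDE response surface).
  import torch

  torch.manual_seed(0)
  d = 4
  # Hypothetical analytic map of a Gaussian input, e.g. a log-Gaussian response.
  target = lambda y: torch.exp(0.1 * y.sum(dim=1, keepdim=True))

  model = torch.nn.Sequential(
      torch.nn.Linear(d, 64), torch.nn.ReLU(),
      torch.nn.Linear(64, 64), torch.nn.ReLU(),
      torch.nn.Linear(64, 1),
  )
  opt = torch.optim.Adam(model.parameters(), lr=1e-3)

  for step in range(2000):
      y = torch.randn(256, d)  # samples from the product Gaussian measure
      loss = torch.mean((model(y) - target(y)) ** 2)  # empirical mean-square error
      opt.zero_grad()
      loss.backward()
      opt.step()

  with torch.no_grad():
      y = torch.randn(100_000, d)  # fresh samples for the error estimate
      mse = torch.mean((model(y) - target(y)) ** 2).item()
  print(f"Monte Carlo estimate of the squared L^2(gamma) error: {mse:.3e}")

The final Monte Carlo average over fresh standard-normal samples is the empirical counterpart of the mean-square error with respect to the product Gaussian measure discussed in the abstract.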

References: https://math.ethz.ch/sam/research/reports.html?id=982

https://zoom.us/j/93911737384

Organiser:
P. Petersen (U Wien)
Location:
Zoom Meeting