Dynamic learning for stochastic processes: neural networks, reservoir computing systems and applications to mathematical finance

19.01.2021 14:50 - 15:35

Lukas Gonon (LMU München)

Abstract: In this talk I will present recent mathematical results on several dynamic learning techniques for stochastic processes, and elaborate on the central role that Rademacher complexities play in the proofs of these results.

The talk begins with a discussion of our joint research on deep neural network expression rates for option prices in high-dimensional exponential Lévy models. We show that, under mild growth conditions on the Lévy triplets, deep neural networks are able to approximate option prices without the curse of dimensionality. The talk then continues with results from our joint research on approximation and learning based on random recurrent neural networks, in which the recurrent weights are randomly generated and left untrained, and only a linear readout is learned. We obtain high-probability bounds on the approximation error in terms of the network parameters, as well as generalization error bounds for weakly dependent input data. Notably, these results imply a universal approximation theorem for random recurrent neural networks. In closing, I will outline further research directions.
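The random recurrent networks discussed above are closely related to reservoir computing. As a rough illustration of the idea (not the construction analyzed in the talk), the following sketch drives a recurrent network with random, fixed weights and trains only a linear readout by least squares; the reservoir size, spectral-radius scaling, and the toy one-step-ahead prediction task are all illustrative choices.

```python
import numpy as np

# Random recurrent network (echo state network style): the recurrent and
# input weights below are drawn once at random and never trained.
rng = np.random.default_rng(0)
n_res, n_in = 100, 1

W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius below 1
W_in = rng.normal(scale=0.5, size=(n_res, n_in))

def reservoir_states(inputs):
    """Drive the random recurrent network and collect its hidden states."""
    h = np.zeros(n_res)
    states = []
    for u in inputs:
        h = np.tanh(W @ h + W_in @ np.atleast_1d(u))
        states.append(h.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine signal.
t = np.arange(500)
u = np.sin(0.1 * t)
X = reservoir_states(u[:-1])   # states driven by u_0, ..., u_{T-2}
y = u[1:]                      # targets u_1, ..., u_{T-1}

washout = 50                   # discard the initial transient
W_out, *_ = np.linalg.lstsq(X[washout:], y[washout:], rcond=None)

pred = X[washout:] @ W_out     # only this linear readout was trained
mse = np.mean((pred - y[washout:]) ** 2)
```

Because only the readout is trained, learning reduces to a linear regression on the reservoir states, which is what makes approximation and generalization bounds for such systems tractable.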

Faculty of Mathematics
Online via Zoom