Abstract: In the context of solving PDEs, scientific machine learning has recently seen notable success in so-called operator learning, which aims to learn mappings between infinite-dimensional function spaces. However, current state-of-the-art architectures suffer from aliasing errors. We introduce a new framework, Representation Equivalent Neural Operators, which eliminates aliasing by establishing a continuous-discrete equivalence without compromising expressivity.
In the second part of the talk, we address a statistical machine learning problem: learning conditional distributions. We propose an operator learning framework utilizing the signature transform, reformulating the task as a convex optimization problem with provable statistical consistency guarantees.
Operator learning for solving PDEs and nonparametric regression
04.12.2024 14:00 - 14:45
Organiser: SFB 65
Location: TU Wien, green area, second floor, seminar room DA 02 A, Wiedner Hauptstr. 8, 1040 Wien
Related files
- pde_afternoon_2024-12-04.pdf 924 KB