Many dynamical problems in engineering (including financial engineering), control theory, signal processing, time series analysis, and forecasting can be described using input/output (IO) systems. State-space systems are known to provide a parsimonious and computationally efficient way to model the relation between time-evolving explanatory variables (the input) and a collection of dependent or explained variables of interest (the output). Whenever a true functional IO relation cannot be derived from first principles, various classes of state-space systems can be used as universal approximants. We shall show that particular families of such state-space systems, the so-called Reservoir Computing (RC) systems, with extremely simple and easy-to-implement architectures, enjoy universal approximation properties that have been proved in different setups. The defining feature of RC systems is that some of their components (usually the state map) are randomly generated and that the observation equation has an easily tractable form. From the machine learning perspective, RC systems can be seen as recurrent neural networks with randomly generated, non-trainable weights and a simple-to-train readout layer (often a linear map). RC systems serve as efficient, randomized, online computational tools for dynamic processes and enjoy generalization properties that can be derived explicitly. We will give a general introduction to up-to-date theoretical developments, discuss connections with research contributions in other fields, and address details of applications of RC systems to data processing.
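The RC architecture described above — a randomly generated, non-trainable state map combined with a trainable linear readout — can be illustrated with a minimal echo state network sketch in NumPy. All dimensions, scalings, and the toy one-step-ahead sine prediction task below are illustrative choices, not part of the talk itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Randomly generated reservoir (not trained) ---
n_in, n_res = 1, 200
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))   # random input weights
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))     # random recurrent weights
W *= 0.9 / max(abs(np.linalg.eigvals(W)))           # rescale spectral radius below 1

def run_reservoir(u):
    """Drive the fixed random state map with input sequence u and collect states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
        states.append(x)
    return np.array(states)

# --- Toy IO task: one-step-ahead prediction of a sine wave ---
t = np.arange(600) * 0.1
u = np.sin(t)
X = run_reservoir(u[:-1])   # reservoir states (explanatory variables)
y = u[1:]                   # targets (explained variable)

washout = 100               # discard the initial transient
X, y = X[washout:], y[washout:]

# --- Only the linear readout is trained (ridge regression) ---
ridge = 1e-8
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

pred = X @ W_out
print("train RMSE:", np.sqrt(np.mean((pred - y) ** 2)))
```

Only `W_out` is fitted, and fitting it reduces to a linear least-squares problem — this is the sense in which RC systems are "simple to train" compared to fully trained recurrent networks.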
Learning Dynamic Processes with Reservoir Computing
Date:
21.01.2021, 14:50 - 15:35
Organiser:
Fakultät für Mathematik
Location:
Online via Zoom