
Computational and Applied Mathematics: Erik Bollt, Clarkson University, Next-Generation Reservoir Computing, and On Explaining the Surprising Success of a Random Neural Network for Forecasting Chaos

September 20 | 12:45 pm - 1:45 pm EDT

Machine learning has become a widely popular and successful paradigm, including for data-driven science. A major application is forecasting complex dynamical systems. Artificial neural networks (ANN) have emerged as a clear leading approach, and recurrent neural networks (RNN) are considered especially well suited. Reservoir computers (RC) have gained attention for their simplicity and computational advantages: instead of training a full network, an RC trains only the read-out weights. The surprise, however, is why and how an RC works at all despite its randomly selected internal weights. We explicitly connect an RC with linear activation and linear read-out to the well-developed time-series literature on vector autoregression (VAR), which already performs well for short-term forecasts; the existence of such a representation then follows from the Wold theorem. Even better, an RC with a random network, linear activation, and polynomial read-out connects explicitly to a nonlinear VAR (NVAR). This leads us to introduce a new, high-performing data-driven forecasting method that we call next-generation reservoir computing (NG-RC). Further, we connect this random neural network approach to the now widely popular dynamic mode decomposition (DMD). Thus, these three methods are, in a sense, different faces of the same concept. Several examples will be shown.
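To make the NVAR connection concrete, below is a minimal sketch, in Python with NumPy, of an NG-RC-style forecaster in the spirit described above: features are built from a few time-delayed copies of the state plus their quadratic products, and only a linear read-out is fit, here by ridge regression. The Lorenz system, the delay depth `k`, and the ridge parameter are illustrative choices for this sketch, not details taken from the talk.

```python
import numpy as np

def lorenz_trajectory(n_steps, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate the Lorenz system with a basic RK4 scheme (toy data source)."""
    def f(v):
        x, y, z = v
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
    traj = np.empty((n_steps, 3))
    v = np.array([1.0, 1.0, 1.0])
    for i in range(n_steps):
        k1 = f(v); k2 = f(v + 0.5 * dt * k1)
        k3 = f(v + 0.5 * dt * k2); k4 = f(v + dt * k3)
        v = v + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
        traj[i] = v
    return traj

def nvar_features(data, k):
    """NVAR feature map: k delayed copies of the state (linear part),
    all their pairwise products (quadratic part), and a constant term."""
    n, d = data.shape
    # Row i holds [x_t, x_{t-1}, ..., x_{t-k+1}] flattened, with t = k-1+i.
    lin = np.hstack([data[k - 1 - j : n - j] for j in range(k)])
    quad = np.stack([np.outer(r, r)[np.triu_indices(k * d)] for r in lin])
    const = np.ones((lin.shape[0], 1))
    return np.hstack([const, lin, quad])

k = 2                                    # number of time delays (illustrative)
data = lorenz_trajectory(7000)[1000:]    # discard the transient
train, test = data[:-200], data[-200:]

# Learn the one-step map x_t -> x_{t+1}: only the read-out W_out is trained.
X = nvar_features(train[:-1], k)
Y = train[k:]
ridge = 1e-4                             # regularization strength (illustrative)
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ Y)

def forecast(history, steps):
    """Closed-loop prediction: feed each forecast back in as a new input."""
    hist = [np.asarray(h, dtype=float) for h in history]
    preds = []
    for _ in range(steps):
        recent = np.array(hist[-k:])             # last k states, oldest first
        row = nvar_features(recent, k)[0]        # single feature row
        nxt = row @ W_out
        preds.append(nxt)
        hist.append(nxt)
    return np.array(preds)

preds = forecast(train[-k:], len(test))
err = np.abs(preds[:50] - test[:50]).mean()
print(f"mean absolute error over the first 50 forecast steps: {err:.3f}")
```

Because the recurrent reservoir is replaced here by explicit delay-and-product features and only W_out is fit with a single linear solve, the sketch mirrors the computational simplicity the abstract attributes to RC and NG-RC; it is only an illustration of the idea, not the speaker's implementation.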

Details

Date:
September 20
Time:
12:45 pm - 1:45 pm EDT

Venue

SAS 4201