Speaker:
Professor Anders Karlsson, Université de Genève / Uppsala Universitet
Abstract:
Invertible real 2×2 matrices act by Möbius transformations on the upper half plane, preserving the hyperbolic distance. This generalizes to all dimensions and to many nonlinear contexts, for example to surface homeomorphisms, which induce isometries of Teichmüller spaces, and to deep learning. In joint work with F. Ledrappier and with S. Gouëzel, we established a multiplicative ergodic theorem that applies to such metric settings, containing Oseledets' theorem and Thurston's spectral theorem for surface homeomorphisms as special cases. Maps that do not increase distances also appear in machine learning, and random products of such maps arise in the training, initialization and regularization of deep neural networks. I will describe the metric approach and highlight a subadditive ergodic statement that is useful in its own right. Our approach is relevant for dynamics in infinite-dimensional spaces and for random dynamical systems, two settings that have recently been emphasized and advanced in works of L.-S. Young.
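The opening claim of the abstract can be checked numerically: a real 2×2 matrix with positive determinant acts on the upper half plane by z ↦ (az+b)/(cz+d), and the hyperbolic distance d(z,w) = arcosh(1 + |z−w|²/(2 Im z Im w)) is unchanged under this action. The sketch below is purely illustrative (the function names `mobius` and `hyp_dist` and the chosen matrix are not from the talk):

```python
import math

def mobius(M, z):
    # Möbius transformation induced by the real 2x2 matrix M = (a, b, c, d):
    # z -> (a z + b) / (c z + d)
    a, b, c, d = M
    return (a * z + b) / (c * z + d)

def hyp_dist(z, w):
    # Hyperbolic distance on the upper half plane:
    # d(z, w) = arcosh(1 + |z - w|^2 / (2 Im z Im w))
    return math.acosh(1 + abs(z - w) ** 2 / (2 * z.imag * w.imag))

# An invertible real matrix with det = 1 (example choice)
M = (2.0, 1.0, 1.0, 1.0)
z, w = 1 + 2j, -0.5 + 0.3j

d_before = hyp_dist(z, w)
d_after = hyp_dist(mobius(M, z), mobius(M, w))
# d_before and d_after agree up to floating-point error
```

The same invariance holds for any matrix with positive determinant, since rescaling a matrix does not change the Möbius transformation it induces.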