Functional Analysis of Neural Networks

Researchers: Yury Korolev, Carola-Bibiane Schönlieb

Universal approximation properties of various types of neural networks have been known since the late 1980s. However, it has also been shown that the approximation rates, expressed in terms of the number of neurons, scale exponentially with the dimension of the input space. Certain types of functions, on the other hand, can be approximated at dimension-independent Monte-Carlo rates. The functional-analytic study of the spaces of such functions has recently become an active area of research.
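
For instance, for functions in a Barron-type space, a shallow network f_n with n neurons can achieve a bound of the schematic form (constants and technical assumptions omitted, in the spirit of Barron's classical result)

    \| f - f_n \|_{L^2(\mu)} \le \frac{C_f}{\sqrt{n}},

where C_f is a norm of f in the corresponding function space and \mu is the data distribution; the rate n^{-1/2} is the Monte-Carlo rate and does not depend on the input dimension.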

The existence of dimension-independent rates suggests that similar rates can also be obtained in the infinite-dimensional setting, that is, when both the input space and the output space are infinite-dimensional Hilbert or Banach spaces. For “classical” machine learning methods such as random feature models (a particular instance of kernel methods), these questions have been studied since the mid-2000s. For neural networks, even for “shallow” networks with a single hidden layer, such results seem to be emerging only now.
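
To make the distinction concrete, the following is a minimal sketch of a random feature model (in NumPy; all names and parameters are illustrative, not part of the project): the inner weights are sampled once and frozen, and only the outer linear coefficients are fitted, which is what makes the model a kernel-type method rather than a genuine neural network.

    import numpy as np

    def random_feature_regression(X_train, y_train, X_test,
                                  n_features=500, reg=1e-6, seed=0):
        """Illustrative random feature model (hypothetical helper, not project code)."""
        rng = np.random.default_rng(seed)
        d = X_train.shape[1]
        W = rng.standard_normal((d, n_features))        # random inner weights, never trained
        b = rng.uniform(0.0, 2.0 * np.pi, n_features)   # random biases, never trained

        def features(X):
            return np.cos(X @ W + b)                    # random Fourier-type features

        F = features(X_train)
        # only the outer linear layer is fitted (ridge-regularised least squares)
        a = np.linalg.solve(F.T @ F + reg * np.eye(n_features), F.T @ y_train)
        return features(X_test) @ a

Training the inner weights W and b as well, instead of freezing them, would turn the same architecture into a shallow neural network with one hidden layer.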

The motivation for studying neural networks as nonlinear operators between infinite-dimensional spaces, rather than as functions between (possibly high-dimensional) Euclidean spaces, comes from inherently infinite-dimensional applications such as inverse problems and imaging. The generalisation from high but finite dimension to infinite dimension is far from trivial and often requires advanced functional-analytic techniques.

The goal of this project is to advance the understanding of neural networks in the infinite-dimensional setting and to use this understanding to construct more stable and efficient numerical algorithms.
