LMA - Laboratoire de Mécanique et d’Acoustique

A. Nouy - Approximation with tensor networks


March 16, 2021, from 11:00 to 12:00

Anthony Nouy, professor at Centrale Nantes / Laboratoire Jean Leray

Tensor networks (TNs) are prominent model classes for the approximation of high-dimensional functions in computational and data science. Tree-based TNs, also known as tree-based tensor formats, can be seen as particular feed-forward neural networks.
After an introduction to approximation tools based on tree TNs, we introduce their approximation classes and present some recent results on their properties.
In particular, we show that classical smoothness (Besov) spaces are continuously embedded in TN approximation classes. For such spaces, TNs achieve the (near-)optimal rates usually achieved by classical approximation tools, but without requiring the tool to be adapted to the regularity of the function. The use of deep networks with unconstrained depth is shown to be essential for obtaining this property. We also show that exploiting the sparsity of tensors allows one to obtain the optimal rates achieved by classical nonlinear approximation tools, or to better exploit structured smoothness (anisotropic or mixed) for multivariate approximation.
We also show that approximation classes of tensor networks are not contained in any Besov space, unless one restricts the depth of the tensor network. This again reveals the importance of depth and the potential of tensor networks to achieve approximation or learning tasks for functions beyond standard regularity classes.
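As a rough illustration of the structural analogy above (not part of the talk itself), the simplest tree-based tensor format, the tensor train, represents a multivariate function by a chain of small "cores", and evaluating it amounts to a sequence of matrix products, much like the layers of a feed-forward network. The following sketch uses random, hypothetical cores purely to show the mechanics and the storage savings:

```python
# Minimal sketch of a tensor-train (TT) format, the simplest tree-based
# tensor format: a tensor f(i1,...,id) = G1[i1] @ G2[i2] @ ... @ Gd[id],
# where each Gk[ik] is a small matrix. Cores here are random placeholders.
import numpy as np

rng = np.random.default_rng(0)

d, n, r = 4, 5, 3                     # d variables, n grid points each, rank r
ranks = [1] + [r] * (d - 1) + [1]     # boundary ranks are 1

# Core G_k has shape (r_{k-1}, n, r_k).
cores = [rng.standard_normal((ranks[k], n, ranks[k + 1])) for k in range(d)]

def tt_eval(cores, idx):
    """Evaluate the TT-represented tensor at the multi-index idx
    by contracting the cores left to right, layer by layer."""
    v = cores[0][:, idx[0], :]        # 1 x r_1 matrix
    for G, i in zip(cores[1:], idx[1:]):
        v = v @ G[:, i, :]            # contract the shared rank index
    return v.item()                   # final shape is 1 x 1

value = tt_eval(cores, (0, 1, 2, 3))

# Storage: roughly d * n * r^2 parameters instead of n^d full-tensor entries.
n_params = sum(c.size for c in cores)
```

The rank parameters play the role of layer widths: increasing them enlarges the model class, while the tree structure fixes the connectivity of the "network".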

References:

  1. M. Ali and A. Nouy. Approximation with Tensor Networks. Part I: Approximation Spaces. arXiv:2007.00118
  2. M. Ali and A. Nouy. Approximation with Tensor Networks. Part II: Approximation Rates for Smoothness Classes. arXiv:2007.00128
  3. M. Ali and A. Nouy. Approximation with Tensor Networks. Part III: Multivariate Approximation. arXiv:2101.11932
  4. B. Michel and A. Nouy. Learning with tree tensor networks: complexity estimates and model selection. arXiv:2007.01165

See online: more details about the speaker