Approximation and learning with tree tensor networks

Anthony Nouy
Université de Nantes

Tree tensor networks (TTNs), or tree-based tensor formats, are prominent model classes for the approximation of high-dimensional functions in computational and data science.
In the first part of the talk, after an introduction to approximation tools based on tensorization and TTNs, we introduce their approximation classes and present results on their properties. In particular, we show that a wide range of smoothness (Besov) spaces are continuously embedded in TTN approximation classes. For such spaces, TTNs achieve the (near-)optimal rates usually achieved by classical approximation tools, but without requiring the tool to be adapted to the regularity of the function. The use of deep networks with freely chosen depth is shown to be essential for obtaining this property. It is also shown that exploiting the sparsity of tensors makes it possible to obtain the optimal rates achieved by classical nonlinear approximation tools, or to better exploit structured smoothness (anisotropic or mixed) in multivariate approximation.
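To make the tensorization idea concrete, here is a small illustrative sketch (not from the talk itself): a univariate function is sampled on a dyadic grid of [0, 1), the sample vector is reshaped into a tensor with one binary-digit mode per axis, and that tensor is compressed into a tensor train (a TTN with a linear dimension tree) by sequential SVDs. The TT-SVD routine and the truncation tolerance below are illustrative choices, not the talk's specific algorithm.

```python
import numpy as np

def tt_svd(a, tol=1e-10):
    """Sequential-SVD decomposition of array `a` into tensor-train cores.

    Each core has shape (r_{k-1}, n_k, r_k); singular values below
    tol * (largest singular value) are truncated at every step.
    """
    d, shape = a.ndim, a.shape
    cores, r = [], 1
    m = a
    for k in range(d - 1):
        m = m.reshape(r * shape[k], -1)          # k-th unfolding
        u, s, vt = np.linalg.svd(m, full_matrices=False)
        rk = max(1, int((s > tol * s[0]).sum())) # truncated rank
        cores.append(u[:, :rk].reshape(r, shape[k], rk))
        m = s[:rk, None] * vt[:rk]               # carry the remainder forward
        r = rk
    cores.append(m.reshape(r, shape[-1], 1))
    return cores

d = 8
x = np.arange(2 ** d) / 2 ** d       # samples of f(x) = x on a dyadic grid
t = x.reshape((2,) * d)              # tensorization: one binary digit per mode
cores = tt_svd(t)
print([c.shape for c in cores])      # bond ranks stay at 2 for this affine f
```

For the affine function f(x) = x, every unfolding of the tensorized samples has rank 2, so the number of parameters grows only linearly in the number of digits d, while the grid has 2^d points.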
We also show that the approximation classes of tensor networks are not contained in any Besov space. This again reveals the importance of depth and the potential of tensor networks for approximation and learning tasks involving functions beyond standard regularity classes.
In the second part, we discuss learning aspects in a statistical setting.
In an empirical risk minimization framework with a limited number of observations, the feature tensor space, the dimension tree and the ranks must be selected carefully to balance estimation and approximation errors. We present a model selection strategy (à la Barron, Birgé and Massart) with a suitable choice of penalty derived from complexity estimates for TTNs. In a least-squares setting, this approach is proved to be minimax adaptive over a wide range of smoothness classes.
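The penalized selection principle can be sketched in a much simpler toy setting (low-rank matrix denoising rather than TTNs): for each candidate model, the empirical risk is penalized by a term proportional to the model's number of parameters divided by the sample size, and the model minimizing the penalized risk is selected. The penalty constant below is an arbitrary illustrative choice, not the one derived from the complexity estimates in the paper.

```python
import numpy as np

# Toy penalized model selection: observe a low-rank matrix plus noise and
# select the rank r minimizing  empirical risk + penalty, with a penalty
# proportional to (#parameters of the rank-r model) / (#observations).
# All constants here are illustrative assumptions, not the paper's.
rng = np.random.default_rng(0)
p, r_true, sigma = 30, 3, 0.1
m = rng.standard_normal((p, r_true)) @ rng.standard_normal((r_true, p)) / np.sqrt(p)
y = m + sigma * rng.standard_normal((p, p))  # noisy observation, n = p*p samples
n = p * p

u, s, vt = np.linalg.svd(y)
best_score, best_r, best_fit = np.inf, 0, None
for r in range(1, p + 1):
    fit = (u[:, :r] * s[:r]) @ vt[:r]              # best rank-r fit (SVD truncation)
    risk = np.mean((y - fit) ** 2)                 # empirical least-squares risk
    pen = 3 * sigma ** 2 * r * (2 * p - r) / n     # complexity penalty (illustrative)
    if risk + pen < best_score:
        best_score, best_r, best_fit = risk + pen, r, fit
print("selected rank:", best_r)
```

The penalty grows with the rank while the empirical risk decreases, so the selected model trades off approximation error against estimation error; in the TTN setting the candidate models are indexed by feature space, dimension tree and rank tuple rather than a single rank.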

[1] M. Ali and A. Nouy. Approximation with Tensor Networks. Part I: Approximation Spaces. arXiv:2007.00118.
[2] M. Ali and A. Nouy. Approximation with Tensor Networks. Part II: Approximation Rates for Smoothness Classes. arXiv:2007.00128.
[3] M. Ali and A. Nouy. Approximation with Tensor Networks. Part III: Multivariate Approximation. arXiv:2101.11932.
[4] B. Michel and A. Nouy. Learning with Tree Tensor Networks: Complexity Estimates and Model Selection. arXiv:2007.01165.


Workshop I: Tensor Methods and their Applications in the Physical and Data Sciences