Workshop IV: Efficient Tensor Representations for Learning and Computational Complexity

May 17 - 21, 2021

Overview

Tensors are well-suited to capture higher-order correlations or complex relations in data. Unfortunately, the number of parameters describing a tensor scales exponentially with its order, so naive tensor estimation methods would require an impractically large number of samples. To counter this problem, a number of efficient tensor representations have been introduced, including low-rank decompositions, which capture latent structure, and tensor networks, which are tailored to quantum many-body systems with local interactions. The first emphasis of this workshop will be on the theory of recovering efficient tensor representations from empirical data, as studied, e.g., in the context of low-rank tensor completion or matrix-product state learning. We will focus on both algorithmic and statistical aspects.
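The parameter savings can be made concrete with a quick count: a dense order-d tensor with n entries per mode stores n^d numbers, while a rank-r CP (canonical polyadic) decomposition stores only r rank-one terms, each a product of d length-n vectors. The specific values of n, d, and r below are illustrative assumptions, not taken from the text.

```python
# Parameter count for a dense order-d tensor vs. a rank-r CP
# decomposition. n, d, r are illustrative choices.
n, d, r = 10, 5, 3

dense_params = n ** d   # every entry stored explicitly: n^d
cp_params = r * d * n   # r rank-one terms, each d factor vectors of length n

print(dense_params)  # 100000
print(cp_params)     # 150
```

The gap widens rapidly with the order d, which is why low-rank structure is essential for sample-efficient tensor estimation.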

In addition to describing data, tensors can also represent computational problems, such as the problem of multiplying large matrices or the evaluation of permanents. In this context, low-rank decompositions correspond to efficient algorithms, while the non-existence of such decompositions amounts to lower bounds. The second emphasis of the workshop will thus be on applications of efficient tensor representations to theoretical computer science, particularly computational complexity theory. A closely related area that will also be covered is the resource theory of tensors in quantum information theory.
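The correspondence between low-rank decompositions and efficient algorithms is classically illustrated by Strassen's algorithm: the 2x2 matrix multiplication tensor has rank at most 7, so two 2x2 matrices can be multiplied with 7 scalar multiplications instead of the naive 8. A minimal sketch, checked against naive multiplication (the input matrices are arbitrary illustrative values):

```python
# Strassen's rank-7 decomposition of the 2x2 matrix multiplication
# tensor: 7 multiplications replace the naive 8.
def strassen_2x2(A, B):
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B
    m1 = (a11 + a22) * (b11 + b22)
    m2 = (a21 + a22) * b11
    m3 = a11 * (b12 - b22)
    m4 = a22 * (b21 - b11)
    m5 = (a11 + a12) * b22
    m6 = (a21 - a11) * (b11 + b12)
    m7 = (a12 - a22) * (b21 + b22)
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4, m1 - m2 + m3 + m6]]

def naive_2x2(A, B):
    # Standard definition: 8 multiplications.
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
assert strassen_2x2(A, B) == naive_2x2(A, B)  # [[19, 22], [43, 50]]
```

Applied recursively to block matrices, this rank bound yields an O(n^log2(7)) multiplication algorithm; conversely, lower bounds on the rank of the matrix multiplication tensor bound how fast any bilinear algorithm can be.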

This workshop will include a poster session; a request for posters will be sent to registered participants in advance of the workshop.

Organizing Committee

Anima Anandkumar (California Institute of Technology)
Fernando Brandao (California Institute of Technology)
Rong Ge (Duke University)
David Gross (Universität zu Köln)
Michael Walter (Universiteit van Amsterdam)
Ming Yuan (Columbia University)