Randomized Algorithms for Rounding and Rank Compression in the Tensor Train Format

Paul Cazeaux
Virginia Polytechnic Institute and State University
Department of Mathematics

The Tensor-Train (TT) or Matrix-Product-States (MPS) format stands out as a highly compact, low-rank representation for high-dimensional tensors, with numerous applications including the computation of many-body ground states with limited entanglement in spin models arising in condensed matter physics and quantum chemistry. The efficiency of this format hinges on the rounding operation, which trims the internal ranks of a tensor to keep computational times and memory usage feasible.
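To fix ideas, the following is a minimal, self-contained sketch of the standard deterministic TT rounding procedure (the baseline the talk improves on, not its randomized algorithms), assuming a TT tensor is stored as a list of NumPy arrays of shape (r_{k-1}, n_k, r_k) with boundary ranks r_0 = r_d = 1: a left-to-right QR orthogonalization sweep followed by a right-to-left SVD truncation sweep.

```python
import numpy as np

def tt_full(cores):
    """Contract a list of TT cores (shapes (r_{k-1}, n_k, r_k)) into a full tensor."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=(out.ndim - 1, 0))
    return out[0, ..., 0]  # strip the boundary ranks r_0 = r_d = 1

def tt_round(cores, tol=1e-8):
    """Deterministic TT rounding: left-to-right QR orthogonalization,
    then right-to-left SVD truncation with relative threshold `tol`."""
    cores = [c.copy() for c in cores]
    d = len(cores)
    for k in range(d - 1):                        # orthogonalization sweep
        r0, n, r1 = cores[k].shape
        Q, R = np.linalg.qr(cores[k].reshape(r0 * n, r1))
        cores[k] = Q.reshape(r0, n, -1)
        cores[k + 1] = np.tensordot(R, cores[k + 1], axes=(1, 0))
    for k in range(d - 1, 0, -1):                 # truncation sweep
        r0, n, r1 = cores[k].shape
        U, s, Vt = np.linalg.svd(cores[k].reshape(r0, n * r1), full_matrices=False)
        rank = max(1, int(np.sum(s > tol * s[0])))
        cores[k] = Vt[:rank].reshape(rank, n, r1)
        cores[k - 1] = np.tensordot(cores[k - 1], U[:, :rank] * s[:rank], axes=(2, 0))
    return cores
```

For example, representing x + x in TT format doubles every internal rank; rounding the sum recovers the original ranks while reproducing the tensor 2x to machine precision.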
In this talk, we introduce a suite of randomized algorithms specifically designed for the TT rounding task, building upon and expanding existing randomized low-rank matrix approximation methodologies. These algorithms not only preserve the integrity of the TT format but also provide a considerable computational advantage, particularly when rounding a sum of TT-tensors or the result of a matrix-vector product, a frequent bottleneck in numerous applications. We achieve speedups of up to 20×, markedly boosting the performance of classical iterative Krylov methods such as GMRES and Lanczos when applied to vectors in TT format. This enhancement makes them viable alternatives to Alternating Least-Squares (ALS) or the Density Matrix Renormalization Group (DMRG) in certain scenarios. We present and discuss the randomized algorithms, providing a comparative analysis of their empirical accuracy and computational efficiency against deterministic counterparts.
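To convey the flavor of such methods, here is a hypothetical sketch of one way the randomized idea can be instantiated (an illustrative assumption, not necessarily the talk's exact algorithms): the TT tensor is contracted against a random Gaussian TT tensor with the target ranks to form small sketch matrices, and a single left-to-right QR sweep then compresses the cores, avoiding the SVDs of the deterministic procedure. The function names and the specific sweep order are assumptions for illustration.

```python
import numpy as np

def tt_full(cores):
    """Contract a list of TT cores (shapes (r_{k-1}, n_k, r_k)) into a full tensor."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=(out.ndim - 1, 0))
    return out[0, ..., 0]

def tt_round_random(cores, target_ranks, seed=None):
    """Illustrative randomized TT rounding: sketch against a random Gaussian
    TT tensor with the prescribed target ranks, then one left-to-right QR sweep."""
    rng = np.random.default_rng(seed)
    cores = [c.copy() for c in cores]
    d = len(cores)
    ell = [1] + list(target_ranks) + [1]          # target ranks, with ell_0 = ell_d = 1
    rand_cores = [rng.standard_normal((ell[k], c.shape[1], ell[k + 1]))
                  for k, c in enumerate(cores)]
    # Partial contractions W[k] of cores k..d-1 against the random cores: (r_k, ell_k)
    W = [None] * (d + 1)
    W[d] = np.ones((1, 1))
    for k in range(d - 1, 0, -1):
        tmp = np.tensordot(cores[k], W[k + 1], axes=(2, 0))              # (r_k, n_k, ell_{k+1})
        W[k] = np.tensordot(tmp, rand_cores[k], axes=([1, 2], [1, 2]))   # (r_k, ell_k)
    # Single left-to-right compression sweep using the precomputed sketches
    for k in range(d - 1):
        p, n, r1 = cores[k].shape
        Z = cores[k].reshape(p * n, r1)
        Q, _ = np.linalg.qr(Z @ W[k + 1])          # sketch, then orthonormalize
        cores[k] = Q.reshape(p, n, -1)
        cores[k + 1] = np.tensordot(Q.T @ Z, cores[k + 1], axes=(1, 0))
    return cores
```

Note the structural advantage: the partial contractions W[k] can be formed directly from the (sum or matrix-vector-product) operands without first orthogonalizing the large-rank representation, which is where the computational savings over the deterministic sweep arise.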

Back to Workshop III: Many-body Quantum Systems via Classical and Quantum Computation