Hyper-optimized tensor network contraction - simplifications, applications and approximations

Johnnie Gray
California Institute of Technology

Contracting tensor networks with potentially complex geometry is a useful task across many fields, including quantum circuit simulation. The computational cost of contraction is extraordinarily sensitive to the so-called contraction tree, and here we describe a method of building such trees based on hypergraph partitioning that appears to be close to optimal. Driven by this, we also introduce a set of tensor network simplifications that aim to make contraction easier - and which turn out to be surprisingly powerful on their own. Finally, we touch on extending these ideas to approximate contraction.
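As a rough illustration of the kind of contraction-tree search the abstract refers to, the sketch below uses opt_einsum's path interface, optionally driven by the cotengra package's HyperOptimizer, which searches for contraction trees via hypergraph partitioning. The specific libraries, function names, and parameters (opt_einsum, cotengra, HyperOptimizer, max_repeats) are assumptions for the example and are not named in the abstract.

    # A minimal sketch, not taken from the talk itself: find a contraction
    # tree for a small tensor network, first with a greedy heuristic, then
    # (if available) with a hypergraph-partitioning based hyper-optimizer.
    import numpy as np
    import opt_einsum as oe

    # A toy tensor network: a ring of four matrices contracted to a scalar.
    eq = "ab,bc,cd,da->"
    arrays = [np.random.rand(32, 32) for _ in range(4)]

    # Baseline: opt_einsum's built-in greedy path finder.
    path, info = oe.contract_path(eq, *arrays, optimize="greedy")
    print(info.opt_cost)  # estimated floating-point cost of this tree

    # Hyper-optimized search (assumes the optional cotengra package is
    # installed; its HyperOptimizer acts as a drop-in path optimizer).
    try:
        import cotengra as ctg

        opt = ctg.HyperOptimizer(max_repeats=16, progbar=False)
        path, info = oe.contract_path(eq, *arrays, optimize=opt)
        print(info.opt_cost)
    except ImportError:
        pass  # cotengra not installed; keep the greedy tree above

For a tiny ring like this the greedy and hyper-optimized trees coincide, but on networks with complex geometry the quality of the tree can change the contraction cost by many orders of magnitude, which is the sensitivity the abstract highlights.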
