On quantum backpropagation and information reuse

Amira Abbas
University of Amsterdam

The success of modern deep learning hinges on the ability to train neural networks at scale. Through clever reuse of intermediate information, backpropagation computes gradients at a total cost roughly proportional to that of evaluating the function itself, rather than incurring an additional factor proportional to the number of parameters, which can now be in the trillions. Naively, one might expect quantum measurement collapse to rule out reusing quantum information in the way backpropagation reuses classical intermediates. But recent developments in shadow tomography, which assumes access to multiple copies of a quantum state, have challenged that notion. In this talk, we will investigate the feasibility of achieving backpropagation scaling for parameterized quantum models, which is essential for their use at scale.
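To make the scaling claim concrete, here is a minimal classical sketch (purely illustrative, not from the talk) of what "backpropagation scaling" means: reverse-mode differentiation caches and reuses intermediate activations so that all partial derivatives come at roughly the cost of two function evaluations, while a naive shift-one-parameter-at-a-time estimator, loosely analogous to per-parameter gradient estimation for quantum circuits, needs an extra evaluation per parameter. All function names below are hypothetical.

```python
import math

def forward(thetas, x):
    # Toy chain model: repeatedly apply x -> sin(theta_i * x),
    # caching every intermediate activation for reuse.
    activations = [x]
    for t in thetas:
        x = math.sin(t * x)
        activations.append(x)
    return x, activations

def backprop_grad(thetas, x):
    # One forward pass + one backward pass reusing cached activations:
    # total cost ~ 2 forward passes, independent of len(thetas).
    y, acts = forward(thetas, x)
    grad = [0.0] * len(thetas)
    upstream = 1.0  # d y / d y
    for i in reversed(range(len(thetas))):
        a = acts[i]             # input to layer i (reused intermediate)
        local = math.cos(thetas[i] * a)   # d sin(pre)/d pre
        grad[i] = upstream * local * a            # d y / d theta_i
        upstream = upstream * local * thetas[i]   # d y / d a_i
    return grad

def per_parameter_grad(thetas, x, eps=1e-6):
    # Naive alternative: one extra forward pass per parameter,
    # so the cost scales with the number of parameters.
    y0, _ = forward(thetas, x)
    grad = []
    for i in range(len(thetas)):
        shifted = list(thetas)
        shifted[i] += eps
        yi, _ = forward(shifted, x)
        grad.append((yi - y0) / eps)
    return grad

thetas = [0.3, 1.1, 0.7, 0.5]
g_bp = backprop_grad(thetas, 0.9)
g_fd = per_parameter_grad(thetas, 0.9)
# Both routes agree on the gradient; only their costs differ.
assert all(abs(a - b) < 1e-4 for a, b in zip(g_bp, g_fd))
```

The question the talk raises is whether anything like the `backprop_grad` cost profile is achievable for parameterized quantum models, where measurement collapse seems to forbid caching intermediates.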

Workshop II: Mathematical Aspects of Quantum Learning