Smoothed Analysis for Tensor Decompositions and Unsupervised Learning

Aravindan Vijayaraghavan
Northwestern University

Smoothed analysis is a powerful paradigm for proving robust, polynomial-time guarantees for algorithms on problems that are computationally intractable in the worst case. While polynomial-time smoothed analysis guarantees are desirable for tensor decompositions and related problems in unsupervised learning, they have been challenging to obtain.

In this talk, I will describe a general framework for establishing polynomial-time smoothed analysis guarantees for tensor methods and related unsupervised learning problems. The core technical component is a set of new high-confidence lower bounds on the least singular value for a broad class of random matrix ensembles with highly dependent entries, whose entries are given by low-degree polynomials of a few underlying base random variables. I will describe how these bounds yield polynomial-time smoothed analysis guarantees for robust decompositions of symmetric order-(2t) overcomplete tensors (of rank up to O(n^t)), for learning overcomplete latent variable models such as HMMs, and for other problems.
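As a minimal numerical sketch of the t = 2 case, not taken from the talk: the columns below are the Khatri-Rao squares a_i ⊗ a_i of smoothed vectors, so each matrix entry is a degree-2 polynomial of the base Gaussian perturbations, exactly the kind of dependent ensemble described above. The smoothing model and the parameters n, R, rho are illustrative assumptions; the framework in the talk proves high-confidence lower bounds on this least singular value, while the sketch only observes it empirically.

```python
import numpy as np

rng = np.random.default_rng(0)

n, R = 10, 40      # ambient dimension and rank; R > n makes the instance overcomplete
rho = 0.1          # magnitude of the random smoothing perturbation (illustrative)

def least_singular_value_of_flattening(A):
    """Least singular value of the n^2 x R matrix whose columns are the
    Khatri-Rao squares a_i (x) a_i of the columns a_i of A; each entry is
    a degree-2 polynomial of the base random variables."""
    M = np.stack([np.outer(a, a).ravel() for a in A.T], axis=1)
    return np.linalg.svd(M, compute_uv=False)[-1]

# Worst-case components: deliberately degenerate (all columns identical),
# so the flattened matrix is rank one and its least singular value is 0.
worst = np.ones((n, R))

# Smoothed instance: each component receives an independent Gaussian
# perturbation of magnitude rho, as in the smoothed analysis model.
smoothed = worst + rho * rng.standard_normal((n, R))

print("worst case:   ", least_singular_value_of_flattening(worst))
print("smoothed case:", least_singular_value_of_flattening(smoothed))
```

Running the sketch, the degenerate instance has least singular value 0, while the smoothed instance has it bounded away from zero; lower bounds of this kind are what drive the decomposition guarantees for overcomplete tensors.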

Based on joint work with Aditya Bhaskara, Aidao Chen and Aidan Perreault.

