Faster gradient descent convergence for matrix factorization using unbalanced initialization

Rachel Ward
University of Texas at Austin
Mathematics

We present recent results establishing improved convergence rates for gradient descent on matrix factorization problems, made possible by an unbalanced initialization of the factor weights. Applications to neural networks and fine-tuning will be discussed.
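To make the setting concrete, here is a minimal NumPy sketch of gradient descent on the factorization objective 0.5 * ||U V^T - A||_F^2 with an unbalanced initialization, meaning the two factors start at very different scales (here U at scale O(1) and V near zero). The problem sizes, initialization scales, step size, and iteration count are all illustrative assumptions for this sketch, not the actual parameters or rates from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy low-rank target: A = G1 @ G2 has rank r (sizes are illustrative).
m, n, r = 30, 20, 3
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))

# Unbalanced initialization: U starts at scale O(1), V near zero.
# (These scales are assumptions for the sketch, not the talk's values.)
U = rng.standard_normal((m, r))
V = 1e-3 * rng.standard_normal((n, r))

lr, steps = 5e-3, 5000
for _ in range(steps):
    R = U @ V.T - A  # residual of the current factorization
    # Simultaneous gradient step on f(U, V) = 0.5 * ||U V^T - A||_F^2:
    # df/dU = R @ V,  df/dV = R.T @ U
    U, V = U - lr * (R @ V), V - lr * (R.T @ U)

rel_err = np.linalg.norm(U @ V.T - A) / np.linalg.norm(A)
print(f"relative error after {steps} steps: {rel_err:.2e}")
```

On this toy instance the iterates fit the target to high accuracy; the talk's results concern how the imbalance between the initial scales of U and V affects the provable rate of this kind of descent.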