School of Technology and Computer Science Seminars

Low-rank Matrix and Tensor Recovery: Theory and Algorithms

by Prof. Donald Goldfarb (Columbia University, USA)

Monday, March 3, 2014 (Asia/Kolkata)
at Colaba Campus (AG-66, Lecture Theatre)
Description
Recovering a low-rank matrix or tensor from incomplete or corrupted observations is a recurring problem in signal processing and machine learning. To exploit the structure of data that is intrinsically more than three-dimensional, convex models such as low-rank matrix completion and robust principal component analysis (RPCA) have been extended from matrices to tensors. In this talk, we rigorously establish recovery guarantees for both tensor completion and tensor RPCA. We demonstrate that the most popular convex relaxation of the tensor Tucker rank can be substantially suboptimal in terms of the number of observations needed for exact recovery. We introduce a very simple, new convex relaxation which is shown to be much better, both theoretically and empirically. Moreover, we propose algorithms for solving both low-rank matrix and tensor recovery models, based on the Alternating Direction Augmented Lagrangian (ADAL), Frank-Wolfe and proximal-gradient methods. Finally, we empirically investigate the recoverability properties of these convex models and the computational performance of our algorithms on both simulated and real data. This is joint work with Cun Mu, Bo Huang and Tony Qin (IEOR Ph.D. students at Columbia University) and John Wright (E.E. faculty member at Columbia University).
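
The abstract does not write out the optimization models, but the standard convex formulations it refers to can be sketched as follows. The notation (the projection P_Omega onto the observed entries, the weights lambda_i, and the mode-i unfoldings) is introduced here only for orientation; the "most popular convex relaxation" of the Tucker rank mentioned above is commonly taken to be the sum of nuclear norms (SNN) of the unfoldings, and the talk's new relaxation is not reproduced here.

    % Matrix completion and matrix RPCA (standard convex models):
    \min_{X}\ \|X\|_* \quad \text{s.t.}\quad P_\Omega(X) = P_\Omega(M),
    \qquad
    \min_{L,S}\ \|L\|_* + \lambda\|S\|_1 \quad \text{s.t.}\quad L + S = M.

    % Sum-of-nuclear-norms (SNN) relaxation of the Tucker rank of a K-way
    % tensor, written with the mode-i unfoldings X_{(i)}:
    \min_{\mathcal{X}}\ \sum_{i=1}^{K} \lambda_i \|\mathcal{X}_{(i)}\|_*
    \quad \text{s.t.}\quad P_\Omega(\mathcal{X}) = P_\Omega(\mathcal{M}).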
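
As a concrete illustration of the ADAL/ADMM-style iterations mentioned above, here is a minimal NumPy sketch of alternating-direction updates for matrix RPCA, min ||L||_* + lambda*||S||_1 subject to L + S = M: a singular-value-thresholding step for L, an elementwise shrinkage step for S, and a dual update. This is not the speaker's code; the default parameters (lambda = 1/sqrt(max(m, n)), the mu heuristic) and the toy data are standard choices assumed for illustration.

    import numpy as np

    def shrink(X, tau):
        """Elementwise soft-thresholding (prox of the l1 norm)."""
        return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

    def svt(X, tau):
        """Singular value thresholding (prox of the nuclear norm)."""
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        return U @ np.diag(shrink(s, tau)) @ Vt

    def rpca_admm(M, lam=None, mu=None, n_iter=500, tol=1e-7):
        """Split M into low-rank L plus sparse S via ADMM on
           min ||L||_* + lam*||S||_1  s.t.  L + S = M."""
        m, n = M.shape
        if lam is None:
            lam = 1.0 / np.sqrt(max(m, n))        # common default weight
        if mu is None:
            mu = 0.25 * m * n / np.abs(M).sum()   # common penalty heuristic
        L = np.zeros_like(M)
        S = np.zeros_like(M)
        Y = np.zeros_like(M)                      # dual variable
        for _ in range(n_iter):
            L = svt(M - S + Y / mu, 1.0 / mu)         # nuclear-norm prox step
            S = shrink(M - L + Y / mu, lam / mu)      # l1 prox step
            R = M - L - S                             # primal residual
            Y = Y + mu * R                            # dual update
            if np.linalg.norm(R) <= tol * np.linalg.norm(M):
                break
        return L, S

    # Toy example: rank-5 matrix plus sparse gross corruptions.
    rng = np.random.default_rng(0)
    L0 = rng.standard_normal((50, 5)) @ rng.standard_normal((5, 50))
    S0 = np.zeros((50, 50))
    mask = rng.random((50, 50)) < 0.05
    S0[mask] = 10 * rng.standard_normal(mask.sum())
    L_hat, S_hat = rpca_admm(L0 + S0)
    print(np.linalg.norm(L_hat - L0) / np.linalg.norm(L0))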