Recently, the Riemannian geometry of the space of symmetric positive-definite matrices has become a central ingredient in a broad range of data analysis and learning tools. Broadly, it enables us to observe complex high-dimensional data through the lens of objects with a known non-Euclidean geometry. In this talk, I will present a method for domain adaptation using parallel transport on the cone manifold of symmetric positive-definite matrices. Using Riemannian geometry, I will establish theoretical guarantees and discuss the benefits of the presented method. In addition, I will demonstrate these benefits on simulations as well as on real measured data. While the majority of the talk will focus on the particular case of symmetric positive-definite covariance matrices, I will present generalizations to kernels, diffusion operators, and graph Laplacians, and show intriguing properties using spectral analysis, with applications to source separation and multimodal data fusion.
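As a concrete illustration of the transport step, the sketch below implements parallel transport on the SPD cone under the affine-invariant metric, where a matrix S is carried from a neighborhood of B to a neighborhood of A via Γ(S) = E S Eᵀ with E = (A B⁻¹)^½. This is a minimal sketch, not the talk's full pipeline; the function name and the random test matrices are illustrative.

```python
import numpy as np
from scipy.linalg import sqrtm

def transport_spd(S, B, A):
    """Transport an SPD matrix S from the vicinity of B to that of A
    via Gamma(S) = E S E^T with E = (A B^{-1})^{1/2}
    (affine-invariant geometry of the SPD cone)."""
    # sqrtm may return a tiny imaginary part due to round-off; discard it.
    E = np.real(sqrtm(A @ np.linalg.inv(B)))
    return E @ S @ E.T

rng = np.random.default_rng(0)

def random_spd(n):
    # X X^T + n I is symmetric positive-definite.
    X = rng.standard_normal((n, n))
    return X @ X.T + n * np.eye(n)

A, B, S = random_spd(4), random_spd(4), random_spd(4)
G = transport_spd(S, B, A)

# The map sends B itself exactly onto A, and preserves positive-definiteness.
print(np.allclose(transport_spd(B, B, A), A))         # True
print(np.all(np.linalg.eigvalsh((G + G.T) / 2) > 0))  # True
```

The identity E B Eᵀ = A is what makes this usable for domain adaptation: statistics expressed relative to a source reference B are re-expressed relative to a target reference A while staying on the SPD manifold.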
* Joint work with Or Yair, Ori Katz, and Miri Ben-Chen