A geometrical framework for covariance matrices and covariance operators in machine learning and applications

Minh Ha Quang
RIKEN
Center for Advanced Intelligence Project

Symmetric Positive Definite (SPD) matrices, in particular covariance matrices, play an important role in numerous areas of mathematics, statistics, and machine learning, with applications in computer vision, brain imaging, and other fields. Since the set of SPD matrices is not a subset of Euclidean space, much recent research has focused on exploiting the rich non-Euclidean structures of SPD matrices to improve performance in practical applications. This talk will present some of the recent developments in the generalization of the geometrical structures of finite-dimensional covariance matrices to the setting of infinite-dimensional covariance operators, particularly reproducing kernel Hilbert space (RKHS) covariance operators. The theoretical results will be accompanied by numerical results in computer vision.
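As a concrete illustration of the non-Euclidean structures mentioned above, the sketch below (an illustrative example, not code from the talk or the accompanying book) compares the ordinary Euclidean (Frobenius) distance between two covariance matrices with two standard Riemannian-type distances on the SPD manifold, the Log-Euclidean and affine-invariant distances, using only NumPy; the function names and example data are assumptions made for this sketch.

```python
# Illustrative sketch (not the author's code): Euclidean vs. non-Euclidean
# distances between SPD (covariance) matrices.
import numpy as np

def spd_logm(A):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * np.log(w)) @ V.T

def spd_inv_sqrtm(A):
    """Inverse square root A^{-1/2} of an SPD matrix."""
    w, V = np.linalg.eigh(A)
    return (V / np.sqrt(w)) @ V.T

def log_euclidean_distance(A, B):
    """Log-Euclidean distance: ||log(A) - log(B)||_F."""
    return np.linalg.norm(spd_logm(A) - spd_logm(B), 'fro')

def affine_invariant_distance(A, B):
    """Affine-invariant Riemannian distance: ||log(A^{-1/2} B A^{-1/2})||_F."""
    A_is = spd_inv_sqrtm(A)
    return np.linalg.norm(spd_logm(A_is @ B @ A_is), 'fro')

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Sample covariance matrices of two random data sets; a small ridge keeps them SPD.
    A = np.cov(rng.standard_normal((100, 5)), rowvar=False) + 1e-6 * np.eye(5)
    B = np.cov(rng.standard_normal((100, 5)), rowvar=False) + 1e-6 * np.eye(5)
    print("Euclidean (Frobenius) distance:", np.linalg.norm(A - B, 'fro'))
    print("Log-Euclidean distance:        ", log_euclidean_distance(A, B))
    print("Affine-invariant distance:     ", affine_invariant_distance(A, B))
```

In the infinite-dimensional RKHS setting discussed in the talk, these finite-dimensional formulas are generalized to covariance operators, where regularization plays an essential role.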

https://www.morganclaypool.com/doi/abs/10.2200/S00801ED1V01Y201709COV011

