Information divergence in high dimensions

Alfred Hero
University of Michigan
EECS

Information-theoretic measures have applications in quantifying data
complexity (entropy), finding dependencies in multivariate data (mutual
information), and comparing two or more data samples (relative entropy).
However, accurate estimation and computation of such measures present
challenges in high-dimensional data spaces. We will present methods for
approximating a broad class of Rényi alpha-divergences based on minimal
graph constructions, such as k-nearest neighbor graphs (kNNG) and minimal
spanning trees (MST), over the set of data vectors. We will discuss
several applications of alpha-divergence estimates, including estimation
of embedding dimension, data compression on manifolds, image
registration, and discovery of dependency networks.
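
To make the MST-based construction concrete: a Beardwood–Halton–Hammersley-type limit theorem says that for n i.i.d. points in R^d with density f, the power-weighted MST length L_gamma = sum over MST edges of |e|^gamma, with gamma = d(1 - alpha), satisfies n^(-alpha) L_gamma -> beta * integral of f^alpha, so log(L_gamma / n^alpha) / (1 - alpha) estimates the Rényi alpha-entropy up to an additive constant. The sketch below is my own minimal pure-Python illustration (Prim's algorithm, brute force; the constant beta_{d,gamma} is omitted, so the estimate is only defined up to that offset, and the function names are assumptions, not the talk's code):

```python
import math

def mst_length(points, gamma=1.0):
    """Total power-weighted length sum(|e|**gamma) of the Euclidean MST,
    computed with Prim's algorithm in O(n^2)."""
    n = len(points)
    in_tree = [False] * n
    dist = [math.inf] * n
    dist[0] = 0.0
    total = 0.0
    for step in range(n):
        # Pick the closest point not yet in the tree.
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: dist[i])
        in_tree[u] = True
        if step > 0:  # the first point joins the tree with no edge
            total += dist[u] ** gamma
        # Relax distances from the newly added point.
        for v in range(n):
            if not in_tree[v]:
                d = math.dist(points[u], points[v])
                if d < dist[v]:
                    dist[v] = d
    return total

def renyi_entropy_estimate(points, alpha, dim):
    """BHH-style Renyi alpha-entropy estimate, up to the additive constant
    log(beta_{d,gamma}) / (1 - alpha), which is omitted here."""
    n = len(points)
    gamma = dim * (1.0 - alpha)
    return math.log(mst_length(points, gamma) / n**alpha) / (1.0 - alpha)

# Example: four corners of the unit square, alpha = 0.5, d = 2, so
# gamma = 1, MST length = 3, estimate = log(3 / sqrt(4)) / 0.5 ≈ 0.811
est = renyi_entropy_estimate([(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)], 0.5, 2)
```

Note that for alpha in (0, 1) the weight gamma is positive, which is the regime where this estimator is well behaved; alpha > 1 would require negative edge-weight exponents.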
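
For the two-sample comparison setting, a closely related nearest-neighbor construction estimates the Kullback–Leibler divergence (the alpha -> 1 limit of the Rényi family) directly from the two samples, in the style of the Wang–Kulkarni–Verdú kNN estimator: D(P||Q) is approximated by (d/n) * sum_i log(nu_k(i) / rho_k(i)) + log(m / (n - 1)), where rho_k(i) is the distance from x_i to its k-th nearest neighbor within its own sample and nu_k(i) the distance to its k-th nearest neighbor in the other sample. This is an illustrative sketch of that general idea, not necessarily the exact construction presented in the talk:

```python
import math

def knn_dist(x, pts, k):
    """Distance from x to its k-th nearest neighbor among pts (brute force)."""
    return sorted(math.dist(x, p) for p in pts)[k - 1]

def kl_divergence_knn(xs, ys, k=1):
    """kNN estimate of KL(P||Q) from samples xs ~ P and ys ~ Q:
    (d/n) * sum log(nu_k / rho_k) + log(m / (n - 1)).
    Assumes no duplicate points (a zero rho or nu would break the log)."""
    n, m = len(xs), len(ys)
    d = len(xs[0])  # ambient dimension
    total = 0.0
    for i, x in enumerate(xs):
        rho = knn_dist(x, xs[:i] + xs[i + 1:], k)  # k-th NN within own sample
        nu = knn_dist(x, ys, k)                    # k-th NN in the other sample
        total += math.log(nu / rho)
    return (d / n) * total + math.log(m / (n - 1))

# Two well-separated 1-D samples should yield a clearly positive estimate.
sep = kl_divergence_knn([(i * 0.1,) for i in range(20)],
                        [(10.0 + i * 0.1,) for i in range(20)])
```

Like the MST estimator, this avoids any explicit density estimation, which is what makes the graph-based approach attractive in high-dimensional spaces.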

Back to MGA Workshop III: Multiscale structures in the analysis of High-Dimensional Data