Manifold models offer an intrinsically low-dimensional representation for many classes of signals that are extrinsically high-dimensional. Such models, however, are nonlinear, and thus there is interest in reducing the dimension of such signals through compressive linear mappings while preserving the structure of the signal family. A famous theorem by Hassler Whitney confirms that such linear mappings exist with an embedding dimension proportional to the manifold dimension. However, Whitney's theorem does not guarantee the stability (i.e., near isometry) of such an embedding. In this talk, I will explain how nearly isometric embeddings can be achieved using random linear mappings with embedding dimension proportional to the manifold dimension. I will discuss applications of this result in compressive sensing. I will also present an extension of this result for embedding the attractors of certain dynamical systems via delay coordinate embeddings constructed from time series observations. Finally, I will draw parallels to understanding the non-convex landscape of empirical risk minimization problems in machine learning.
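As a small illustrative sketch of the near-isometry claim (not the talk's actual construction), the following NumPy snippet projects points sampled from a one-dimensional manifold (a circle traced through a high-dimensional ambient space) down to a much lower dimension with a random Gaussian map, then checks how well pairwise distances are preserved. All parameter choices (ambient dimension `N`, embedding dimension `m`, number of samples) are arbitrary assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Points on a 1-D manifold embedded in R^N: a unit circle lying in a
# random 2-D subspace of a high-dimensional ambient space.
N = 2000   # extrinsic (ambient) dimension -- assumed for the demo
m = 40     # embedding dimension, proportional to the manifold dimension
t = np.linspace(0, 2 * np.pi, 300, endpoint=False)
basis, _ = np.linalg.qr(rng.standard_normal((N, 2)))   # orthonormal 2-D frame
X = np.column_stack([np.cos(t), np.sin(t)]) @ basis.T  # shape (300, N)

# Random linear map Phi: R^N -> R^m, scaled so that E||Phi x||^2 = ||x||^2.
Phi = rng.standard_normal((m, N)) / np.sqrt(m)
Y = X @ Phi.T

def pairwise_distances(A):
    """All pairwise Euclidean distances between the rows of A."""
    d = np.linalg.norm(A[:, None, :] - A[None, :, :], axis=-1)
    return d[np.triu_indices(len(A), k=1)]

# Ratios near 1 indicate the random map is a near isometry on the manifold.
ratios = pairwise_distances(Y) / pairwise_distances(X)
print(f"distance ratios: min={ratios.min():.3f}, max={ratios.max():.3f}")
```

Even though `m` is far smaller than `N`, the distance ratios cluster around 1, in the spirit of the Johnson-Lindenstrauss lemma specialized to a manifold; the talk's results make this precise with an embedding dimension scaling with the manifold dimension rather than the number of sample points.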