Adaptive metric dimensionality reduction

Lee-Ad Gottlieb, Aryeh Kontorovich, Robert Krauthgamer

Research output: Contribution to journal › Article › peer-review



We study adaptive data-dependent dimensionality reduction in the context of supervised learning in general metric spaces. Our main statistical contribution is a generalization bound for Lipschitz functions in metric spaces that are doubling, or nearly doubling. On the algorithmic front, we describe an analogue of PCA for metric spaces: namely an efficient procedure that approximates the data's intrinsic dimension, which is often much lower than the ambient dimension. Our approach thus leverages the dual benefits of low dimensionality: (1) more efficient algorithms, e.g., for proximity search, and (2) more optimistic generalization bounds.
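The abstract mentions an efficient procedure that approximates the data's intrinsic (doubling) dimension. The paper's actual algorithm is not reproduced here; the following is only a minimal illustrative sketch of the underlying notion, assuming Euclidean input: the doubling dimension of a point set is log2 of the smallest constant C such that every ball of radius r can be covered by C balls of radius r/2. A simple greedy cover gives a crude empirical upper bound on that constant. All function names (`greedy_cover`, `doubling_dimension_estimate`) are hypothetical, not from the paper.

```python
import numpy as np

def greedy_cover(points, radius):
    """Greedily pick center indices so every point lies within
    `radius` of some chosen center (a simple net construction)."""
    centers = []
    remaining = list(range(len(points)))
    while remaining:
        c = remaining[0]          # take the first uncovered point as a center
        centers.append(c)
        remaining = [i for i in remaining
                     if np.linalg.norm(points[i] - points[c]) > radius]
    return centers

def doubling_dimension_estimate(points, radius):
    """Crude empirical doubling-dimension bound at scale `radius`:
    for each data-centered ball of radius `radius`, greedily cover it
    with balls of radius/2 and take log2 of the worst cover size."""
    worst = 1
    for c in points:
        ball = points[np.linalg.norm(points - c, axis=1) <= radius]
        worst = max(worst, len(greedy_cover(ball, radius / 2.0)))
    return np.log2(worst)
```

For data lying on a line embedded in a high ambient dimension, this estimate stays small, illustrating the abstract's point that intrinsic dimension can be far below ambient dimension:

```python
X = np.zeros((50, 10))                 # ambient dimension 10
X[:, 0] = np.linspace(0.0, 1.0, 50)    # but the data is 1-dimensional
d_hat = doubling_dimension_estimate(X, 0.5)   # small, despite 10 ambient dims
```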

Original language: English
Pages (from-to): 105-118
Number of pages: 14
Journal: Theoretical Computer Science
State: Published - 21 Mar 2016


Keywords

  • Dimensionality reduction
  • Doubling dimension
  • Metric space
  • PCA
  • Rademacher complexity

ASJC Scopus subject areas

  • Theoretical Computer Science
  • General Computer Science


