
Adaptive metric dimensionality reduction

    Research output: Contribution to journal › Article › peer-review

    36 Scopus citations

    Abstract

    We study adaptive data-dependent dimensionality reduction in the context of supervised learning in general metric spaces. Our main statistical contribution is a generalization bound for Lipschitz functions in metric spaces that are doubling, or nearly doubling. On the algorithmic front, we describe an analogue of PCA for metric spaces: namely an efficient procedure that approximates the data's intrinsic dimension, which is often much lower than the ambient dimension. Our approach thus leverages the dual benefits of low dimensionality: (1) more efficient algorithms, e.g., for proximity search, and (2) more optimistic generalization bounds.
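    The abstract's key quantity is the doubling dimension: the base-2 logarithm of the smallest number of half-radius balls needed to cover any ball in the metric space. The paper's actual procedure is not given here, so the following is only a minimal illustrative sketch (not the authors' algorithm) of how an empirical doubling-dimension estimate at a single scale `r` can be computed from a finite point set, using a greedy net construction; the function names and the single-scale simplification are assumptions for illustration.

    ```python
    import math

    def greedy_net(points, r, dist):
        """Greedily build an r-net: a subset whose points are pairwise
        more than r apart, with every input point within r of the net."""
        net = []
        for p in points:
            if all(dist(p, q) > r for q in net):
                net.append(p)
        return net

    def doubling_dimension_estimate(points, r, dist):
        """Crude empirical doubling-dimension estimate at scale r:
        count how many (r/2)-net points fall inside any radius-r ball
        centered at a data point, and take log2 of the worst case."""
        half_net = greedy_net(points, r / 2, dist)
        worst = max(
            sum(1 for q in half_net if dist(p, q) <= r)
            for p in points
        )
        return math.log2(worst)

    # Example: 100 evenly spaced points on a line have intrinsic
    # dimension about 1, regardless of how the ambient space looks.
    line = [(float(i),) for i in range(100)]
    euclid = lambda p, q: math.dist(p, q)
    est = doubling_dimension_estimate(line, 4.0, euclid)
    ```

    For data lying near a low-dimensional structure, such an estimate stays small even when the ambient dimension is large, which is the property the paper's generalization bounds and proximity-search speedups exploit.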

    Original language: English
    Pages (from-to): 105-118
    Number of pages: 14
    Journal: Theoretical Computer Science
    Volume: 620
    DOIs
    State: Published - 21 Mar 2016

    Keywords

    • Dimensionality reduction
    • Doubling dimension
    • Metric space
    • PCA
    • Rademacher complexity

    ASJC Scopus subject areas

    • Theoretical Computer Science
    • General Computer Science
