Concentration in unbounded metric spaces and algorithmic stability

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


    Abstract

    We prove an extension of McDiarmid's inequality for metric spaces with unbounded diameter. To this end, we introduce the notion of the subgaussian diameter, which is a distribution-dependent refinement of the metric diameter. Our technique provides an alternative approach to that of Kutin and Niyogi's method of weakly difference-bounded functions, and yields nontrivial, dimension-free results in some interesting cases where the former does not. As an application, we give apparently the first generalization bound in the algorithmic stability setting that holds for unbounded loss functions. This yields a novel risk bound for some regularized metric regression algorithms. We give two extensions of the basic concentration result. The first enables one to replace the independence assumption by appropriate strong mixing. The second generalizes the subgaussian technique to other Orlicz norms.
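
    The central definition can be sketched as follows (reconstructed from the abstract's description; the paper's exact normalization may differ). The subgaussian diameter of a metric probability space (X, d, μ) is defined through the symmetrized distance ε d(X, X'), where X, X' are independent draws from μ and ε is an independent Rademacher sign:

    \[
      \Delta_{SG}(\mu) \;=\; \inf\Big\{ \sigma \ge 0 \;:\; \mathbb{E}\, e^{\lambda \varepsilon d(X, X')} \le e^{\sigma^2 \lambda^2 / 2} \ \text{for all } \lambda \in \mathbb{R} \Big\}.
    \]

    For a bounded space this quantity is at most the metric diameter, whereas, for example, the real line with a Gaussian measure has unbounded diameter but finite subgaussian diameter, which is what allows a McDiarmid-type bound to go through.

    The phenomenon the abstract describes can also be seen in a minimal simulation (illustrative only, not code from the paper): a coordinate-wise Lipschitz statistic of unbounded standard-normal inputs still concentrates at a dimension-free 1/sqrt(n) rate, even though the classical McDiarmid bound is vacuous here because the per-coordinate oscillation is infinite.

        import numpy as np

        rng = np.random.default_rng(0)

        def f(x):
            # Mean absolute value: changing one coordinate x_i to x_i' moves f
            # by at most |x_i - x_i'| / n, so f is (1/n)-Lipschitz per coordinate
            # with respect to the metric |.| on R.
            return np.abs(x).mean()

        for n in (100, 1000, 10000):
            vals = np.array([f(rng.standard_normal(n)) for _ in range(2000)])
            # The empirical spread shrinks like c / sqrt(n): the rescaled
            # column below stays roughly constant as n grows.
            print(n, vals.std(), vals.std() * np.sqrt(n))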

    Original language: English
    Title of host publication: 31st International Conference on Machine Learning, ICML 2014
    Publisher: International Machine Learning Society (IMLS)
    Pages: 1185-1195
    Number of pages: 11
    Volume: 2
    ISBN (Electronic): 9781634393973
    State: Published - 1 Jan 2014
    Event: 31st International Conference on Machine Learning, ICML 2014 - Beijing, China
    Duration: 21 Jun 2014 – 26 Jun 2014

    Conference

    Conference: 31st International Conference on Machine Learning, ICML 2014
    Country/Territory: China
    City: Beijing
    Period: 21/06/14 – 26/06/14

    ASJC Scopus subject areas

    • Artificial Intelligence
    • Computer Networks and Communications
    • Software
