Tight sample complexity of large-margin learning

Sivan Sabato, Nathan Srebro, Naftali Tishby

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

We obtain a tight distribution-specific characterization of the sample complexity of large-margin classification with L2 regularization: We introduce the γ-adapted-dimension, which is a simple function of the spectrum of a distribution's covariance matrix, and show distribution-specific upper and lower bounds on the sample complexity, both governed by the γ-adapted-dimension of the source distribution. We conclude that this new quantity tightly characterizes the true sample complexity of large-margin classification. The bounds hold for a rich family of sub-Gaussian distributions.
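The abstract describes the γ-adapted-dimension only as a simple function of the covariance spectrum, without spelling out its form. As a purely illustrative sketch, the Python snippet below assumes one natural formulation, the smallest k for which kγ² dominates the sum of the covariance eigenvalues beyond the k largest, and shows how such a quantity could be computed from a spectrum; the function name and the exact definitional form are assumptions for illustration, not taken from the abstract.

import numpy as np

def gamma_adapted_dimension(eigenvalues, gamma):
    # Illustrative sketch only. Assumes the definition:
    # the minimal k such that k * gamma**2 >= sum of the eigenvalues
    # beyond the k largest. This form is an assumption; the abstract
    # above does not state the definition explicitly.
    lam = np.sort(np.asarray(eigenvalues, dtype=float))[::-1]  # decreasing spectrum
    for k in range(len(lam) + 1):
        if k * gamma**2 >= lam[k:].sum():
            return k
    return len(lam)

# Example: a spectrum with a few dominant directions and a long flat tail.
spectrum = np.array([4.0, 2.0, 1.0] + [0.01] * 100)
print(gamma_adapted_dimension(spectrum, gamma=0.5))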

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 23
Subtitle of host publication: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010
Publisher: Neural Information Processing Systems
ISBN (Print): 9781617823800
State: Published - 1 Jan 2010
Externally published: Yes
Event: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010 - Vancouver, BC, Canada
Duration: 6 Dec 2010 - 9 Dec 2010

Conference

Conference: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010
Country/Territory: Canada
City: Vancouver, BC
Period: 6/12/10 - 9/12/10

ASJC Scopus subject areas

  • Information Systems
