Distribution-dependent sample complexity of large margin learning

Sivan Sabato, Nathan Srebro, Naftali Tishby

Research output: Contribution to journal › Article › peer-review


Abstract

We obtain a tight distribution-specific characterization of the sample complexity of large-margin classification with L2 regularization. We introduce the margin-adapted dimension, a simple function of the second-order statistics of the data distribution, and show distribution-specific upper and lower bounds on the sample complexity, both governed by the margin-adapted dimension of the data distribution. The upper bounds are universal, and the lower bounds hold for the rich family of sub-Gaussian distributions with independent features. We conclude that this new quantity tightly characterizes the true sample complexity of large-margin classification. To prove the lower bound, we develop several new tools of independent interest, including new connections between shattering and hardness of learning, new properties of shattering with linear classifiers, and a new lower bound on the smallest eigenvalue of a random Gram matrix generated by sub-Gaussian variables. Our results can be used to quantitatively compare large-margin learning to other learning rules, and to improve the effectiveness of methods that use sample complexity bounds, such as active learning.
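
As a concrete illustration of how a quantity of this kind can be computed, the sketch below estimates a margin-adapted-dimension-style statistic from the eigenvalues of the empirical second-moment matrix of a sample. The specific rule used here (the smallest k with gamma^2 * k at least the sum of the eigenvalues beyond the k-th largest) is a paraphrase for illustration only, not the paper's verbatim definition; consult the published article for the exact formulation.

import numpy as np

def margin_adapted_dimension(X, gamma):
    """Sketch: a margin-adapted-dimension-style statistic.

    Assumed rule (a paraphrase, not the paper's verbatim definition):
    the smallest k such that gamma^2 * k >= sum_{j > k} lambda_j,
    where lambda_1 >= lambda_2 >= ... are the eigenvalues of the
    empirical second-moment matrix of the data.
    """
    n, d = X.shape
    # Second-order statistics of the sample: eigenvalues of X^T X / n,
    # sorted in descending order.
    eigvals = np.linalg.eigvalsh(X.T @ X / n)[::-1]
    # tail[k] = sum of eigenvalues strictly after the k-th largest;
    # tail[d] = 0 so the loop always terminates with k <= d.
    tail = np.concatenate([np.cumsum(eigvals[::-1])[::-1], [0.0]])
    for k in range(d + 1):
        if gamma**2 * k >= tail[k]:
            return k
    return d

# Usage: data whose variance is concentrated in a few directions
# should yield a small margin-adapted dimension at a large margin.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 50)) * np.linspace(2.0, 0.1, 50)
print(margin_adapted_dimension(X, gamma=1.0))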

Original language: English
Pages (from-to): 2119-2149
Number of pages: 31
Journal: Journal of Machine Learning Research
Volume: 14
State: Published - 1 Jun 2013
Externally published: Yes

Keywords

  • Distribution-dependence
  • Linear classifiers
  • Sample complexity
  • Supervised learning

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Statistics and Probability
  • Artificial Intelligence
