Stable Sample Compression Schemes: New Applications and an Optimal SVM Margin Bound

Steve Hanneke, Aryeh Kontorovich

Research output: Contribution to journal › Conference article › peer-review


Abstract

We analyze a family of supervised learning algorithms based on sample compression schemes that are stable, in the sense that removing training points that were not selected for the compression set does not alter the resulting classifier. We use this technique to derive a variety of new and improved data-dependent generalization bounds for several learning algorithms. In particular, we prove a new margin bound for SVM, removing a log factor. The new bound is provably optimal. This resolves a long-standing open question about the PAC margin bounds achievable by SVM.
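As a rough sketch of the shape of the improvement (assuming the standard realizable setting with data in a ball of radius R, separation margin γ, sample size n, and confidence 1 − δ; the precise constants and conditions are in the paper), the removed log factor is the log n multiplying the margin complexity term:

```latex
% Classical compression-based margin bound vs. the new optimal one,
% realizable case -- a sketch, not the paper's exact statement:
\[
\operatorname{err}(\hat h)
  \le O\!\left(\frac{(R/\gamma)^2 \log n + \log(1/\delta)}{n}\right)
\quad\longrightarrow\quad
\operatorname{err}(\hat h)
  \le O\!\left(\frac{(R/\gamma)^2 + \log(1/\delta)}{n}\right).
\]
```

The stability property itself is easy to observe empirically: a hard-margin SVM is a sample compression scheme whose compression set is its set of support vectors, and discarding the remaining training points leaves the classifier unchanged. A minimal sketch using scikit-learn (the large C value approximating the hard-margin separable case is an illustrative choice, not from the paper):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Linearly separable toy data: two well-separated Gaussian blobs.
X = np.vstack([rng.normal(+2.0, 0.3, (40, 2)),
               rng.normal(-2.0, 0.3, (40, 2))])
y = np.array([+1] * 40 + [-1] * 40)

# Large C approximates the hard-margin (separable) SVM.
svm_full = SVC(kernel="linear", C=1e6).fit(X, y)

# The "compression set": indices of the support vectors.
sv = svm_full.support_

# Refit on the compression set alone.
svm_compressed = SVC(kernel="linear", C=1e6).fit(X[sv], y[sv])

# Stability: removing the non-selected points does not alter the
# classifier, so the two models agree on fresh points.
X_test = rng.normal(0.0, 2.0, (200, 2))
agree = np.mean(svm_full.predict(X_test) == svm_compressed.predict(X_test))
print(f"prediction agreement: {agree:.3f}")  # expected: 1.000
```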

Original language: English
Pages (from-to): 697-721
Number of pages: 25
Journal: Proceedings of Machine Learning Research
Volume: 132
State: Published - 1 Jan 2021
Event: 32nd International Conference on Algorithmic Learning Theory, ALT 2021 - Virtual, Online
Duration: 16 Mar 2021 – 19 Mar 2021

Keywords

  • margin
  • sample compression
  • support vector machines

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability
