On the complexity of samples for learning

Joel Ratsaby

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

Abstract

In machine learning, maximizing the sample margin can reduce the generalization error. Samples on which the target function has a large margin γ therefore convey more information, so we expect fewer such samples. In this paper, we estimate the complexity of a class of sets of large-margin samples for a general learning problem over a finite domain. We obtain an explicit dependence of this complexity on γ and the sample size.
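To make the notion of a large-margin sample concrete, here is a minimal illustrative sketch (not taken from the paper) for the standard linear-classifier case: the margin of a labelled sample (x, y), with y in {-1, +1}, under a hyperplane (w, b) is y·(w·x + b)/‖w‖, and the large-margin samples are those with margin at least γ. The function names and toy data are assumptions for illustration only.

```python
import numpy as np

def margins(X, y, w, b):
    """Geometric margin of each labelled sample w.r.t. the hyperplane (w, b)."""
    return y * (X @ w + b) / np.linalg.norm(w)

def large_margin_samples(X, y, w, b, gamma):
    """Indices of samples whose margin is at least gamma."""
    return np.where(margins(X, y, w, b) >= gamma)[0]

# Toy 1-D data separated at the origin by w = [1], b = 0.
X = np.array([[-2.0], [-0.5], [0.5], [3.0]])
y = np.array([-1, -1, 1, 1])

# Margins are y * x here: [2.0, 0.5, 0.5, 3.0]; only the first and
# last points lie at distance >= 1 from the separating hyperplane.
idx = large_margin_samples(X, y, np.array([1.0]), 0.0, gamma=1.0)
print(idx)  # -> [0 3]
```

As the paper's abstract suggests, raising γ shrinks this index set: with γ = 1.0 only two of the four toy points qualify, and with γ = 2.5 only the last one would.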

Original language: English
Title of host publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Editors: Kyung-Yong Chwa, J. Ian Munro
Publisher: Springer Verlag
Pages: 198-209
Number of pages: 12
ISBN (Electronic): 354022856X, 9783540228561
DOIs
State: Published - 1 Jan 2004
Externally published: Yes

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 3106
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

ASJC Scopus subject areas

  • Theoretical Computer Science
  • General Computer Science
