On the Value of Partial Information for Learning from Examples

Joel Ratsaby, Vitaly Maiorov

Research output: Contribution to journal › Article › peer-review

Abstract

The PAC model of learning and its extension to real-valued function classes provides a well-accepted theoretical framework for representing the problem of learning a target function g(x) using a random sample {(x_i, g(x_i))}_{i=1}^m. Based on the uniform strong law of large numbers, the PAC model establishes the sample complexity, i.e., the sample size m which is sufficient for accurately estimating the target function to within high confidence. Often, in addition to a random sample, some form of prior knowledge is available about the target. It is intuitive that increasing the amount of information should have the same effect on the error as increasing the sample size. But quantitatively, how does the rate of error with respect to increasing information compare to the rate of error with increasing sample size? To answer this we consider a new approach based on a combination of the information-based complexity of Traub et al. and Vapnik-Chervonenkis (VC) theory. In contrast to VC theory, where function classes of finite pseudo-dimension are used only for statistical estimation, we let such classes play a dual role of functional estimation as well as approximation. This is captured in a newly introduced quantity, ρ_d(F), which represents a nonlinear width of a function class F. We then extend the notion of the nth minimal radius of information and define a quantity I_{n,d}(F) which measures the minimal approximation error of the worst-case target g ∈ F by the family of function classes having pseudo-dimension d, given partial information on g consisting of the values taken by n linear operators. The error rates are calculated, which leads to a quantitative notion of the value of partial information for the paradigm of learning from examples.
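
Read schematically, the two central quantities can be sketched as follows. This is an assumed rendering based only on the abstract's description, not the paper's exact notation: the norm, the admissible information operators N_n, and the constraints on the linear operators L_1, …, L_n are left unspecified here and should be checked against the paper.

  % H_d ranges over function classes of pseudo-dimension at most d;
  % N_n g = (L_1 g, ..., L_n g) collects the values of n linear operators on g.
  \rho_d(F) = \inf_{H_d} \sup_{g \in F} \inf_{h \in H_d} \| g - h \|
  I_{n,d}(F) = \inf_{N_n} \sup_{g \in F} \rho_d\bigl( \{ f \in F : N_n f = N_n g \} \bigr)

Under this reading, ρ_d plays the role of a nonlinear width realized by pseudo-dimension-d classes, and I_{n,d} is the corresponding analogue of the nth minimal radius of information, where the approximating class may depend only on the observed information N_n g.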

Original language: English
Pages (from-to): 509-544
Number of pages: 36
Journal: Journal of Complexity
Volume: 13
Issue number: 4
DOIs
State: Published - 1 Jan 1997
Externally published: Yes

ASJC Scopus subject areas

  • Algebra and Number Theory
  • Statistics and Probability
  • Numerical Analysis
  • General Mathematics
  • Control and Optimization
  • Applied Mathematics
