Exact lower bounds for the agnostic probably-approximately-correct (PAC) machine learning model

    Research output: Contribution to journal › Article › peer-review


    Abstract

    We provide an exact nonasymptotic lower bound on the minimax expected excess risk (EER) in the agnostic probably-approximately-correct (PAC) machine learning classification model and identify minimax learning algorithms as certain maximally symmetric and minimally randomized "voting" procedures. Based on this result, an exact asymptotic lower bound on the minimax EER is provided. This bound is of the simple form c∞/√ν as ν→∞, where c∞ = 0.16997 . . . is a universal constant, ν = m/d, m is the size of the training sample, and d is the Vapnik-Chervonenkis dimension of the hypothesis class. It is shown that the differences between these asymptotic and nonasymptotic bounds, as well as the differences between these two bounds and the maximum EER of any learning algorithm that minimizes the empirical risk, are asymptotically negligible, and all these differences are due to ties in the mentioned "voting" procedures. A few easy-to-compute nonasymptotic lower bounds on the minimax EER are also obtained, which are shown to be close to the exact asymptotic lower bound c∞/√ν even for rather small values of the ratio ν = m/d. As an application of these results, we substantially improve existing lower bounds on the tail probability of the excess risk. Among the tools used are Bayes estimation and apparently new identities and inequalities for binomial distributions.
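    The asymptotic bound stated in the abstract is straightforward to evaluate numerically. The following minimal sketch (not from the paper) computes c∞/√ν for a given training-sample size m and VC dimension d, using the truncated value of the universal constant c∞ = 0.16997... quoted above:

    ```python
    import math

    # Universal constant from the abstract, truncated to the digits given there.
    C_INF = 0.16997

    def asymptotic_minimax_eer_lower_bound(m: int, d: int) -> float:
        """Asymptotic lower bound c_inf / sqrt(nu) on the minimax expected
        excess risk, where nu = m/d is the ratio of the training-sample size
        m to the VC dimension d of the hypothesis class."""
        nu = m / d
        return C_INF / math.sqrt(nu)

    # Example: m = 1000 training examples, hypothesis class of VC dimension 10,
    # so nu = 100 and the bound is 0.16997 / 10 = 0.016997.
    print(asymptotic_minimax_eer_lower_bound(1000, 10))
    ```

    Since the bound decays as 1/√ν, halving it requires quadrupling the ratio m/d; the abstract notes that easy-to-compute nonasymptotic bounds stay close to this value even for rather small ν.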

    Original language: English
    Pages (from-to): 2822-2854
    Number of pages: 33
    Journal: Annals of Statistics
    Volume: 47
    Issue number: 5
    DOIs
    State: Published - 1 Jan 2019

    Keywords

    • Bayes decision rules
    • Binomial distribution
    • Classification
    • Empirical estimators
    • Generalization error
    • Minimax decision rules
    • PAC learning theory

    ASJC Scopus subject areas

    • Statistics and Probability
    • Statistics, Probability and Uncertainty
