## Abstract

In the standard agnostic multiclass model, ⟨instance, label⟩ pairs are sampled independently from some underlying distribution. This distribution induces a conditional probability over the labels given an instance, and our goal in this paper is to learn this conditional distribution. Since even unconditional densities are quite challenging to learn, we give our learner access to ⟨instance, conditional distribution⟩ pairs. Assuming a base learner oracle in this model, we might seek a boosting algorithm for constructing a strong learner. Unfortunately, without further assumptions, this is provably impossible. However, we give a new boosting algorithm that succeeds in the following sense: given a base learner guaranteed to achieve some average accuracy (i.e., risk), we efficiently construct a learner that achieves the same level of accuracy with arbitrarily high probability. We give generalization guarantees of several different kinds, including distribution-free accuracy and risk bounds. None of our estimates depend on the number of boosting rounds and some of them admit dimension-free formulations.
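The flavor of guarantee described above — keeping the base learner's accuracy while driving the failure probability down — can be illustrated with a generic "repeat and validate" confidence-amplification sketch. This is *not* the paper's algorithm; every name and numeric value below (the toy `base_learner`, its 0.3 failure rate, the constant-risk hypotheses) is a hypothetical stand-in used only to show how rerunning a base learner and keeping the best-validated hypothesis amplifies confidence without improving accuracy.

```python
import random

def base_learner(rng):
    """Toy base learner (assumed behavior): with probability 0.3 it fails and
    returns a high-risk hypothesis; otherwise it returns a low-risk one."""
    if rng.random() < 0.3:
        return lambda x: 0.9  # bad hypothesis: per-example loss 0.9
    return lambda x: 0.1      # good hypothesis: per-example loss 0.1

def empirical_risk(h, validation):
    """Average loss of hypothesis h on a held-out validation sample."""
    return sum(h(x) for x in validation) / len(validation)

def amplify(base, validation, rounds, rng):
    """Call the base learner `rounds` times and keep the hypothesis with the
    lowest validated risk. If each call fails independently with probability
    0.3, all-rounds failure has probability 0.3**rounds, which can be made
    arbitrarily small -- accuracy is preserved, confidence is amplified."""
    candidates = [base(rng) for _ in range(rounds)]
    return min(candidates, key=lambda h: empirical_risk(h, validation))

rng = random.Random(0)
validation = list(range(50))
h = amplify(base_learner, validation, rounds=20, rng=rng)
print(empirical_risk(h, validation))  # low risk with overwhelming probability
```

The design point being illustrated: selection on held-out data cannot make any single candidate better than the base learner's guaranteed risk, but it makes the probability of ending up with a bad candidate decay geometrically in the number of rounds.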

| Original language | English |
|---|---|
| Pages (from-to) | 129-144 |
| Number of pages | 16 |
| Journal | Annals of Mathematics and Artificial Intelligence |
| Volume | 79 |
| Issue number | 1-3 |
| DOIs | |
| State | Published - 1 Mar 2017 |

## Keywords

- Boosting
- Conditional density

## ASJC Scopus subject areas

- Artificial Intelligence
- Applied Mathematics