Universal Rates for Regression: Separations between Cut-Off and Absolute Loss

Idan Attias, Steve Hanneke, Alkis Kalavasis, Amin Karbasi, Grigoris Velegkas

Research output: Contribution to journal › Conference article › peer-review

1 Scopus citations

Abstract

In this work we initiate the study of regression in the universal rates framework of Bousquet et al. (2021). Unlike the traditional uniform learning setting, we are interested in learning guarantees that hold for every fixed data-generating distribution, but not uniformly across them. We focus on the realizable setting and consider two well-studied loss functions: the cut-off loss at scale γ > 0, which asks for predictions that are γ-close to the correct one, and the absolute loss, which measures how far the prediction is from the correct one. Our results show that the landscape of achievable rates in the two cases is completely different. First, we give a trichotomic characterization of the optimal learning rates under the cut-off loss: each class is learnable at an exponential rate, is learnable at a (nearly) linear rate, or requires arbitrarily slow rates. Moving to the absolute loss, we show that the landscape of achievable learning rates is significantly more involved: infinitely many distinct learning rates are achievable. This is the first time that such a rich landscape of rates has been obtained in the universal rates literature.
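For concreteness, the two losses discussed in the abstract are commonly formalized as follows (a minimal sketch in standard notation; the exact formulation and range assumptions, e.g. predictions and targets in $[0,1]$, are illustrative and may differ from the paper):

For a prediction $\hat{y}$ and target $y$, the cut-off loss at scale $\gamma > 0$ and the absolute loss can be written as
\[
  \ell_\gamma(\hat{y}, y) \;=\; \mathbb{1}\{\, |\hat{y} - y| > \gamma \,\},
  \qquad
  \ell_{\mathrm{abs}}(\hat{y}, y) \;=\; |\hat{y} - y|.
\]
Roughly, in the universal rates setting a class is learnable at rate $R(n)$ under a loss $\ell$ if, for every realizable distribution, the expected loss of the learner after $n$ samples is at most $C \cdot R(c\, n)$ for constants $C, c$ that may depend on the distribution; this is the sense in which the guarantees hold for each fixed distribution but not uniformly across them.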

Original language: English
Pages (from-to): 359-405
Number of pages: 47
Journal: Proceedings of Machine Learning Research
Volume: 247
State: Published - 1 Jan 2024
Event: 37th Annual Conference on Learning Theory, COLT 2024 - Edmonton, Canada
Duration: 30 Jun 2024 - 3 Jul 2024

Keywords

  • Regression
  • Statistical Learning Theory
  • Universal Rates

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability
