Agnostic Sample Compression Schemes for Regression

Idan Attias, Steve Hanneke, Aryeh Kontorovich, Menachem Sadigurschi

Research output: Contribution to journal › Conference article › peer-review

Abstract

We obtain the first positive results for bounded sample compression in the agnostic regression setting with the ℓp loss, where p ∈ [1, ∞]. We construct a generic approximate sample compression scheme for real-valued function classes exhibiting exponential size in the fat-shattering dimension but independent of the sample size. Notably, for linear regression, an approximate compression of size linear in the dimension is constructed. Moreover, for the ℓ1 and ℓ∞ losses, we can even exhibit an efficient exact sample compression scheme of size linear in the dimension. We further show that for every other ℓp loss, p ∈ (1, ∞), there does not exist an exact agnostic compression scheme of bounded size. This refines and generalizes a negative result of David, Moran, and Yehudayoff (2016) for the ℓ2 loss. We close by posing general open questions: for agnostic regression with the ℓ1 loss, does every function class admit an exact compression scheme of polynomial size in the pseudo-dimension? For the ℓ2 loss, does every function class admit an approximate compression scheme of polynomial size in the fat-shattering dimension? These questions generalize Warmuth's classic sample compression conjecture for realizable-case classification (Warmuth, 2003).
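To illustrate the notion of a sample compression scheme for regression (not the paper's actual construction), the sketch below pairs a `compress` map, which retains only a small subsample, with a `reconstruct` map that rebuilds a predictor from that subsample alone. For 1-D linear regression under the ℓ1 loss, a classical fact is that some empirical ℓ1 minimizer passes through two sample points, so a brute-force search over pairs already yields a compression set of size 2; all function and variable names here are hypothetical.

```python
from itertools import combinations

def reconstruct(pair):
    """Rebuild a predictor from the retained subsample:
    the line through two kept points (assumes distinct x-coordinates)."""
    (x1, y1), (x2, y2) = pair
    slope = (y2 - y1) / (x2 - x1)
    return lambda x: y1 + slope * (x - x1)

def l1_loss(h, sample):
    """Empirical ell_1 loss of hypothesis h on the full sample."""
    return sum(abs(y - h(x)) for x, y in sample)

def compress(sample):
    """Brute-force compression: keep the pair of points whose
    interpolating line minimizes the ell_1 loss over the whole sample."""
    return min(
        (pair for pair in combinations(sample, 2) if pair[0][0] != pair[1][0]),
        key=lambda pair: l1_loss(reconstruct(pair), sample),
    )

sample = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.5)]
kept = compress(sample)        # only 2 of the 4 points are stored
h = reconstruct(kept)          # predictor rebuilt from the kept points alone
```

The key property being illustrated: `reconstruct` never sees the discarded points, yet its hypothesis competes with the best line on the full sample. The exponential-time pair search stands in for the paper's efficient construction purely for exposition.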

Original language: English
Pages (from-to): 2069-2085
Number of pages: 17
Journal: Proceedings of Machine Learning Research
Volume: 235
State: Published - 1 Jan 2024
Event: 41st International Conference on Machine Learning, ICML 2024 - Vienna, Austria
Duration: 21 Jul 2024 – 27 Jul 2024

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability

