Agnostic Sample Compression for Linear Regression

Steve Hanneke, Aryeh Kontorovich, Menachem Sadigurschi

Research output: Contribution to journal › Article › peer-review


We obtain the first positive results for bounded sample compression in the agnostic regression setting. We show that for p ∈ {1, ∞}, agnostic linear regression with ℓp loss admits a bounded sample compression scheme. Specifically, we exhibit efficient sample compression schemes for agnostic linear regression in ℝ^d of size d+1 under the ℓ1 loss and size d+2 under the ℓ∞ loss. We further show that for every other ℓp loss (1 < p < ∞), there does not exist an agnostic compression scheme of bounded size. This refines and generalizes a negative result of David, Moran, and Yehudayoff (2016) for the ℓ2 loss. We close by posing a general open question: for agnostic regression with ℓ1 loss, does every function class admit a compression scheme of size equal to its pseudo-dimension? This question generalizes Warmuth's classic sample compression conjecture for realizable-case classification (Warmuth, 2003).
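The positive result for the ℓ1 loss rests on a classical linear-programming fact: ℓ1 linear regression can be written as an LP, and a basic (vertex) optimal solution has zero residual on at least d sample points, so those points alone suffice to reconstruct the hypothesis. The sketch below, which is an illustration of that phenomenon and not the authors' full d+1 construction, solves the LP with scipy and recovers the fitted predictor from its zero-residual points (all data sizes and names here are illustrative).

```python
# Illustrative sketch (not the paper's scheme): l1 linear regression as an LP,
# followed by "compression" to the zero-residual sample points.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, d = 30, 3
X = rng.standard_normal((n, d))
y = X @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

# LP formulation: minimize sum(t) subject to |y - X w| <= t elementwise,
# with variables (w, t); w is free, t >= 0.
c = np.concatenate([np.zeros(d), np.ones(n)])
A_ub = np.block([[X, -np.eye(n)], [-X, -np.eye(n)]])
b_ub = np.concatenate([y, -y])
res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * d + [(0, None)] * n,
              method="highs")
w_hat = res.x[:d]

# A basic optimal solution interpolates at least d of the points exactly;
# these form the "compression set".
residuals = y - X @ w_hat
support = np.flatnonzero(np.abs(residuals) < 1e-8)

# Decompression: re-derive the same hypothesis from the stored points alone.
w_rec, *_ = np.linalg.lstsq(X[support], y[support], rcond=None)
```

Here the predictors are homogeneous linear maps, for which d interpolated points pin down the hypothesis; the size-(d+1) scheme in the paper covers the general affine/agnostic case.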
Original language: English (GB)
Pages (from-to): 1-22
Journal: Journal of Machine Learning Research
State: Published - 2019


