Abstract
We introduce a framework for performing vector-valued regression in finite-dimensional Hilbert spaces. Using Lipschitz smoothness as our regularizer, we leverage Kirszbraun's extension theorem for off-data prediction; to our knowledge, this is the first application of the technique to supervised learning. We analyze the statistical and computational aspects of this method, decomposing the task into two stages: training, which operationally corresponds to smoothing (regularization) of the labels, and prediction, which is achieved via Kirszbraun extension. Both stages are solved algorithmically by a novel multiplicative weight updates (MWU) scheme that, for our problem formulation, achieves significant runtime speedups over generic interior-point methods. Our empirical results indicate a dramatic advantage over standard off-the-shelf solvers in this regression setting.
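To make the two stages concrete, here is a minimal sketch of the pipeline on a toy instance. It is an illustration, not the paper's algorithm: where the paper's MWU scheme solves the underlying convex programs directly, this sketch hands them to cvxpy, a generic off-the-shelf solver of exactly the kind the paper reports outperforming. The training stage projects noisy labels onto the set of values consistent with an L-Lipschitz map on the sample (a QCQP), and the prediction stage computes one point of a Kirszbraun extension as a min-max problem. The function names, the use of cvxpy, and the Lipschitz bound `L` are all illustrative assumptions.

```python
# Minimal sketch of the two-stage Lipschitz regression pipeline from the
# abstract. Assumptions: cvxpy stands in for the paper's custom MWU solver;
# all names and the toy data below are illustrative, not the authors' code.
import numpy as np
import cvxpy as cp


def smooth_labels(X, Y, L):
    """Training stage: project noisy labels Y onto the set of label values
    realizable by an L-Lipschitz map on the sample points X (a QCQP)."""
    n = len(X)
    Z = cp.Variable(Y.shape)
    constraints = [
        cp.norm(Z[i] - Z[j]) <= L * np.linalg.norm(X[i] - X[j])
        for i in range(n) for j in range(i + 1, n)
    ]
    cp.Problem(cp.Minimize(cp.sum_squares(Z - Y)), constraints).solve()
    return Z.value


def kirszbraun_predict(X, Z, L, x_new):
    """Prediction stage: a value of a Kirszbraun extension at x_new, found
    by minimizing max_i (||y - z_i|| - L * ||x_new - x_i||) over y."""
    y = cp.Variable(Z.shape[1])
    t = cp.Variable()
    constraints = [
        cp.norm(y - Z[i]) - L * np.linalg.norm(x_new - X[i]) <= t
        for i in range(len(X))
    ]
    cp.Problem(cp.Minimize(t), constraints).solve()
    return y.value


# Toy usage: 1-D inputs, 2-D outputs from a 3-Lipschitz map plus noise.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(20, 1))
Y = np.hstack([np.sin(3 * X), np.cos(3 * X)]) + 0.1 * rng.normal(size=(20, 2))
Z = smooth_labels(X, Y, L=3.0)
print(kirszbraun_predict(X, Z, L=3.0, x_new=np.array([0.5])))
```

By Kirszbraun's theorem, the optimal value of the prediction problem is nonpositive whenever the smoothed labels are L-Lipschitz on the sample, so the returned point is always a valid value of some L-Lipschitz extension; this is what makes the min-max formulation a sound off-data predictor.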
| Original language | English |
|---|---|
| Pages (from-to) | 617-642 |
| Number of pages | 26 |
| Journal | Mathematical Programming |
| Volume | 207 |
| Issue number | 1-2 |
| DOIs | |
| State | Published - 1 Sep 2024 |
Keywords
- 62J02
- 65K05
- 90C20
- Convex optimization
- Kirszbraun extension
- Quadratically constrained quadratic program
- Regression
ASJC Scopus subject areas
- Software
- General Mathematics