Efficient Kirszbraun extension with applications to regression

Hananel Zaichyk, Armin Biess, Aryeh Kontorovich, Yury Makarychev

Research output: Contribution to journal › Article › peer-review

Abstract

We introduce a framework for performing vector-valued regression in finite-dimensional Hilbert spaces. Using Lipschitz smoothness as our regularizer, we leverage Kirszbraun’s extension theorem for off-data prediction. We analyze the statistical and computational aspects of this method—to our knowledge, its first application to supervised learning. We decompose this task into two stages: training (which corresponds operationally to smoothing/regularization) and prediction (which is achieved via Kirszbraun extension). Both are solved algorithmically via a novel multiplicative weight updates (MWU) scheme, which, for our problem formulation, achieves significant runtime speedups over generic interior point methods. Our empirical results indicate a dramatic advantage over standard off-the-shelf solvers in our regression setting.
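As a concrete illustration of the prediction stage described above: given an L-Lipschitz training map x_i ↦ y_i, predicting at a new point x amounts to finding a point y lying in the intersection of the balls B(y_i, L·‖x − x_i‖), which Kirszbraun's theorem guarantees is nonempty. The sketch below is not the paper's MWU algorithm; it is a minimal stand-in that finds such a point by cyclic projection onto the balls (POCS), assuming NumPy and a hypothetical helper name `kirszbraun_predict`.

```python
import numpy as np

def kirszbraun_predict(X, Y, L, x, iters=500):
    """Return y with ||y - y_i|| <= L * ||x - x_i|| for all i,
    found by cyclic projection onto the constraint balls (POCS).
    This is an illustrative stand-in, not the paper's MWU solver."""
    radii = L * np.linalg.norm(X - x, axis=1)
    y = Y.mean(axis=0)                      # start from the centroid of the labels
    for _ in range(iters):
        for yi, r in zip(Y, radii):
            d = np.linalg.norm(y - yi)
            if d > r:                       # project y onto the ball B(y_i, r)
                y = yi + (y - yi) * (r / d)
    return y

# Toy data: an L-Lipschitz map from R^2 to R^2 (a rotation, so L = 1).
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
Y = X @ R.T
L = 1.0

x_new = np.array([0.5, 0.5])
y_new = kirszbraun_predict(X, Y, L, x_new)

# Every Lipschitz constraint holds (up to solver tolerance).
slack = L * np.linalg.norm(X - x_new, axis=1) - np.linalg.norm(Y - y_new, axis=1)
print(bool(np.all(slack > -1e-6)))
```

The paper's contribution is precisely to replace generic solvers for this quadratically constrained feasibility/optimization problem with a faster MWU scheme; the projection loop here only conveys the geometry of the extension step.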

Original language: English
Pages (from-to): 617-642
Number of pages: 26
Journal: Mathematical Programming
Volume: 207
Issue number: 1-2
State: Published - 1 Sep 2024

Keywords

  • 62J02
  • 65K05
  • 90C20
  • Convex optimization
  • Kirszbraun extension
  • Quadratically constrained quadratic program
  • Regression

ASJC Scopus subject areas

  • Software
  • General Mathematics
