TY - UNPB
T1 - Regression via Kirszbraun Extension with Applications to Imitation Learning
AU - Biess, Armin
AU - Kontorovich, Aryeh
AU - Makarychev, Yury
AU - Zaichyk, Hanan
PY - 2019
Y1 - 2019
N2 - We present a framework for performing regression between two Hilbert spaces. We accomplish this via Kirszbraun's extension theorem -- apparently the first application of this technique to supervised learning -- and analyze its statistical and computational aspects. We begin by formulating the correspondence problem in terms of quadratically constrained quadratic program (QCQP) regression. Then we describe a procedure for smoothing the training data, which amounts to regularizing hypothesis complexity via its Lipschitz constant. The Lipschitz constant is tuned via a Structural Risk Minimization (SRM) procedure, based on the covering-number risk bounds we derive. We apply our technique to learn a transformation between two robotic manipulators with different embodiments, and report promising results.
M3 - Preprint
BT - Regression via Kirszbraun Extension with Applications to Imitation Learning
ER -