Non-Parametric Binary Regression in Metric Spaces with KL Loss

Ariel Avital, Klim Efremenko, Aryeh Kontorovich, David Toplin, Bo Waggoner

Research output: Working paper/Preprint


We propose a non-parametric variant of binary regression in which the hypothesis is regularized to be a Lipschitz function mapping a metric space to [0, 1] and the loss is logarithmic. This setting presents novel computational and statistical challenges. On the computational front, we derive a novel, efficient optimization algorithm based on interior-point methods; an attractive feature is that it is parameter-free (i.e., it does not require tuning an update step size). On the statistical front, the unbounded loss function poses a problem for classic generalization bounds based on covering-number and Rademacher techniques. We overcome this challenge via an adaptive truncation approach, and we also present a lower bound indicating that the truncation is, in some sense, necessary.
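The abstract notes that the logarithmic (KL) loss is unbounded near the endpoints of [0, 1], which is what motivates truncation. The following minimal sketch illustrates this point; the specific clipping scheme and the threshold `eps` are illustrative assumptions, not the paper's adaptive truncation procedure.

```python
import math

def log_loss(p, y):
    """Logarithmic loss for binary regression: -[y*log(p) + (1-y)*log(1-p)].

    Unbounded as the prediction p approaches 0 or 1 with the wrong label,
    which breaks covering-number and Rademacher generalization arguments.
    """
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def truncated_log_loss(p, y, eps=0.05):
    """Illustrative truncation: clip predictions into [eps, 1 - eps].

    This bounds the loss by -log(eps), restoring the boundedness that
    classical generalization bounds require. (Hypothetical fixed-eps
    clipping, shown only to convey why truncation helps.)
    """
    p_clipped = min(max(p, eps), 1 - eps)
    return log_loss(p_clipped, y)

# A confident wrong prediction has infinite raw loss but bounded
# truncated loss.
print(truncated_log_loss(0.0, 1))  # bounded by -log(eps)
```

The trade-off, which the paper's lower bound speaks to, is that clipping biases the predictions, so the truncation level must be chosen adaptively rather than fixed in advance.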
Original language: English
State: Published - 19 Oct 2020


