Private learning of halfspaces: Simplifying the construction & reducing the sample complexity

Haim Kaplan, Yishay Mansour, Uri Stemmer, Eliad Tsfadia

Research output: Contribution to journal › Conference article › peer-review

10 Scopus citations

Abstract

We present a differentially private learner for halfspaces over a finite grid G in R^d with sample complexity Õ(d^{2.5} · 2^{log*|G|}), which improves the state-of-the-art result of [Beimel et al., COLT 2019] by a d^2 factor. The building block for our learner is a new differentially private algorithm for approximately solving the linear feasibility problem: given a feasible collection of m linear constraints of the form Ax ≥ b, the task is to privately identify a solution x that satisfies most of the constraints. Our algorithm is iterative, where each iteration determines the next coordinate of the constructed solution x.
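The iterative structure described above can be illustrated with a toy sketch. This is not the paper's actual algorithm: it is a minimal, hypothetical illustration in which each coordinate of x is chosen from the finite grid via the exponential mechanism, scoring a candidate value by how many constraints ⟨a_i, x⟩ ≥ b_i could still be met given a best-case contribution from the yet-unfixed coordinates. The function name, scoring rule, and parameters are assumptions for illustration only.

```python
import math
import random

def private_feasibility_sketch(A, b, grid, eps=1.0, seed=0):
    """Toy sketch (NOT the paper's algorithm): build x one coordinate
    at a time.  Each coordinate is drawn from the finite grid with the
    exponential mechanism; a candidate value v is scored by the number
    of constraints <a_i, x> >= b_i that can still be satisfied in the
    best case by the remaining, yet-unfixed coordinates."""
    rng = random.Random(seed)
    m, d = len(A), len(A[0])
    lo, hi = min(grid), max(grid)
    partial = [0.0] * m          # running partial inner products <a_i, x>
    x = []
    for j in range(d):
        scores = []
        for v in grid:
            cnt = 0
            for i in range(m):
                s = partial[i] + A[i][j] * v
                # best-case contribution of the remaining coordinates
                slack = sum(max(A[i][k] * lo, A[i][k] * hi)
                            for k in range(j + 1, d))
                if s + slack >= b[i]:
                    cnt += 1
            scores.append(cnt)
        # Exponential mechanism: one constraint changes each score by
        # at most 1, so sensitivity is 1 and weights use exp(eps*s/2).
        weights = [math.exp(eps * s / 2) for s in scores]
        r, acc, choice = rng.random() * sum(weights), 0.0, grid[-1]
        for v, w in zip(grid, weights):
            acc += w
            if r <= acc:
                choice = v
                break
        x.append(choice)
        for i in range(m):
            partial[i] += A[i][j] * choice
    return x
```

The sketch only conveys the shape of the approach (fix one coordinate per iteration, select it privately from the grid); the paper's algorithm achieves its sample-complexity bound with a different, carefully analyzed selection rule.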

Original language: English
Journal: Advances in Neural Information Processing Systems
Volume: 2020-December
State: Published - 1 Jan 2020
Event: 34th Conference on Neural Information Processing Systems, NeurIPS 2020 - Virtual, Online
Duration: 6 Dec 2020 - 12 Dec 2020

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
