Abstract
We present a differentially private learner for halfspaces over a finite grid G in R^d with sample complexity ≈ d^{2.5} · 2^{log* |G|}, which improves the state-of-the-art result of [Beimel et al., COLT 2019] by a d^2 factor. The building block for our learner is a new differentially private algorithm for approximately solving the linear feasibility problem: given a feasible collection of m linear constraints of the form Ax ≥ b, the task is to privately identify a solution x that satisfies most of the constraints. Our algorithm is iterative, where each iteration determines the next coordinate of the constructed solution x.
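To make the objective of the linear feasibility problem concrete, here is a minimal non-private sketch (assuming NumPy; the function name `fraction_satisfied` and the toy data are illustrative, not from the paper) that measures how many of the m constraints Ax ≥ b a candidate solution x satisfies — the quantity the private solver is required to keep large.

```python
import numpy as np

def fraction_satisfied(A: np.ndarray, b: np.ndarray, x: np.ndarray) -> float:
    """Fraction of the m constraints A @ x >= b met by the candidate x.

    A: (m, d) constraint matrix, b: (m,) right-hand sides, x: (d,) candidate.
    This is the (non-private) quality measure the learner must keep close to 1.
    """
    return float(np.mean(A @ x >= b))

# Toy example: 4 constraints in R^2, all satisfied by x = (1, 1).
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [-1.0, 2.0]])
b = np.array([0.5, 0.5, 1.0, 0.0])
x = np.array([1.0, 1.0])
print(fraction_satisfied(A, b, x))  # 1.0
```

The paper's actual solver builds x coordinate by coordinate under differential privacy; the sketch above only illustrates the acceptance criterion, not that construction.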
| Original language | English |
|---|---|
| Journal | Advances in Neural Information Processing Systems |
| Volume | 2020-December |
| State | Published - 1 Jan 2020 |
| Event | 34th Conference on Neural Information Processing Systems, NeurIPS 2020 - Virtual, Online. Duration: 6 Dec 2020 → 12 Dec 2020 |
ASJC Scopus subject areas
- Computer Networks and Communications
- Information Systems
- Signal Processing