TY - GEN
T1 - On the Sample Complexity of Privately Learning Axis-Aligned Rectangles
AU - Sadigurschi, Menachem
AU - Stemmer, Uri
N1 - Publisher Copyright:
© 2021 Neural information processing systems foundation. All rights reserved.
PY - 2021/1/1
Y1 - 2021/1/1
N2 - We revisit the fundamental problem of learning Axis-Aligned Rectangles over a finite grid X^d ⊆ R^d with differential privacy. Existing results show that the sample complexity of this problem is at most min{ d·log|X|, d^1.5·(log* |X|)^1.5 }. That is, existing constructions either require sample complexity that grows linearly with log|X|, or else it grows super-linearly with the dimension d. We present a novel algorithm that reduces the sample complexity to only Õ( d·(log* |X|)^1.5 ), attaining an optimal dependency on the dimension without requiring the sample complexity to grow with log|X|. The technique used to attain this improvement involves the deletion of "exposed" data points on the go, in a fashion designed to avoid the cost of the adaptive composition theorems. The core of this technique may be of independent interest, introducing a new method for constructing statistically efficient private algorithms.
AB - We revisit the fundamental problem of learning Axis-Aligned Rectangles over a finite grid X^d ⊆ R^d with differential privacy. Existing results show that the sample complexity of this problem is at most min{ d·log|X|, d^1.5·(log* |X|)^1.5 }. That is, existing constructions either require sample complexity that grows linearly with log|X|, or else it grows super-linearly with the dimension d. We present a novel algorithm that reduces the sample complexity to only Õ( d·(log* |X|)^1.5 ), attaining an optimal dependency on the dimension without requiring the sample complexity to grow with log|X|. The technique used to attain this improvement involves the deletion of "exposed" data points on the go, in a fashion designed to avoid the cost of the adaptive composition theorems. The core of this technique may be of independent interest, introducing a new method for constructing statistically efficient private algorithms.
UR - http://www.scopus.com/inward/record.url?scp=85131882514&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85131882514
T3 - Advances in Neural Information Processing Systems
SP - 28286
EP - 28297
BT - Advances in Neural Information Processing Systems 34 - 35th Conference on Neural Information Processing Systems, NeurIPS 2021
A2 - Ranzato, Marc'Aurelio
A2 - Beygelzimer, Alina
A2 - Dauphin, Yann
A2 - Liang, Percy S.
A2 - Wortman Vaughan, Jenn
PB - Neural information processing systems foundation
T2 - 35th Conference on Neural Information Processing Systems, NeurIPS 2021
Y2 - 6 December 2021 through 14 December 2021
ER -