Abstract
A new line of work, starting with Dwork et al. (STOC, 2015), demonstrates how differential privacy can be used as a mathematical tool for guaranteeing generalization in adaptive data analysis. Specifically, if a differentially private analysis is applied on a sample S of i.i.d. examples to select a low-sensitivity function f, then w.h.p. f(S) is close to its expectation, even though f is being chosen adaptively, i.e., based on the data. Very recently, Steinke and Ullman observed that these generalization guarantees can be used for proving concentration bounds in the non-adaptive setting, where the low-sensitivity function is fixed beforehand. In particular, they obtain alternative proofs for classical concentration bounds for low-sensitivity functions, such as the Chernoff bound and McDiarmid's inequality. In this work, we extend this connection between differential privacy and concentration bounds, and show that differential privacy can be used to prove concentration of functions that are not low-sensitivity.
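As a quick illustration of the low-sensitivity notion the abstract refers to (this sketch is not from the paper, and all names in it are hypothetical): the empirical mean of n values in [0, 1] has sensitivity 1/n, since replacing a single example moves the output by at most 1/n, and Chernoff-style concentration then says f(S) stays close to its expectation w.h.p.

```python
import random

random.seed(0)

def empirical_mean(sample):
    # The empirical mean of n values in [0, 1] is a low-sensitivity
    # function: replacing one example changes it by at most 1/n.
    return sum(sample) / len(sample)

n = 1000
p = 0.5  # Bernoulli(p) examples, so E[f(S)] = p

# Sensitivity check: flipping one example moves the mean by at most 1/n.
sample = [random.random() < p for _ in range(n)]
swapped = list(sample)
swapped[0] = not swapped[0]
assert abs(empirical_mean(swapped) - empirical_mean(sample)) <= 1.0 / n + 1e-12

# Concentration (Chernoff-style): over many fresh i.i.d. samples,
# f(S) stays close to its expectation p.
deviations = [
    abs(empirical_mean([random.random() < p for _ in range(n)]) - p)
    for _ in range(200)
]
print(max(deviations))  # small compared to the range [0, 1] of f
```

This only illustrates the classical non-adaptive setting; the point of the line of work above is that differential privacy yields such guarantees even when f is chosen adaptively based on the data.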
| Original language | English GB |
|---|---|
| Pages (from-to) | 1-33 |
| Journal | Journal of Privacy and Confidentiality |
| Volume | 9 |
| Issue number | 1 |
| DOIs | |
| State | Published - 2019 |