TY - GEN
T1 - Differentially Private Sinkhorn Algorithm
AU - Wang, Jiaqi
AU - Goldfeld, Ziv
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024/1/1
Y1 - 2024/1/1
N2 - Optimal transport (OT) theory plays a central role in the design and analysis of various machine learning algorithms. As such, approximate computation of the OT cost between large-scale datasets via the popular Sinkhorn algorithm forms a basic primitive. However, this approach may lead to privacy violations when dealing with datasets that contain sensitive information. To address this predicament, we propose a differentially private variant of the Sinkhorn algorithm and couple it with formal guarantees by deriving its privacy-utility tradeoff (PUT). To that end, the Sinkhorn algorithm is treated as a block coordinate descent scheme, which we privatize by injecting Gaussian noise into the iterates. We establish a linear convergence rate for our private Sinkhorn algorithm and analyze its privacy by controlling the Rényi divergence between outputs corresponding to neighboring input datasets. Combining these results, we obtain the desired PUT. In doing so, this work also closes an existing gap in formal guarantees for private constrained nonlinear optimization. As an application, we employ the noisy Sinkhorn algorithm for differentially private (approximate) computation of the OT cost and derive insights from its PUT.
AB - Optimal transport (OT) theory plays a central role in the design and analysis of various machine learning algorithms. As such, approximate computation of the OT cost between large-scale datasets via the popular Sinkhorn algorithm forms a basic primitive. However, this approach may lead to privacy violations when dealing with datasets that contain sensitive information. To address this predicament, we propose a differentially private variant of the Sinkhorn algorithm and couple it with formal guarantees by deriving its privacy-utility tradeoff (PUT). To that end, the Sinkhorn algorithm is treated as a block coordinate descent scheme, which we privatize by injecting Gaussian noise into the iterates. We establish a linear convergence rate for our private Sinkhorn algorithm and analyze its privacy by controlling the Rényi divergence between outputs corresponding to neighboring input datasets. Combining these results, we obtain the desired PUT. In doing so, this work also closes an existing gap in formal guarantees for private constrained nonlinear optimization. As an application, we employ the noisy Sinkhorn algorithm for differentially private (approximate) computation of the OT cost and derive insights from its PUT.
UR - http://www.scopus.com/inward/record.url?scp=85211098890&partnerID=8YFLogxK
U2 - 10.1109/Allerton63246.2024.10735319
DO - 10.1109/Allerton63246.2024.10735319
M3 - Conference contribution
AN - SCOPUS:85211098890
T3 - 2024 60th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2024
BT - 2024 60th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2024
PB - Institute of Electrical and Electronics Engineers
T2 - 60th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2024
Y2 - 24 September 2024 through 27 September 2024
ER -
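
The abstract describes privatizing Sinkhorn by viewing it as block coordinate descent on the dual potentials and injecting Gaussian noise into the iterates. Below is a minimal illustrative sketch of that idea in Python; the noise scale sigma, the choice to perturb both potential blocks in every round, and the dual-objective cost estimate are simplifying assumptions for illustration, not the paper's exact mechanism or its calibrated privacy parameters.

# Illustrative sketch of a Gaussian-noise-perturbed Sinkhorn iteration in the
# spirit of the abstract above. The noise scale sigma, the decision to noise
# both dual-potential blocks each round, and the final cost estimate are
# assumptions made for this sketch only.
import numpy as np
from scipy.special import logsumexp


def noisy_sinkhorn(C, p, q, eps=0.1, sigma=0.01, n_iters=200, seed=0):
    """Log-domain Sinkhorn (block coordinate updates of the dual potentials)
    with Gaussian noise injected into each block update."""
    rng = np.random.default_rng(seed)
    n, m = C.shape
    f = np.zeros(n)  # dual potential for the source marginal p
    g = np.zeros(m)  # dual potential for the target marginal q
    for _ in range(n_iters):
        # Exact block update of f given g, followed by Gaussian perturbation.
        f = -eps * logsumexp((g[None, :] - C) / eps, b=q[None, :], axis=1)
        f = f + sigma * rng.standard_normal(n)
        # Exact block update of g given f, followed by Gaussian perturbation.
        g = -eps * logsumexp((f[:, None] - C) / eps, b=p[:, None], axis=0)
        g = g + sigma * rng.standard_normal(m)
    # Entropic dual objective as a (noisy) estimate of the regularized OT cost.
    dual = p @ f + q @ g - eps * (
        np.exp((f[:, None] + g[None, :] - C) / eps) * np.outer(p, q)
    ).sum() + eps
    return f, g, dual


# Tiny usage example on random empirical distributions.
rng = np.random.default_rng(1)
x, y = rng.normal(size=(5, 2)), rng.normal(size=(6, 2))
C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)  # squared Euclidean cost
p = np.full(5, 1 / 5)
q = np.full(6, 1 / 6)
_, _, cost = noisy_sinkhorn(C, p, q)
print(f"noisy entropic OT cost estimate: {cost:.4f}")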