TY - GEN
T1 - Private approximation of search problems
AU - Beimel, Amos
AU - Carmi, Paz
AU - Nissim, Kobbi
AU - Weinreb, Enav
PY - 2006/1/1
Y1 - 2006/1/1
N2 - Many approximation algorithms have been presented in recent decades for hard search problems. The focus of this paper is on cryptographic applications, where it is desirable to design algorithms that do not leak unnecessary information. Specifically, we are interested in private approximation algorithms: efficient algorithms whose output does not leak information beyond what is implied by the optimal solutions to the search problem. Privacy requirements add constraints to approximation algorithms; in particular, known approximation algorithms usually leak a lot of information. For functions, [Feigenbaum et al., ICALP 2001] presented a natural requirement that a private algorithm should not leak information not implied by the original function. Generalizing this requirement to search problems is not straightforward, as an input may have many different outputs. We present a new definition that captures a minimal privacy requirement for such algorithms: applied to an input instance, an algorithm should not leak any information that is not implied by the instance's collection of exact solutions. Although our privacy requirement seems minimal, we show that for well-studied problems, such as vertex cover and max exact 3SAT, private approximation algorithms are unlikely to exist even for poor approximation ratios. Similarly to [Halevi et al., STOC 2001], we define a relaxed notion of approximation algorithms that leak (little) information, and we demonstrate the applicability of this notion by presenting near-optimal approximation algorithms for max exact 3SAT that leak little information.
KW - Algorithms
KW - Theory
UR - http://www.scopus.com/inward/record.url?scp=33748093542&partnerID=8YFLogxK
DO - 10.1145/1132516.1132533
M3 - Conference contribution
AN - SCOPUS:33748093542
SN - 1595931341
SN - 9781595931344
T3 - Proceedings of the Annual ACM Symposium on Theory of Computing
SP - 119
EP - 128
BT - STOC'06
PB - Association for Computing Machinery
T2 - 38th Annual ACM Symposium on Theory of Computing, STOC'06
Y2 - 21 May 2006 through 23 May 2006
ER -