Abstract
Advice complexity, introduced by Karp and Lipton, asks how many bits of "help" suffice to accept a given language. The notion combines aspects of both informational and computational complexity, and captures non-uniform complexity. We are concerned with the connection between this notion and P-selective sets. The main question we study in our paper is how complex the advice must be, as a function of the power of the interpreter, from the standpoint of average-case complexity. In the deterministic case, Ko proved that quadratic advice suffices, and Hemaspaandra and Torenvliet showed that linear advice is required; closing this gap is a long-standing open problem. We prove that in the probabilistic case advice of linear size is enough, as long as the advice may depend on the randomness. This is the first sub-quadratic result for the class P-sel for bounded-error probabilistic machines. As a consequence, we obtain several Karp-Lipton-type theorems. Our methods build on several fundamental concepts of theoretical computer science, such as hardness amplification and von Neumann's minimax theorem, and demonstrate surprising connections between them and the seemingly unrelated notion of selectivity.
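As background for the abstract's central notion: a set is P-selective if a polynomial-time "selector" can, given any two strings, output one of them that belongs to the set whenever at least one of them does. A minimal sketch using the classic left-cut example (the threshold `R` and the function names are illustrative, not from the paper):

```python
# A minimal sketch of P-selectivity, using the standard "left cut" example:
# for a fixed threshold R, the language L = { x : value(x) <= R } is
# P-selective, because a selector may simply prefer the smaller input.

R = 1000  # hypothetical threshold defining the left-cut language


def in_L(x: str) -> bool:
    """Membership in the left-cut language L = { x : int(x, 2) <= R }."""
    return int(x, 2) <= R


def selector(x: str, y: str) -> str:
    """P-selector for L: return the numerically smaller input.

    Defining property: if x is in L or y is in L, then the returned
    string is in L, since membership is downward closed under <=.
    """
    return x if int(x, 2) <= int(y, 2) else y
```

The selector runs in polynomial time and never needs to decide membership itself; it only has to rank the two candidates, which is exactly what makes selectivity weaker than decidability in P.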
Original language | English |
---|---|
Pages (from-to) | 470-481 |
Number of pages | 12 |
Journal | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) |
Volume | 7434 LNCS |
State | Published - 6 Sep 2012 |
Event | 18th Annual International Computing and Combinatorics Conference, COCOON 2012 - Sydney, NSW, Australia Duration: 20 Aug 2012 → 22 Aug 2012 |
ASJC Scopus subject areas
- Theoretical Computer Science
- General Computer Science