TY - GEN

T1 - Almost Chor-Goldreich Sources and Adversarial Random Walks

AU - Doron, Dean

AU - Moshkovitz, Dana

AU - Oh, Justin

AU - Zuckerman, David

N1 - Publisher Copyright:
© 2023 Owner/Author.

PY - 2023/6/2

Y1 - 2023/6/2

N2 - A Chor-Goldreich (CG) source is a sequence of random variables X = X1 ∘ ... ∘ Xt, where each Xi ∼ {0,1}^d and Xi has δd min-entropy conditioned on any fixing of X1 ∘ ... ∘ Xi-1. The parameter 0 < δ ≤ 1 is the entropy rate of the source. We typically think of d as constant and t as growing. We extend this notion in several ways, defining almost CG sources. Most notably, we allow each Xi to only have conditional Shannon entropy δd. We achieve pseudorandomness results for almost CG sources which were not known to hold even for standard CG sources, and even for the weaker model of Santha-Vazirani sources: We construct a deterministic condenser that on input X, outputs a distribution which is close to having constant entropy gap, namely a distribution Z ∼ {0,1}^m for m ≈ δdt with min-entropy m - O(1). Therefore, we can simulate any randomized algorithm with small failure probability using almost CG sources with no multiplicative slowdown. This result extends to randomized protocols as well, and to any setting in which we cannot simply cycle over all seeds and a "one-shot" simulation is needed. Moreover, our construction works in an online manner, since it is based on random walks on expanders. Our main technical contribution is a novel analysis of random walks, which should be of independent interest. We analyze walks with adversarially correlated steps, each step being entropy-deficient, on good enough lossless expanders. We prove that such walks (or certain interleaved walks on two expanders), starting from a fixed vertex and walking according to X1 ∘ ... ∘ Xt, accumulate most of the entropy in X.

AB - A Chor-Goldreich (CG) source is a sequence of random variables X = X1 ∘ ... ∘ Xt, where each Xi ∼ {0,1}^d and Xi has δd min-entropy conditioned on any fixing of X1 ∘ ... ∘ Xi-1. The parameter 0 < δ ≤ 1 is the entropy rate of the source. We typically think of d as constant and t as growing. We extend this notion in several ways, defining almost CG sources. Most notably, we allow each Xi to only have conditional Shannon entropy δd. We achieve pseudorandomness results for almost CG sources which were not known to hold even for standard CG sources, and even for the weaker model of Santha-Vazirani sources: We construct a deterministic condenser that on input X, outputs a distribution which is close to having constant entropy gap, namely a distribution Z ∼ {0,1}^m for m ≈ δdt with min-entropy m - O(1). Therefore, we can simulate any randomized algorithm with small failure probability using almost CG sources with no multiplicative slowdown. This result extends to randomized protocols as well, and to any setting in which we cannot simply cycle over all seeds and a "one-shot" simulation is needed. Moreover, our construction works in an online manner, since it is based on random walks on expanders. Our main technical contribution is a novel analysis of random walks, which should be of independent interest. We analyze walks with adversarially correlated steps, each step being entropy-deficient, on good enough lossless expanders. We prove that such walks (or certain interleaved walks on two expanders), starting from a fixed vertex and walking according to X1 ∘ ... ∘ Xt, accumulate most of the entropy in X.

KW - Santha-Vazirani sources

KW - condensers

KW - expander graphs

KW - extractors

KW - random walks

KW - randomized algorithm

UR - http://www.scopus.com/inward/record.url?scp=85163137820&partnerID=8YFLogxK

U2 - 10.1145/3564246.3585134

DO - 10.1145/3564246.3585134

M3 - Conference contribution

AN - SCOPUS:85163137820

T3 - Proceedings of the Annual ACM Symposium on Theory of Computing

SP - 1

EP - 9

BT - STOC 2023 - Proceedings of the 55th Annual ACM Symposium on Theory of Computing

A2 - Saha, Barna

A2 - Servedio, Rocco A.

PB - Association for Computing Machinery

T2 - 55th Annual ACM Symposium on Theory of Computing, STOC 2023

Y2 - 20 June 2023 through 23 June 2023

ER -