TY - GEN

T1 - Deterministic extractors for independent-symbol sources

AU - Lee, Chia-Jung

AU - Lu, Chi-Jen

AU - Tsai, Shi-Chun

PY - 2006

Y1 - 2006

N2 - In this paper, we consider the task of deterministically extracting randomness from sources consisting of a sequence of n independent symbols from {0, 1}^d. The only randomness guarantee on such a source is that the whole source has min-entropy k. We give an explicit deterministic extractor which can extract Ω(log k - log d - log log(1/ε)) bits with error ε, for any n, d, k ∈ ℕ and ε ∈ (0, 1). For sources with a larger min-entropy, we can extract even more randomness. When k ≥ n^(1/2+γ), for any constant γ ∈ (0, 1/2), we can extract m = k - O(d log(1/ε)) bits with any error ε ≥ 2^(-Ω(n^γ)). When k ≥ log^c n, for some constant c > 0, we can extract m = k - d(1/ε)^O(1) bits with any error ε ≥ k^(-Ω(1)). Our results generalize those of Kamp & Zuckerman and Gabizon et al., which only work for bit-fixing sources (with d = 1 and each bit of the source being either fixed or perfectly random). Moreover, we show the existence of a non-explicit deterministic extractor which can extract m = k - O(log(1/ε)) bits whenever k = ω(d + log(n/ε)). Finally, we show that even to extract from bit-fixing sources, any extractor, seeded or not, must suffer an entropy loss k - m = Ω(log(1/ε)). This generalizes a lower bound of Radhakrishnan & Ta-Shma with respect to general sources.

UR - http://www.scopus.com/inward/record.url?scp=33746379296&partnerID=8YFLogxK

U2 - 10.1007/11786986_9

DO - 10.1007/11786986_9

M3 - Conference contribution

AN - SCOPUS:33746379296

SN - 3540359044

SN - 9783540359043

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 84

EP - 95

BT - Automata, Languages and Programming - 33rd International Colloquium, ICALP 2006, Proceedings

PB - Springer-Verlag

T2 - 33rd International Colloquium on Automata, Languages and Programming, ICALP 2006

Y2 - 10 July 2006 through 14 July 2006

ER -