TY - GEN

T1 - Extracting computational entropy and learning noisy linear functions

AU - Lee, Chia-Jung

AU - Lu, Chi-Jen

AU - Tsai, Shi-Chun

PY - 2009

Y1 - 2009

N2 - We study the task of deterministically extracting randomness from sources containing computational entropy. The sources we consider have the form of a conditional distribution (f(χ)|χ), for some function f and some distribution χ, and we say that such a source has computational min-entropy k if any circuit of size 2^k can only predict f(x) correctly with probability at most 2^(-k) given input x sampled from χ. We first show that it is impossible to have a seedless extractor that extracts from one single source of this kind. Then we show that it becomes possible if we are allowed a seed which is weakly random (instead of perfectly random) but contains some statistical min-entropy, or even a seed which is not random at all but contains some computational min-entropy. This can be seen as a step toward extending the study of multi-source extractors from the traditional, statistical setting to a computational setting. We reduce the task of constructing such extractors to a problem in learning theory: learning linear functions under an arbitrary distribution with adversarial noise. For this problem, we provide a learning algorithm, which may be of independent interest.

AB - We study the task of deterministically extracting randomness from sources containing computational entropy. The sources we consider have the form of a conditional distribution (f(χ)|χ), for some function f and some distribution χ, and we say that such a source has computational min-entropy k if any circuit of size 2^k can only predict f(x) correctly with probability at most 2^(-k) given input x sampled from χ. We first show that it is impossible to have a seedless extractor that extracts from one single source of this kind. Then we show that it becomes possible if we are allowed a seed which is weakly random (instead of perfectly random) but contains some statistical min-entropy, or even a seed which is not random at all but contains some computational min-entropy. This can be seen as a step toward extending the study of multi-source extractors from the traditional, statistical setting to a computational setting. We reduce the task of constructing such extractors to a problem in learning theory: learning linear functions under an arbitrary distribution with adversarial noise. For this problem, we provide a learning algorithm, which may be of independent interest.

UR - http://www.scopus.com/inward/record.url?scp=76249103483&partnerID=8YFLogxK

U2 - 10.1007/978-3-642-02882-3_34

DO - 10.1007/978-3-642-02882-3_34

M3 - Conference contribution

AN - SCOPUS:76249103483

SN - 3642028810

SN - 9783642028816

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 338

EP - 347

BT - Computing and Combinatorics - 15th Annual International Conference, COCOON 2009, Proceedings

T2 - 15th Annual International Conference on Computing and Combinatorics, COCOON 2009

Y2 - 13 July 2009 through 15 July 2009

ER -