TY - JOUR

T1 - Csiszár's cutoff rates for the general hypothesis testing problem

AU - Alajaji, Fady

AU - Chen, Po-Ning

AU - Rached, Ziad

PY - 2004/4/1

Y1 - 2004/4/1

N2 - In [6], Csiszár established the concept of forward β-cutoff rate for the error exponent hypothesis testing problem based on independent and identically distributed (i.i.d.) observations. Given β < 0, he defined the forward β-cutoff rate as the number R0 ≥ 0 that provides the best possible lower bound of the form β(E - R0) to the type 1 error exponent function for hypothesis testing, where 0 < E < R0 is the rate of exponential convergence to 0 of the type 2 error probability. He then demonstrated that the forward β-cutoff rate is given by D_{1/(1-β)}(X∥X̄), where D_α(X∥X̄) denotes the Rényi α-divergence [19], α > 0, α ≠ 1. Similarly, for 0 < β < 1, Csiszár also established the concept of reverse β-cutoff rate for the correct exponent hypothesis testing problem. In this work, we extend Csiszár's results by investigating the forward and reverse β-cutoff rates for hypothesis testing between two arbitrary sources with memory. We demonstrate that the lim inf Rényi α-divergence rate provides the expression for the forward β-cutoff rate. We also show that if the log-likelihood large deviation spectrum admits a limit, then the reverse β-cutoff rate equals the lim inf α-divergence rate, where α = 1/(1-β) and 0 < β < βmax, with βmax being the largest β < 1 for which the lim inf 1/(1-β)-divergence rate is finite. For βmax ≤ β ≤ 1, we show that the reverse cutoff rate is in general only upper-bounded by the lim inf Rényi divergence rate. Unlike in [4], where the alphabet for the source coding cutoff rate problem was assumed to be finite, we assume an arbitrary (countable or continuous) source alphabet. We also provide several examples to illustrate our forward and reverse β-cutoff rate results and the techniques employed to establish them.

AB - In [6], Csiszár established the concept of forward β-cutoff rate for the error exponent hypothesis testing problem based on independent and identically distributed (i.i.d.) observations. Given β < 0, he defined the forward β-cutoff rate as the number R0 ≥ 0 that provides the best possible lower bound of the form β(E - R0) to the type 1 error exponent function for hypothesis testing, where 0 < E < R0 is the rate of exponential convergence to 0 of the type 2 error probability. He then demonstrated that the forward β-cutoff rate is given by D_{1/(1-β)}(X∥X̄), where D_α(X∥X̄) denotes the Rényi α-divergence [19], α > 0, α ≠ 1. Similarly, for 0 < β < 1, Csiszár also established the concept of reverse β-cutoff rate for the correct exponent hypothesis testing problem. In this work, we extend Csiszár's results by investigating the forward and reverse β-cutoff rates for hypothesis testing between two arbitrary sources with memory. We demonstrate that the lim inf Rényi α-divergence rate provides the expression for the forward β-cutoff rate. We also show that if the log-likelihood large deviation spectrum admits a limit, then the reverse β-cutoff rate equals the lim inf α-divergence rate, where α = 1/(1-β) and 0 < β < βmax, with βmax being the largest β < 1 for which the lim inf 1/(1-β)-divergence rate is finite. For βmax ≤ β ≤ 1, we show that the reverse cutoff rate is in general only upper-bounded by the lim inf Rényi divergence rate. Unlike in [4], where the alphabet for the source coding cutoff rate problem was assumed to be finite, we assume an arbitrary (countable or continuous) source alphabet. We also provide several examples to illustrate our forward and reverse β-cutoff rate results and the techniques employed to establish them.

KW - α-divergence rate

KW - Arbitrary sources with memory

KW - Forward and reverse cutoff rates

KW - Hypothesis testing error and correct exponents

KW - Information spectrum

KW - Large deviation theory

UR - http://www.scopus.com/inward/record.url?scp=1942454256&partnerID=8YFLogxK

U2 - 10.1109/TIT.2004.825040

DO - 10.1109/TIT.2004.825040

M3 - Letter

AN - SCOPUS:1942454256

SN - 0018-9448

VL - 50

SP - 663

EP - 678

JO - IEEE Transactions on Information Theory

JF - IEEE Transactions on Information Theory

IS - 4

ER -