Generalized source coding theorems and hypothesis testing: Part I — Information measures

Po-Ning Chen*, Fady Alajaji

*Corresponding author for this work

Research output: Article › peer-reviewed

1 Citation (Scopus)

Abstract

Expressions for the ϵ-entropy rate, ϵ-mutual information rate and ϵ-divergence rate are introduced. These quantities, which consist of the quantiles of the asymptotic information spectra, generalize the inf/sup-entropy/information/divergence rates of Han and Verdú. The algebraic properties of these information measures are rigorously analyzed, and examples illustrating their use in the computation of the ϵ-capacity are presented. In Part II of this work, these measures are employed to prove general source coding theorems for block codes, and the general formula of the Neyman-Pearson hypothesis testing type-II error exponent subject to upper bounds on the type-I error probability.
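As an informal illustration of the quantile idea behind the ϵ-entropy rate (a minimal sketch, not the paper's construction; the source model, block length, and estimator below are illustrative assumptions): for an i.i.d. Bernoulli(p) source, the normalized self-information −(1/n) log₂ P(Xⁿ) concentrates around the entropy H(p), so the empirical ϵ-quantile of this "information spectrum" approaches H(p) for any ϵ ∈ (0, 1).

```python
import math
import random

def binary_entropy(p):
    """Binary entropy H(p) in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def normalized_self_information(block, p):
    """-(1/n) log2 P(x^n) for a block drawn i.i.d. Bernoulli(p)."""
    n = len(block)
    log_prob = sum(math.log2(p) if b else math.log2(1 - p) for b in block)
    return -log_prob / n

def empirical_eps_quantile(samples, eps):
    """eps-quantile of the empirical information spectrum."""
    s = sorted(samples)
    idx = min(int(eps * len(s)), len(s) - 1)
    return s[idx]

# Illustrative parameters (assumed, not from the paper).
random.seed(0)
p, n, trials, eps = 0.3, 2000, 500, 0.5

spectrum = [
    normalized_self_information([random.random() < p for _ in range(n)], p)
    for _ in range(trials)
]
estimate = empirical_eps_quantile(spectrum, eps)
print(estimate, binary_entropy(p))
```

For memoryless sources the spectrum is degenerate and every quantile recovers the ordinary entropy rate; the ϵ-dependence becomes meaningful for general (non-ergodic or non-stationary) sources, which is the setting the paper addresses.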

