Generalized source coding theorems and hypothesis testing: Part I — Information measures

Po-Ning Chen*, Fady Alajaji

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

Expressions for the ϵ-entropy rate, ϵ-mutual information rate, and ϵ-divergence rate are introduced. These quantities, defined as quantiles of the asymptotic information spectra, generalize the inf/sup-entropy/information/divergence rates of Han and Verdú. The algebraic properties of these information measures are rigorously analyzed, and examples illustrating their use in the computation of the ϵ-capacity are presented. In Part II of this work, these measures are employed to prove general source coding theorems for block codes and to establish the general formula for the type-II error exponent of Neyman-Pearson hypothesis testing subject to upper bounds on the type-I error probability.
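
As a brief, hedged illustration of the quantile construction the abstract refers to (a minimal sketch in the information-spectrum notation of Han and Verdú; the exact inequality and limit conventions, e.g. open versus closed quantiles, are those fixed in the paper itself), the ϵ-sup-entropy rate of a source $\mathbf{X}=\{X^n\}_{n\geq 1}$ can be written as a quantile of the asymptotic distribution of the normalized entropy density:

\[
\bar{H}_{\epsilon}(\mathbf{X}) \;=\; \inf\left\{\theta \,:\, \limsup_{n\to\infty}\Pr\!\left[\frac{1}{n}\log\frac{1}{P_{X^n}(X^n)} > \theta\right] \leq \epsilon\right\}, \qquad 0 \leq \epsilon < 1.
\]

Setting ϵ = 0 recovers the sup-entropy rate $\bar{H}(\mathbf{X})$ of Han and Verdú (the limsup in probability of the entropy density), which is the sense in which these ϵ-quantities generalize the inf/sup rates.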

Original language: English
Pages (from-to): 283-292
Number of pages: 10
Journal: Journal of the Chinese Institute of Engineers, Transactions of the Chinese Institute of Engineers, Series A / Chung-kuo Kung Ch'eng Hsueh K'an
Volume: 21
Issue number: 3
DOIs
State: Published - 1 Jan 1998

Keywords

  • Divergence
  • ϵ-capacity
  • Entropy
  • Information theory
  • Mutual information
