Expressions for the ϵ-entropy rate, ϵ-mutual information rate, and ϵ-divergence rate are introduced. These quantities, defined via the quantiles of the asymptotic information spectra, generalize the inf/sup-entropy/information/divergence rates of Han and Verdú. The algebraic properties of these information measures are rigorously analyzed, and examples illustrating their use in the computation of the ϵ-capacity are presented. In Part II of this work, these measures are employed to prove general source coding theorems for block codes, as well as the general formula for the Neyman-Pearson hypothesis testing type-II error exponent subject to upper bounds on the type-I error probability.
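As a hedged sketch of the kind of quantity the abstract describes (the notation and the strict/non-strict inequality convention are assumptions, not taken from the abstract): a quantile-based ϵ-entropy rate can be written in terms of the spectrum of the normalized entropy density $\frac{1}{n}\log\frac{1}{P_{X^n}(X^n)}$ as

```latex
% Hedged sketch; the exact quantile convention (strict vs. non-strict
% inequality, limsup vs. liminf) may differ from the paper's definition.
H_\varepsilon(X) \;\triangleq\; \sup\Bigl\{\beta \;:\;
  \limsup_{n\to\infty}\,
  \Pr\Bigl[\tfrac{1}{n}\log\tfrac{1}{P_{X^n}(X^n)} < \beta\Bigr]
  \le \varepsilon \Bigr\},
\qquad 0 \le \varepsilon < 1 .
```

Under this convention, taking $\varepsilon = 0$ recovers a liminf-in-probability quantity of the Han-Verdú inf-entropy-rate type, which is the sense in which the ϵ-quantities generalize the inf/sup rates.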
|Pages (from - to)||283-292|
|Journal||Journal of the Chinese Institute of Engineers, Transactions of the Chinese Institute of Engineers, Series A/Chung-kuo Kung Ch'eng Hsuch K'an|
|Publication status||Published - 1 Jan 1998|