Kullback-Leibler Divergence and Akaike Information Criterion in General Hidden Markov Models

Cheng Der Fuh, Chu Lan Michael Kao*, Tianxiao Pang

*Corresponding author for this work

Research output: Article (peer-reviewed)

1 citation (Scopus)

Abstract

To characterize the Kullback-Leibler divergence and the Fisher information in general parametrized hidden Markov models, in this paper we first show that the log-likelihood and its derivatives can be represented as an additive functional of a Markovian iterated function system, and then provide explicit characterizations of these two quantities through this representation. Moreover, we show that the Kullback-Leibler divergence can be locally approximated by a quadratic function determined by the Fisher information. Results relating to the Cramér-Rao lower bound and the Hájek-Le Cam local asymptotic minimax theorem are also given. As an application of our results, we provide a theoretical justification for using the Akaike information criterion (AIC) for model selection in general hidden Markov models. Lastly, we study three concrete models: a Gaussian vector autoregressive-moving average model of order (p, q), recurrent neural networks, and a temporal restricted Boltzmann machine, to illustrate our theory.
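For orientation only, the local quadratic approximation and the AIC criterion mentioned in the abstract are sketched below in their standard forms; the notation K(θ0, θ) for the Kullback-Leibler divergence, θ0 for the true parameter, I(θ0) for the Fisher information, L̂ for the maximized likelihood, and k for the number of free parameters is assumed here and is not quoted from the paper, whose precise conditions and constants differ in detail.

% A notational sketch of the standard forms, under the assumptions stated above;
% the paper gives the exact regularity conditions for hidden Markov models.
\[
  K(\theta_0, \theta) \;\approx\; \tfrac{1}{2}\,(\theta - \theta_0)^{\top} I(\theta_0)\,(\theta - \theta_0),
  \qquad
  \mathrm{AIC} \;=\; -2 \log \hat{L} + 2k .
\]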

Original language: English
Pages (from-to): 5888-5909
Number of pages: 22
Journal: IEEE Transactions on Information Theory
Volume: 70
Issue number: 8
DOIs
Publication status: Published - 2024

