Hierarchical Pitman-Yor and Dirichlet process for language model

Jen-Tzung Chien*, Ying Lan Chang

*Corresponding author of this work

Research output: Conference article › Peer-reviewed

1 Citation (Scopus)

Abstract

This paper presents a nonparametric interpretation of modern language modeling based on the hierarchical Pitman-Yor and Dirichlet (HPYD) process. We propose the HPYD language model (HPYD-LM), which flexibly conducts backoff smoothing and topic clustering through Bayesian nonparametric learning. The nonparametric priors of backoff n-grams and latent topics are tightly coupled in a compound process. A hybrid probability measure is drawn to build the smoothed topic-based LM. The model structure is determined automatically from training data. A new Chinese restaurant scenario is proposed to implement the HPYD-LM via Gibbs sampling. This process reflects the power-law property and extracts semantic topics from natural language. The superiority of the HPYD-LM over related LMs is demonstrated by experiments on different corpora in terms of perplexity and word error rate.
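To illustrate the Chinese restaurant scenario and the power-law property mentioned in the abstract, the sketch below simulates the standard Pitman-Yor seating scheme (not the paper's compound HPYD sampler; the discount and concentration values are arbitrary choices for illustration). A new customer joins an existing table k with probability proportional to (n_k − d), or opens a new table with probability proportional to (θ + d·t), which yields the heavy-tailed table-size distribution characteristic of natural-language word frequencies.

```python
import random

def pitman_yor_crp(n_customers, discount=0.5, concentration=1.0, seed=0):
    """Simulate table counts under the Pitman-Yor Chinese restaurant
    process. Returns a list where tables[k] is the number of customers
    seated at table k. Parameter values here are illustrative only."""
    rng = random.Random(seed)
    tables = []  # tables[k] = customers at table k
    for _ in range(n_customers):
        # weight (n_k - d) for each existing table k
        weights = [n_k - discount for n_k in tables]
        # weight (theta + d * t) for opening a new table
        weights.append(concentration + discount * len(tables))
        r = rng.random() * sum(weights)
        for k, w in enumerate(weights):
            r -= w
            if r <= 0:
                break
        if k == len(tables):
            tables.append(1)   # customer opens a new table
        else:
            tables[k] += 1     # customer joins table k
    return tables

tables = pitman_yor_crp(1000)
# A few large tables and many small ones: the power-law behavior
# that motivates Pitman-Yor priors for language modeling.
print(len(tables), sorted(tables, reverse=True)[:5])
```

With a positive discount, the number of occupied tables grows roughly like n^d, so most tables hold only a few customers while a handful dominate, mirroring Zipfian word statistics.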

Original language: English
Pages (from - to): 2212-2216
Number of pages: 5
Journal: Proceedings of the Annual Conference of the International Speech Communication Association, INTERSPEECH
Publication status: Published - 1 January 2013
Event: 14th Annual Conference of the International Speech Communication Association, INTERSPEECH 2013 - Lyon, France
Duration: 25 August 2013 - 29 August 2013
