Approaches to convert RNNLM to BNLM
From cslt Wiki
=related algorithm=
* Gibbs sampling[http://cos.name/2013/01/lda-math-mcmc-and-gibbs-sampling/]
* Kullback–Leibler divergence[http://zh.wikipedia.org/wiki/%E7%9B%B8%E5%AF%B9%E7%86%B5]
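The Kullback–Leibler divergence linked above is the quantity that the variational-approximation line of work minimizes between the long-span neural model and its backoff approximation. A minimal sketch of computing it for two toy next-word distributions (the distributions here are illustrative, not taken from any of the papers):

```python
import math

def kl_divergence(p, q):
    """KL(p || q) for two discrete distributions given as dicts mapping
    words to probabilities (q must cover the support of p)."""
    return sum(p_w * math.log(p_w / q[w]) for w, p_w in p.items() if p_w > 0)

# Toy next-word distributions: p plays the role of the long-span model
# (e.g. an RNNLM), q the approximating backoff n-gram model.
p = {"cat": 0.5, "dog": 0.3, "bird": 0.2}
q = {"cat": 0.4, "dog": 0.4, "bird": 0.2}
print(kl_divergence(p, q))
```

A smaller value means the backoff model q matches the neural model p more closely; the conversion papers search for the q (of fixed n-gram order) that drives this quantity down.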
Revision as of 13:41, 4 November 2014
=main paper=
* Comparing approaches to convert recurrent neural networks into backoff language models for efficient decoding[1]
* Variational approximation of long-span language models for LVCSR[2]
* Conversion of Recurrent Neural Network Language Models to Weighted Finite State Transducers for Automatic Speech Recognition[3]
* Converting Neural Network Language Models into Back-off Language Models for Efficient Decoding in Automatic Speech Recognition[4]
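A common theme in the papers above is sampling-based conversion: draw a text corpus from the trained neural model and estimate a backoff n-gram model from the sampled counts. A toy sketch of that idea, with a hand-written bigram distribution standing in for a trained RNNLM (the `sample_next` function and its table are made up for illustration):

```python
import random
from collections import Counter

random.seed(0)

# Stand-in for a trained RNNLM: next-word distribution given the last word.
# A real conversion would query the neural model's softmax output instead.
def sample_next(history):
    dist = {"the": [("cat", 0.6), ("dog", 0.4)],
            "cat": [("sat", 0.7), ("ran", 0.3)],
            "dog": [("ran", 0.8), ("sat", 0.2)]}.get(history, [("the", 1.0)])
    r, acc = random.random(), 0.0
    for w, prob in dist:
        acc += prob
        if r < acc:
            return w
    return dist[-1][0]

# Sample a corpus from the model, then count n-grams to estimate the
# approximating backoff model (here just maximum-likelihood bigrams).
words = ["the"]
for _ in range(5000):
    words.append(sample_next(words[-1]))
bigrams = Counter(zip(words, words[1:]))
unigrams = Counter(words)
p_cat_given_the = bigrams[("the", "cat")] / unigrams["the"]
```

With enough sampled text, the estimated bigram probability `p_cat_given_the` approaches the source model's 0.6; the papers differ mainly in how the sampling (or a variational objective) is made efficient and how the resulting counts are smoothed and backed off.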