Difference between revisions of "2014-10-9"

Latest revision as of 04:28, 15 October 2014

  • RNNLM with LSTM [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/c/c6/Rnn_cslt.pdf] (see the code sketch after this list)
  • Related papers:
      • Y. Bengio and R. Ducharme. A Neural Probabilistic Language Model. In Neural Information Processing Systems, volume 13, pages 932-938, 2001. [http://www.jmlr.org/papers/volume3/bengio03a/bengio03a.pdf]
      • Holger Schwenk. CSLM - A Modular Open-Source Continuous Space Language Modeling Toolkit. In Interspeech, August 2013. [http://www-lium.univ-lemans.fr/~schwenk/papers/Schwenk.cslm.is2013.pdf]
      • Ashish Vaswani, Yinggong Zhao, Victoria Fossum, and David Chiang. Decoding with Large-Scale Neural Language Models Improves Translation. In Proceedings of EMNLP, 2013. [http://nlg.isi.edu/software/nplm/]
      • RNNLM Toolkit [http://www.fit.vutbr.cz/~imikolov/rnnlm/]
      • Slides from Google [http://www.fit.vutbr.cz/~imikolov/rnnlm/]
      • F. A. Gers. Long Short-Term Memory in Recurrent Neural Networks. PhD thesis, Switzerland, 2001. [http://felixgers.de/papers/phd.pdf]
      • Martin Sundermeyer, Ralf Schlüter, and Hermann Ney. LSTM Neural Networks for Language Modeling. In Interspeech, 2012.
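
The list above is only a set of pointers; none of the entries show the computation itself. As a rough orientation (a minimal sketch, not taken from the linked slides or papers), the Python/NumPy code below walks through one step of an LSTM-based RNN language model: embed the previous word, update the gates and cell state, and produce a softmax distribution over the next word. The vocabulary size, layer dimensions, random initialisation, and toy word-id sequence are all illustrative assumptions.

# Minimal LSTM language-model sketch (illustrative only; all sizes and
# initialisations are assumptions, not values from the references above).
import numpy as np

rng = np.random.default_rng(0)
vocab_size, embed_dim, hidden_dim = 1000, 64, 128

# Model parameters, randomly initialised for the sketch.
E = rng.normal(0, 0.1, (vocab_size, embed_dim))       # word embeddings
W = rng.normal(0, 0.1, (4 * hidden_dim, embed_dim))   # input weights for the i, f, o, g gates
U = rng.normal(0, 0.1, (4 * hidden_dim, hidden_dim))  # recurrent weights
b = np.zeros(4 * hidden_dim)                          # gate biases
V = rng.normal(0, 0.1, (vocab_size, hidden_dim))      # output (softmax) weights

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_lm_step(word_id, h_prev, c_prev):
    """One time step: previous word id + previous (h, c) -> next-word distribution."""
    x = E[word_id]                                # embed the previous word
    z = W @ x + U @ h_prev + b                    # all four gate pre-activations at once
    i, f, o = (sigmoid(z[k * hidden_dim:(k + 1) * hidden_dim]) for k in range(3))
    g = np.tanh(z[3 * hidden_dim:])               # candidate cell update
    c = f * c_prev + i * g                        # cell state carries long-range context
    h = o * np.tanh(c)                            # hidden state exposed to the output layer
    logits = V @ h
    probs = np.exp(logits - logits.max())         # softmax over the vocabulary
    return probs / probs.sum(), h, c

# Score a toy word-id sequence: sum of log P(w_t | w_<t).
h, c = np.zeros(hidden_dim), np.zeros(hidden_dim)
sequence = [3, 17, 42, 7]                         # hypothetical word ids
log_prob = 0.0
for prev, nxt in zip(sequence[:-1], sequence[1:]):
    probs, h, c = lstm_lm_step(prev, h, c)
    log_prob += np.log(probs[nxt])
print("toy sequence log-probability:", log_prob)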