Difference between revisions of "Dongxu Zhang 14-10-27"
From cslt Wiki
Latest revision as of 14:29, 26 October 2014 (Sun)
Accomplished this week
- RWTHLM: used the ppl file to compute n-best probabilities and test the LM.
- Wrote a short report on the shortcomings of BPTT and why the LSTM-RNN is better.
- Trained an LM with the new raw dictionary on the Baiduzhidao corpus after additional preprocessing.
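The first item above turns per-sentence log-probabilities into a corpus perplexity. A minimal sketch of that computation, assuming the ppl-style output provides a log10 probability and a token count per sentence (the numbers below are purely illustrative, not real rwthlm output):

```python
import math

# Hypothetical per-sentence log10 probabilities and token counts,
# in the style a ppl report from an LM toolkit might provide.
sentence_logprobs = [-12.3, -8.7, -15.1]   # log10 P(sentence)
token_counts      = [5, 3, 6]              # tokens per sentence (incl. </s>)

total_logprob = sum(sentence_logprobs)     # log10 prob of the whole test set
total_tokens  = sum(token_counts)

# Perplexity = 10 ** (- total log10 probability / total token count)
ppl = 10 ** (-total_logprob / total_tokens)
print(round(ppl, 1))
```

The same per-sentence log-probabilities can also rescore an n-best list: the hypothesis with the highest (interpolated) LM + acoustic score wins.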
Next week
- Test the LSTM-RNN LM; read the rwthlm code.
- Continue building the vocabulary.
- Think of a research idea around RNNs or word2vec.
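The BPTT shortcoming reported above, which next week's LSTM-RNN testing revisits, is the vanishing gradient. A minimal numeric sketch, assuming a toy one-dimensional linear recurrence h_t = w * h_{t-1} (nonlinearity omitted for clarity):

```python
# In BPTT the gradient of h_T w.r.t. h_0 is a product of one Jacobian
# per time step; here that Jacobian is just the scalar weight w.
w = 0.5          # recurrent weight with |w| < 1
T = 30           # number of unrolled time steps

grad = 1.0
for _ in range(T):
    grad *= w    # chain rule: multiply one Jacobian per step

print(grad)      # 0.5**30, about 9.3e-10: the error signal from
                 # 30 steps back has effectively vanished
```

The LSTM's gated cell state keeps this product near 1 along the cell path, which is the usual argument for why it handles long dependencies better than a plain RNN trained with BPTT.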
Myself
- Prepare for the TOEFL test.
- Prepare for the master's thesis.