Dongxu Zhang 14-10-27

=== Accomplished this week ===
* RWTHLM: used the ppl file to compute n-best probabilities and test the LM (a rough rescoring sketch follows this list).
* Gave a short report on the shortcomings of BPTT and why the LSTM-RNN is better (see the gradient sketch below).
* Trained an LM with a new raw dictionary, using the Baidu Zhidao corpus with more preprocessing.
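To make the first item concrete, here is a minimal sketch of n-best rescoring with LM log-probabilities. The file name, column layout, and interpolation weight are assumptions for illustration only, not rwthlm's actual output format.

<pre>
# Minimal sketch: pick the best hypothesis per utterance after adding a
# weighted LM log-probability to an existing score.
# Assumed (hypothetical) line format: <utt_id> <old_score> <lm_logprob> <hypothesis words...>

def rescore_nbest(ppl_path, lm_weight=0.5):
    best = {}
    with open(ppl_path) as f:
        for line in f:
            parts = line.split()
            if len(parts) < 4:
                continue  # skip malformed lines
            utt_id = parts[0]
            old_score = float(parts[1])
            lm_logprob = float(parts[2])
            hyp = " ".join(parts[3:])
            total = old_score + lm_weight * lm_logprob  # simple linear combination
            if utt_id not in best or total > best[utt_id][0]:
                best[utt_id] = (total, hyp)
    return {utt: hyp for utt, (_, hyp) in best.items()}

if __name__ == "__main__":
    print(rescore_nbest("nbest_with_lm_scores.txt"))
</pre>

In a real setup the LM weight would be tuned on a development set.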
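For the second item, the usual argument is the following: in plain BPTT the gradient through the hidden state is a product of Jacobians, so over long spans it tends to vanish or explode,

<math>\frac{\partial h_t}{\partial h_k} = \prod_{i=k+1}^{t} \operatorname{diag}(\sigma'(z_i))\, W^{\top},</math>

whereas the LSTM updates its cell state additively through the forget gate,

<math>c_t = f_t \odot c_{t-1} + i_t \odot \tilde{c}_t, \qquad \frac{\partial c_t}{\partial c_{t-1}} \approx \operatorname{diag}(f_t),</math>

so when <math>f_t</math> stays near 1, the gradient along the cell-state path is preserved over many steps. This is the standard textbook sketch, with the usual notation assumed.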
 
=== Next week ===
* Test the LSTM-RNN LM. Read the rwthlm code.
* Continue to build the vocabulary.
* Think of an idea for RNN or word2vec.
 
  
 
=== Myself ===
* Prepare for the TOEFL test.
* Prepare for the master's thesis.
