NLP Status Report 2016-11-21


Date: 2016/11/21

Yang Feng

Jiyuan Zhang

Andi Zhang
  Last Week:
  • prepared a new data set for paraphrase, removing repetitions and most of the noise (a filtering sketch follows the table)
  • ran NMT on the fr-en data set and the new paraphrase set
  • read through the source code to find ways to modify it
  • helped Guli with running NMT on our server
  This Week:
  • decide whether to drop Theano
  • start working on the code

Shiyue Zhang
  Last Week:
  • ran rnng on MKL successfully, which can double or triple the speed
  • reran the original model and got the final result
  • reran the wrong memory models (still running)
  • implemented the dynamic memory model; its result is 0.22 better than the baseline
  • tried another memory structure
  This Week:
  • try more different models and summarize the results

Guli
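
The report does not say how the repetitions and noise were removed from the paraphrase data. As a rough illustration only, a minimal filter over (source, target) paraphrase pairs might look like the sketch below; the function name, thresholds, and heuristics are hypothetical and not taken from the actual preprocessing.

  import re

  def clean_paraphrase_pairs(pairs, min_len=3, max_len=50):
      """Drop exact duplicates and obviously noisy pairs from a list of
      (source, target) paraphrase pairs. Thresholds are illustrative only."""
      seen = set()
      kept = []
      for src, tgt in pairs:
          src, tgt = src.strip(), tgt.strip()
          # Skip identical source/target (not a paraphrase) and exact repeats.
          key = (src.lower(), tgt.lower())
          if src.lower() == tgt.lower() or key in seen:
              continue
          # Skip pairs that are too short, too long, or contain no letters.
          s_toks, t_toks = src.split(), tgt.split()
          if not (min_len <= len(s_toks) <= max_len and min_len <= len(t_toks) <= max_len):
              continue
          if not re.search(r"[A-Za-z]", src) or not re.search(r"[A-Za-z]", tgt):
              continue
          seen.add(key)
          kept.append((src, tgt))
      return kept

  if __name__ == "__main__":
      sample = [
          ("the cat sat on the mat .", "a cat was sitting on the mat ."),
          ("the cat sat on the mat .", "a cat was sitting on the mat ."),  # duplicate
          ("!!!", "???"),                                                  # noise
      ]
      print(clean_paraphrase_pairs(sample))

The actual cleaning may well have used different criteria (for example, length-ratio or vocabulary filters); this sketch only shows the general duplicate-and-noise filtering step mentioned in the report.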