NLP Status Report 2016-12-19

Date: 2016/12/19

Yang Feng
  Last week:
  • s2smn: wrote the manual for s2s with TensorFlow [nmt-manual]
  • wrote part of the code of mn
  • wrote the manual for Moses [moses-manual]
  • Huilan: fixed the problem with syntax-based translation
  This week:
  • sort out the system and the corresponding documents
Jiyuan Zhang
  Last week:
  • coded tone_model, but ran into some trouble
  • ran global_attention_model, which decodes four or five sentences generated by the local_attention model
  This week:
  • improve the poem model
Andi Zhang
  Last week:
  • tried to modify the incorrect softmax, but abandoned it in the end
  • added the BLEU scoring part (a hedged sketch follows this list)
  This week:
  • extract the encoder outputs
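A minimal sketch of how a BLEU scoring step like the one above might be computed with NLTK; the tokenized hypothesis/reference pairs below are placeholders, not the actual s2s outputs, and the report does not state which scorer was actually used.

  # Illustrative BLEU computation with NLTK; sentences are made-up placeholders.
  from nltk.translate.bleu_score import corpus_bleu, SmoothingFunction

  # One list of reference translations per hypothesis, tokens as lists of strings.
  references = [[["the", "cat", "sits", "on", "the", "mat"]]]
  hypotheses = [["the", "cat", "sat", "on", "the", "mat"]]

  # Smoothing avoids zero scores when a higher-order n-gram never matches.
  smooth = SmoothingFunction().method1
  bleu = corpus_bleu(references, hypotheses, smoothing_function=smooth)
  print("corpus BLEU: %.4f" % bleu)
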
Shiyue Zhang
  Last week:
  • changed the one-hot vector to (0, -inf, -inf, ...) and retried the experiments, but no improvement showed
  • tried a 1-dimensional gate, but it converged to the baseline
  • tried training only the gate, but the best result was to take every instance as "right"
  • trying a model similar to attention
  • [report]
  This week:
  • try adding the true action info when training the gate
  • try vectors of different scales
  • try changing the cosine similarity to a plain inner product (see the sketch after this list)
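The "change cos to only inner product" item contrasts two similarity scores. The toy vectors below only illustrate that the cosine is scale-invariant while the plain inner product keeps magnitude information; they are not the model's actual hidden states.

  # Cosine similarity vs. plain inner product on toy vectors.
  import numpy as np

  def cosine(u, v):
      # cos(u, v) = <u, v> / (||u|| * ||v||): invariant to the vectors' norms
      return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

  def inner(u, v):
      # the plain inner product keeps the magnitude information cosine discards
      return np.dot(u, v)

  u = np.array([0.5, 1.0, -0.2])
  v = np.array([1.0, 2.0, -0.4])   # same direction as u, twice the norm

  print(cosine(u, v))  # 1.0  (identical up to scale)
  print(inner(u, v))   # 2.58 (grows with the vectors' norms)
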
Guli
  Last week:
  • read papers on transfer learning and handling OOV words
  • conducted a comparative test
  • started writing the survey
  This week:
  • complete the first draft of the survey
Peilun Xiao
  Last week:
  • used LDA to generate 10-500 dimensional document vectors on the remaining datasets (a sketch follows this list)
  • wrote Python code for a new tf-idf-based algorithm
  This week:
  • debug the code
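A hedged sketch of generating LDA document vectors of varying dimensionality with scikit-learn, alongside a plain tf-idf matrix for comparison. The documents, the 10/100/500 topic sweep over toy data, and the vectorizer settings are placeholders; the "new algorithm about tf-idf" itself is not described in this report.

  # Illustrative LDA document vectors plus a tf-idf baseline (scikit-learn).
  from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
  from sklearn.decomposition import LatentDirichletAllocation

  docs = ["natural language processing with topic models",
          "neural machine translation with attention",
          "tf idf weighting for document retrieval"]

  counts = CountVectorizer().fit_transform(docs)

  # Sweep the topic number to obtain 10- to 500-dimensional document vectors.
  for n_topics in (10, 100, 500):
      lda = LatentDirichletAllocation(n_components=n_topics, random_state=0)
      doc_vectors = lda.fit_transform(counts)   # shape: (n_docs, n_topics)
      print(n_topics, doc_vectors.shape)

  # Standard tf-idf document-term matrix as the baseline representation.
  tfidf = TfidfVectorizer().fit_transform(docs)
  print(tfidf.shape)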