Difference between revisions of "NLP Status Report 2017-4-5"
From cslt Wiki
Revision as of 02:14, 5 April 2017

{| class="wikitable"
!Date !! People !! Last Week !! This Week
|-
| rowspan="6"|2017/4/5
|Yang Feng ||
* Got the sampled 100w (1M-sentence) good data and ran Moses (BLEU: 30.6)
* Reimplemented the ACL idea (with some optimizations over the previous code) and checked its performance in the following gradual steps: 1. use s_i-1 as the memory query; 2. use s_i-1 + c_i as the memory query; 3. use y as the memory states for attention; 4. use y + smt_attentions * h as the memory states for attention.
* Ran experiments for the above steps, but the loss was inf; I am looking into the reasons.
||
* Do experiments and write the paper
|-
|Jiyuan Zhang ||
||
|-
|Andi Zhang ||
||
|-
|Shiyue Zhang ||
||
|-
|Peilun Xiao ||
||
|}
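The four gradual steps above vary only two things in a dot-product attention over a memory: the query (s_i-1 alone, or s_i-1 + c_i) and the memory states (y alone, or y + smt_attentions * h). A minimal NumPy sketch of that structure, where all dimensions, variable names, and the random toy data are illustrative assumptions rather than the actual experiment code:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis; subtracting the max
    # avoids exp() overflow, one common source of inf/NaN losses.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def memory_attention(query, memory_states):
    # Score each memory slot against the query, normalize, and return
    # the weighted sum of the memory states.
    scores = memory_states @ query          # (T,)
    weights = softmax(scores)               # (T,), sums to 1
    return weights @ memory_states          # (d,)

# Toy sizes (hypothetical): T memory slots, hidden size d.
T, d = 5, 8
rng = np.random.default_rng(0)
s_prev  = rng.normal(size=d)                 # decoder state s_{i-1}
c_i     = rng.normal(size=d)                 # source context c_i
y       = rng.normal(size=(T, d))            # target embeddings in memory
h       = rng.normal(size=(T, d))            # encoder states
smt_att = softmax(rng.normal(size=(T, 1)))   # SMT attention weights

# Steps 1-2 change the query; steps 3-4 change the memory states.
q1 = s_prev            # step 1: s_{i-1} as memory query
q2 = s_prev + c_i      # step 2: s_{i-1} + c_i as memory query
m3 = y                 # step 3: y as memory states
m4 = y + smt_att * h   # step 4: y + smt_attentions * h as memory states

out = memory_attention(q2, m4)
print(out.shape)  # (8,)
```

Checking `np.isfinite(out).all()` after each such step is one cheap way to localize where an inf loss first appears.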