Nlp-progress 2017/02


Daily Report

Date Person Start Leave Hours Status
2017/2/5 Andy Zhang
Shiyue Zhang 9:30 20:00 9+
  • prepare for ACL paper
Peilun Xiao
Guli
2017/2/6 Andy Zhang
Shiyue Zhang 9:30 20:00 9+
  • prepare for ACL paper
Peilun Xiao
Guli
2017/2/7 Andy Zhang
Shiyue Zhang 9:30 20:00 9+
  • prepare for ACL paper
Peilun Xiao
Guli
2017/2/8 Andy Zhang
Shiyue Zhang 11:30 20:00 6+
  • prepare for IJCAI paper
Peilun Xiao
Guli
2017/2/9 Andy Zhang
Shiyue Zhang 9:30 20:00 9+
  • prepare for IJCAI paper
Peilun Xiao
Guli
2017/2/10 Andy Zhang
Shiyue Zhang 9:30 20:00 9+
  • prepare for IJCAI paper
Peilun Xiao
Guli
2017/2/11 Andy Zhang
Shiyue Zhang 14:30 20:00 5+
  • prepare for IJCAI paper
Peilun Xiao
Guli
2017/2/13 Andy Zhang
Shiyue Zhang 9:30 20:00 9+
  • prepare for IJCAI paper
Peilun Xiao
Guli
2017/2/14 Andy Zhang
Shiyue Zhang 9:30 20:00 9+
  • prepare for IJCAI paper
Peilun Xiao
Guli
2017/2/15 Andy Zhang
Shiyue Zhang 9:30 20:00 9+
  • prepare for IJCAI paper
Peilun Xiao
Guli
2017/2/16 Andy Zhang
Shiyue Zhang 9:30 20:00 9+
  • prepare for IJCAI paper
Peilun Xiao
Guli
2017/2/17 Andy Zhang
Shiyue Zhang 9:30 20:00 9+
  • prepare for IJCAI paper
Peilun Xiao
Guli
2017/2/18 Andy Zhang
Shiyue Zhang 9:30 20:00 9+
  • prepare for IJCAI paper
Peilun Xiao
Guli
2017/2/19 Andy Zhang
Shiyue Zhang 9:30 20:00 9+
  • prepare for IJCAI paper
Peilun Xiao
Guli
2017/2/21 Andy Zhang
Shiyue Zhang 9:30 20:00 9+
  • found the reason why the loss does not go down
Peilun Xiao
Guli
2017/2/22 Andy Zhang
Shiyue Zhang 9:00 20:00 9+
  • use cosine similarity (cos) to compute alignments, and the loss can go down (see the cosine-attention sketch below)
  • replace the original attention with cosine attention; train it both partly and fully
Peilun Xiao
Guli
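
A minimal sketch of the cosine-based alignment mentioned in the 2017/2/22 entry, written in plain numpy rather than the project's Theano code; the function name and array shapes are illustrative assumptions, not taken from the codebase.

  import numpy as np

  def cosine_attention(H, s, eps=1e-8):
      # H: encoder hidden states, shape (T, d); s: decoder query state, shape (d,)
      # Cosine similarity between the query and each encoder state.
      scores = H @ s / (np.linalg.norm(H, axis=1) * np.linalg.norm(s) + eps)
      # Softmax over source positions gives the alignment weights.
      e = np.exp(scores - scores.max())
      return e / e.sum()

  # Toy example: 5 source positions, hidden size 4.
  H = np.random.randn(5, 4)
  s = np.random.randn(4)
  alpha = cosine_attention(H, s)   # alignment weights, sum to 1
  context = alpha @ H              # context vector fed to the decoder
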
2017/2/27 Andy Zhang
Shiyue Zhang 9:30 19:00 8+
  • found the tanh linearity problem
  • used 20*cos to replace tanh, got 44.8 BLEU (see the scaled-cosine sketch below)
Peilun Xiao
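
A sketch of the scaled-cosine score from the 2017/2/27 entry: cosine lies in [-1, 1], so unscaled scores make the softmax nearly uniform, and multiplying by a constant (20 in the log) spreads them out. Only the factor 20 comes from the log; the score form and shapes are assumptions.

  import numpy as np

  def softmax(x):
      e = np.exp(x - x.max())
      return e / e.sum()

  def scaled_cos_scores(H, s, scale=20.0, eps=1e-8):
      # scale * cos(H_t, s) for every source position t
      cos = H @ s / (np.linalg.norm(H, axis=1) * np.linalg.norm(s) + eps)
      return scale * cos

  H = np.random.randn(6, 8)
  s = np.random.randn(8)
  flat  = softmax(scaled_cos_scores(H, s, scale=1.0))   # near-uniform alignment
  sharp = softmax(scaled_cos_scores(H, s, scale=20.0))  # much peakier alignment
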
2017/2/28 Andy Zhang 12:00 19:00 7
  • read the Theano code of NMT, trying to find out how it generates encoder input masks (see the mask sketch at the end)
  • did some coding of input masks on baseline_beam; it needs further testing
Shiyue Zhang 9:30 19:00 8+
  • added encoding and attention masks
  • tried to find the problem with mem attention
Peilun Xiao
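
The 2017/2/28 entries concern encoder input masks for padded batches and applying such a mask in the attention. A minimal sketch of the usual convention (1 for real tokens, 0 for padding), in numpy; it only illustrates the idea, not the baseline_beam or Theano NMT code itself.

  import numpy as np

  def build_input_mask(lengths, max_len=None):
      # Binary mask for a padded batch: 1 for real tokens, 0 for padding.
      if max_len is None:
          max_len = max(lengths)
      return (np.arange(max_len)[None, :] < np.array(lengths)[:, None]).astype(np.float32)

  def masked_softmax(scores, mask):
      # Attention softmax that puts (almost) no weight on padded positions.
      scores = np.where(mask > 0, scores, -1e9)
      e = np.exp(scores - scores.max(axis=-1, keepdims=True))
      return e / e.sum(axis=-1, keepdims=True)

  # Example: 3 sentences of lengths 4, 2, 5, padded to length 5.
  mask = build_input_mask([4, 2, 5])
  scores = np.random.randn(3, 5)        # attention scores per batch item
  alpha = masked_softmax(scores, mask)  # alignments ignore padding
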