Schedule

NLP Schedule

Members

Current Members

  • Yang Feng (冯洋)
  • Jiyuan Zhang (张记袁)
  • Aodong Li (李傲冬)
  • Andi Zhang (张安迪)
  • Aiting Liu (刘艾婷)
  • Shiyao Li (李诗瑶)
  • Shiyue Zhang (张诗悦)

Former Members

  • Chao Xing (邢超)  : FreeNeb
  • Rong Liu (刘荣)  : Youku
  • Xiaoxi Wang (王晓曦) : Turing Robot
  • Xi Ma (马习)  : graduate student at Tsinghua University
  • Tianyi Luo (骆天一) : PhD candidate at University of California, Santa Cruz
  • Qixin Wang (王琪鑫)  : MA candidate at University of California
  • DongXu Zhang (张东旭): --
  • Yiqiao Pan (潘一桥) : MA candidate at University of Sydney


Work Progress

Daily Report

Date | Person | Start | Leave | Hours | Status
2016/11/1 Andy Zhang 10:00 19:00 8
  • read source code
  • weekly meeting
Shiyue Zhang 9:15 19:30 8
  • finished a simple way of adding memory into rnng and wrote a report
  • meeting
2016/11/2 Andy Zhang 10:00 19:00 8
  • tried to figure out the cost type of seq2seq
  • bimonthly report
Shiyue Zhang 9:15 20:00 8+
  • fix unexpected action
  • rerun original model
2016/11/3 Andy Zhang 13:30 20:20 6
  • read source code
Shiyue Zhang 9:35 19:00 8+
  • rerun center memory model
  • run sample memory model
  • run wrong memory model
2016/11/4 Andy Zhang 10:00 18:30 7+
  • tried to run code on GPU but failed
Shiyue Zhang
2016/11/7 Andy Zhang 10:00 19:20 8+
  • tried to run code on GPU but failed
Shiyue Zhang 9:15 20:30 9
  • review last week's work
  • write report
  • meeting
2016/11/8 Andy Zhang 9:40 19:00 8+
  • finally ran the code on GPU, waiting for results to continue
Shiyue Zhang 9:30 20:00 8+
  • tried to run rnng on GPU, but failed
2016/11/9 Andy Zhang 10:00 19:00 8
  • hit bugs during validation; solved them but had to rerun the code
  • writing documentation for MemN2N
Shiyue Zhang 9:30 20:00 8+
  • tried several wrong memory models
  • rebuild rnng on Dynet
2016/11/10 Andy Zhang 9:30 18:30 8
  • writing documentation for MemN2N
Shiyue Zhang 9:30 20:00 9
  • read Feng's code
  • tried to run rnng on GPU, but failed
2016/11/11 Andy Zhang 9:30 18:30 8
  • ran NTM on paraphrase data set
Shiyue Zhang
2016/11/14 Andy Zhang 9:30 18:30 8
  • prepared the fr-en data set for NMT; prepared training, validation & test sets for paraphrase
  • ran NMT model on the above data
  • weekly meeting
Shiyue Zhang 9:30 21:30 9+
  • review last week's work
  • meeting
  • tried MKL, but it could not use multiple CPU cores
2016/11/15 Andy Zhang 9:30 18:30 8
  • dealt with the paraphrase data set
Shiyue Zhang 9:30 20:00 8+
  • ran rnng with MKL successfully, which at least doubles the speed
  • run the original rnng discriminative model
2016/11/16 Andy Zhang 10:00 19:00 8
  • cleaned the paraphrase data set, removing repetitions and some noise
  • help Guli with NMT code
Shiyue Zhang 9:30 20:00 8+
  • finished the code of the dynamic memory model and started running it
2016/11/17 Andy Zhang 10:00 19:00 8
  • ran NTM on the data prepared yesterday
  • read through source code to find ways to modify it
Shiyue Zhang 9:30 17:00 6
  • tried the memory structure suggested by Prof. Wang
2016/11/18 Andy Zhang 10:00 19:00 8
  • read through source code to find ways to modify it
Shiyue Zhang
2016/11/21 Andy Zhang 9:30 19:00 8+
  • read source code and seem to have figured out how to modify it
Shiyue Zhang 9:30 20:00 8+
  • reviewed last week's work and attended the meeting
  • rerun original model, try more dynamic models
2016/11/22 Andy Zhang 9:30 19:00 8+
  • dealt with the zh2en data set
  • ran NTM on it
Shiyue Zhang
2016/11/23 Andy Zhang 9:30 19:00 8+
  • discussed code with Prof. Feng
Shiyue Zhang 9:10 6:30 8+
  • found a big bug in my code and fixed it
  • tried the second memory structure with a gate, and found a problem with the memory
2016/11/24 Andy Zhang 10:00 19:00 8
  • met with crashes
  • figured out the relationship between the checkpoint model & the best model
Shiyue Zhang 9:10 6:30 8+
  • discussed the problem with Prof. Feng and Prof. Wang
  • thought about a solution to the problem
2016/11/25 Andy Zhang
Shiyue Zhang

Monthly Summary

Person | Summary
Yang Feng
Jiyuan Zhang
Andy Zhang
Shiyue Zhang

Time Off Table

Date | Yang Feng | Jiyuan Zhang
11/01 | 8h
11/21 | 2h
11/28 | 8h
11/29 | 2h

Past progress

nlp-progress 2016/10

nlp-progress 2016/09

nlp-progress 2016/08

nlp-progress 2016/05-07

nlp-progress 2016/04