| Date | Person | Start | Leave | Hours | Status |
| --- | --- | --- | --- | --- | --- |
| 2016/11/1 | Andy Zhang | 10:00 | 19:00 | 8 | read source code; weekly meeting |
| 2016/11/1 | Shiyue Zhang | 9:15 | 19:30 | 8 | finished a simple way to add memory to RNNG and wrote a report; meeting |
| 2016/11/2 | Andy Zhang | 10:00 | 19:00 | 8 | tried to figure out the cost type of seq2seq; bimonthly report |
| 2016/11/2 | Shiyue Zhang | 9:15 | 20:00 | 8+ | fixed the unexpected-action issue; reran the original model |
| 2016/11/3 | Andy Zhang | 13:30 | 20:20 | 6 | |
| 2016/11/3 | Shiyue Zhang | 9:35 | 19:00 | 8+ | reran the center-memory model; ran the sample-memory model; ran the wrong-memory model |
| 2016/11/4 | Andy Zhang | 10:00 | 18:30 | 7+ | tried to run the code on GPU, but failed |
| 2016/11/4 | Shiyue Zhang | | | | |
| 2016/11/7 | Andy Zhang | 10:00 | 19:20 | 8+ | tried to run the code on GPU, but failed |
| 2016/11/7 | Shiyue Zhang | 9:15 | 20:30 | 9 | reviewed last week's work; wrote the report; meeting |
| 2016/11/8 | Andy Zhang | 9:40 | 19:00 | 8+ | finally ran the code on GPU; waiting for results to continue |
| 2016/11/8 | Shiyue Zhang | 9:30 | 20:00 | 8+ | tried to run RNNG on GPU, but failed |
| 2016/11/9 | Andy Zhang | 10:00 | 19:00 | 8 | hit bugs during validation, fixed them but had to rerun the code; wrote documentation for MemN2N |
| 2016/11/9 | Shiyue Zhang | 9:30 | 20:00 | 8+ | tried several wrong-memory models; rebuilt RNNG on DyNet |
| 2016/11/10 | Andy Zhang | 9:30 | 18:30 | 8 | wrote documentation for MemN2N |
| 2016/11/10 | Shiyue Zhang | 9:30 | 20:00 | 9 | read Feng's code; tried to run RNNG on GPU, but failed |
| 2016/11/11 | Andy Zhang | 9:30 | 18:30 | 8 | ran NTM on the paraphrase dataset |
| 2016/11/11 | Shiyue Zhang | | | | |
| 2016/11/14 | Andy Zhang | 9:30 | 18:30 | 8 | prepared the fr-en dataset for NMT and the training, validation & test sets for paraphrase; ran the NMT model on this data; weekly meeting |
| 2016/11/14 | Shiyue Zhang | 9:30 | 21:30 | 9+ | reviewed last week's work; meeting; tried MKL, but it could not use multiple CPU cores |
| 2016/11/15 | Andy Zhang | 9:30 | 18:30 | 8 | processed the paraphrase dataset |
| 2016/11/15 | Shiyue Zhang | 9:30 | 20:00 | 8+ | ran RNNG with MKL successfully, which at least doubles the speed; ran the original RNNG discriminative model |
| 2016/11/16 | Andy Zhang | 10:00 | 19:00 | 8 | processed the paraphrase dataset, removing duplicates and some noise; helped Guli with the NMT code |
| 2016/11/16 | Shiyue Zhang | 9:30 | 20:00 | 8+ | finished the code for the dynamic memory model and started running it |
| 2016/11/17 | Andy Zhang | 10:00 | 19:00 | 8 | ran NTM on the data processed yesterday; read through the source code to find ways to modify it |
| 2016/11/17 | Shiyue Zhang | 9:30 | 17:00 | 6 | tried the memory structure suggested by Teacher Wang |
| 2016/11/18 | Andy Zhang | 10:00 | 19:00 | 8 | read through the source code to find ways to modify it |
| 2016/11/18 | Shiyue Zhang | | | | |
| 2016/11/21 | Andy Zhang | 9:30 | 19:00 | 8+ | read the source code and seem to have figured out how to modify it |
| 2016/11/21 | Shiyue Zhang | 9:30 | 20:00 | 8+ | reviewed last week's work and meeting; reran the original model and tried more dynamic models |
| 2016/11/22 | Andy Zhang | 9:30 | 19:00 | 8+ | processed the zh2en dataset; ran NTM on it |
| 2016/11/22 | Shiyue Zhang | | | | |
| 2016/11/23 | Andy Zhang | 9:30 | 19:00 | 8+ | discussed the code with Mrs. Feng |
| 2016/11/23 | Shiyue Zhang | 9:10 | 18:30 | 8+ | found a big bug in my code and fixed it; tried the second memory structure with a gate and found a problem with the memory |
| 2016/11/24 | Andy Zhang | 10:00 | 19:00 | 8 | ran into crashes; figured out the relationship between the checkpoint model and the best model |
| 2016/11/24 | Shiyue Zhang | 9:10 | 18:30 | 8+ | discussed the problem with Teacher Feng and Teacher Wang; thought about a solution to the problem |
| 2016/11/25 | Andy Zhang | 10:00 | 14:00 | 3+ | thought about ways to output the encoder's result; asked for leave |
| 2016/11/25 | Shiyue Zhang | | | | |
| 2016/11/28 | Andy Zhang | 9:30 | 18:30 | 8+ | found ways to output the encoder's result |
| 2016/11/28 | Shiyue Zhang | | | | |
| 2016/11/29 | Andy Zhang | 9:30 | 18:30 | 8+ | switched to TensorFlow; read the TensorFlow seq2seq source code |
| 2016/11/29 | Shiyue Zhang | | | | |
| 2016/11/30 | Andy Zhang | 9:30 | 18:30 | 8+ | read the TensorFlow seq2seq source code |
| 2016/11/30 | Shiyue Zhang | | | | |