NLP Status Report 2016-12-12: Difference between revisions
From cslt Wiki
(9 intermediate revisions by 6 users not shown; latest revision as of 00:14, 19 December 2016)
Line 2:
!Date !! People !! Last Week !! This Week
|-
− | rowspan="
+ | rowspan="6"|2016/12/12
|Yang Feng ||
+ *[[s2smn:]] installed TensorFlow and ran a toy example (solved problems: a version conflict and an out-of-memory error; see the TensorFlow memory sketch after the diff)
+ *wrote the code for the memory-network part
+ *[[Huilan:]] prepared the periodical report and the system submission
||
+ *[[s2smn:]] finish the manual for the TensorFlow NMT code
+ *[[Huilan:]] system submission
|-
|Jiyuan Zhang ||
*attempted to use the memory model to improve the poorly performing attention model
− *with vernacular text as input, generated poems with the local attention model
+ *with vernacular text as input, generated poems with the local attention model [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/2/2f/Local_atten_resluts.pdf] (see the local-attention sketch after the diff)
*modified the working mechanism of the memory model (top-1 to average)
*helped Andi
Line 15 / Line 20:
|-
|Andi Zhang ||
+ *prepared a paraphrase dataset enumerated from a previous one (ignoring interjections like "啊呀哈")
+ *worked on coding the bidirectional model under TensorFlow; ran into a NaN problem
||
+ *ignore the NaN problem for now and run the model on the same dataset used in Theano (see the NaN-defence sketch after the diff)
|-
|Shiyue Zhang ||
+ * finished the t-SNE pictures and discussed them with the teachers
+ * tried experiments with a 28-dim memory, but found almost all of them converged to the baseline
+ * returned to the 384-dim memory, which is still slightly better than the baseline
+ * found the problem with the action memory: the one-hot vector is not appropriate
+ * [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/2/2f/RNNG%2Bmm%E5%AE%9E%E9%AA%8C%E6%8A%A5%E5%91%8A.pdf report]
||
+ * change the one-hot vector to (0, -10000.0, -10000.0, ...) (see the masking sketch after the diff)
+ * try a 1-dim gate
+ * try max cos
|-
|Guli ||
+ *install and run Moses
+ *prepare the thesis report
||
+ *read papers about transfer learning and handling OOV words
+ |-
+ |Peilun Xiao ||
+ *read a paper about document classification with GMM distributions of word vectors and tried to code it in Python
+ *used LDA to reduce the dimensionality of the texts in r52 and r8 and compared classification performance
+ ||
+ *use LDA to reduce the dimensionality of the texts in 20news and webkb (see the LDA pipeline sketch after the diff)
|}
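The sketches below expand on a few technical items in the report; they are illustrative readings, not the code that was actually written. First, Yang Feng's out-of-memory problem: a common remedy in the TensorFlow releases of this period (0.12/1.x) is to stop the session from reserving the whole GPU up front. The configuration below assumes that era's API and may not be the fix that was actually applied.

<syntaxhighlight lang="python">
# Hedged sketch: let TensorFlow allocate GPU memory on demand instead of
# grabbing the entire card, a frequent cause of "memory exhausted" errors
# when several jobs share one GPU.  API names are TF 0.12/1.x-era.
import tensorflow as tf

config = tf.ConfigProto()
config.gpu_options.allow_growth = True                       # grow allocation lazily
# config.gpu_options.per_process_gpu_memory_fraction = 0.5   # or cap the fraction

with tf.Session(config=config) as sess:
    a = tf.constant(2.0)
    b = tf.constant(3.0)
    print(sess.run(a * b))                                   # toy graph: prints 6.0
</syntaxhighlight>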
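Jiyuan's row mentions generating poems with a local attention model. One common reading is Luong-style local-p attention, where attention is confined to a window around a predicted source position and re-weighted by a Gaussian centred there; the NumPy sketch below shows that mechanism with made-up shapes and parameter names, and may differ from the model actually used.

<syntaxhighlight lang="python">
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

S, H, D = 20, 16, 4                      # source length, hidden size, half window width
enc = np.random.randn(S, H)              # encoder states (one per source position)
dec = np.random.randn(H)                 # current decoder state
Wp = np.random.randn(H, H)               # illustrative parameters of the position predictor
vp = np.random.randn(H)

# predicted window centre p_t in [0, S]
p_t = S / (1.0 + np.exp(-vp @ np.tanh(Wp @ dec)))

scores = enc @ dec                       # dot-product alignment scores
positions = np.arange(S)
sigma = D / 2.0
gauss = np.exp(-((positions - p_t) ** 2) / (2 * sigma ** 2))

# attend only inside [p_t - D, p_t + D], with the Gaussian favouring the centre
in_window = np.abs(positions - p_t) <= D
weights = softmax(np.where(in_window, scores, -1e9)) * gauss
context = weights @ enc                  # context vector fed to the decoder
print(round(float(p_t), 2), context.shape)
</syntaxhighlight>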
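Andi's NaN problem is left unexplained in the report; two standard defences in TensorFlow 1.x-era code are clipping probabilities away from log(0) and clipping the global gradient norm, with check-numerics ops to fail fast on the first bad tensor. The toy model below only shows where those pieces go; all names are placeholders, not the actual bidirectional model.

<syntaxhighlight lang="python">
import tensorflow as tf

x = tf.random_normal([32, 20])                         # stand-in batch of inputs
y = tf.one_hot(tf.zeros([32], dtype=tf.int32), 10)     # stand-in targets
W = tf.Variable(tf.random_normal([20, 10]))
logits = tf.matmul(x, W)

probs = tf.clip_by_value(tf.nn.softmax(logits), 1e-8, 1.0)   # keep log() away from 0
loss = -tf.reduce_mean(tf.reduce_sum(y * tf.log(probs), axis=1))

params = tf.trainable_variables()
grads = tf.gradients(loss, params)
grads, _ = tf.clip_by_global_norm(grads, 5.0)                # tame exploding gradients
train_op = tf.train.AdamOptimizer(1e-3).apply_gradients(zip(grads, params))

check_op = tf.add_check_numerics_ops()                       # raises on the first NaN/Inf

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run([train_op, check_op])
</syntaxhighlight>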
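Shiyue's plan replaces a hard one-hot selection over memory slots with an additive mask of the form (0, -10000.0, -10000.0, ...) applied before the softmax, and separately mentions "max cos". The NumPy sketch below combines the two under one assumed reading, taking "max cos" to mean choosing the slot whose key is most cosine-similar to the query; dimensions and names are made up.

<syntaxhighlight lang="python">
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

query = np.random.randn(8)                    # stand-in query vector
memory_keys = np.random.randn(5, 8)           # 5 memory slots with 8-dim keys

# "max cos": pick the slot whose key has the largest cosine similarity
cos = memory_keys @ query / (
    np.linalg.norm(memory_keys, axis=1) * np.linalg.norm(query) + 1e-8)
best = int(cos.argmax())

# additive mask: 0 for the kept slot, -10000.0 everywhere else
mask = np.full(5, -10000.0)
mask[best] = 0.0

scores = np.random.randn(5)                   # stand-in for learned attention scores
weights = softmax(scores + mask)              # ~one-hot, but the softmax stays differentiable
print(best, weights.round(3))
</syntaxhighlight>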
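Peilun's items describe using LDA topic proportions as a low-dimensional document representation for classification. The scikit-learn pipeline below is a minimal sketch of that setup on a toy corpus; the topic count is arbitrary and the real experiments would load r8, r52, 20news, or webkb instead.

<syntaxhighlight lang="python">
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# toy stand-in corpus; real documents and labels would come from the datasets above
docs = ["oil prices rise on supply fears",
        "new chip ships with faster memory",
        "crude futures fall as demand weakens",
        "startup releases an open source compiler"]
labels = ["economy", "tech", "economy", "tech"]

# bag of words -> LDA topic proportions (the reduced representation) -> classifier
clf = make_pipeline(
    CountVectorizer(stop_words="english"),
    LatentDirichletAllocation(n_components=20, random_state=0),
    LogisticRegression(),
)
clf.fit(docs, labels)
print(clf.predict(["memory chip demand rises"]))
</syntaxhighlight>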