Xiangyu Zeng 2015-11-09

Last week:

1. Did some additional sequence training with Adam-max. The results show that Adam-max is good at helping the training jump out of local minima when new data are introduced (see the update-rule sketch after this list).

2. Did some experiments on multitask learning for speech rate. Found that the speech-rate learning method is useful, especially when the rate is extreme, but the multitask setup did not show good results (a sketch of the setup follows this list); more details in cvss.
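
Assuming "Adam-max" refers to the Adamax variant of Adam (Kingma & Ba, 2014), a minimal sketch of its update rule, which replaces Adam's second raw-moment estimate with an infinity-norm running maximum:

    m_t = \beta_1 m_{t-1} + (1 - \beta_1) g_t
    u_t = \max(\beta_2 u_{t-1}, |g_t|)
    \theta_t = \theta_{t-1} - \frac{\alpha}{1 - \beta_1^t} \cdot \frac{m_t}{u_t}

Because u_t is a running maximum of gradient magnitudes rather than a decaying average, the per-parameter update magnitude has a simple bound, which is one possible reading of why the training can still move when newly added data reshapes the gradients.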
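
A minimal sketch of the multitask setup in item 2, assuming a hidden trunk shared between the main acoustic targets and an auxiliary speech-rate head; every dimension, the three-way rate labels, and the loss weight lam below are illustrative assumptions, not details from the experiments:

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical dimensions: 40-dim features, 2000 senone targets,
    # 3 speech-rate classes (slow / normal / fast). All illustrative.
    D_IN, D_HID, D_SENONE, D_RATE = 40, 512, 2000, 3

    # Shared trunk plus one weight matrix per task head.
    W_shared = rng.normal(0, 0.01, (D_IN, D_HID))
    W_senone = rng.normal(0, 0.01, (D_HID, D_SENONE))
    W_rate = rng.normal(0, 0.01, (D_HID, D_RATE))

    def softmax(z):
        z = z - z.max(axis=-1, keepdims=True)
        e = np.exp(z)
        return e / e.sum(axis=-1, keepdims=True)

    def forward(x):
        """Shared hidden layer, then two task-specific softmax heads."""
        h = np.maximum(0.0, x @ W_shared)  # ReLU trunk shared by both tasks
        return softmax(h @ W_senone), softmax(h @ W_rate)

    def multitask_loss(p_senone, p_rate, y_senone, y_rate, lam=0.3):
        """Cross-entropy on the main task plus a weighted auxiliary term."""
        ce_main = -np.log(p_senone[np.arange(len(y_senone)), y_senone]).mean()
        ce_aux = -np.log(p_rate[np.arange(len(y_rate)), y_rate]).mean()
        return ce_main + lam * ce_aux

    # Toy batch: 8 random frames with random labels for both tasks.
    x = rng.normal(size=(8, D_IN))
    p_s, p_r = forward(x)
    print(multitask_loss(p_s, p_r,
                         rng.integers(0, D_SENONE, 8),
                         rng.integers(0, D_RATE, 8)))

Only W_shared is shared between the tasks, so the speech-rate head mainly acts as a regularizer on the shared representation, with lam controlling how strongly it pulls on the trunk.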

Next week:

1. Complete the remaining Adam-max sequence-training experiments.

2. Prepare my own application.