Xiangyu Zeng 2015-10-05

Last week:

1. Finished sequence training with Adam-max and obtained results showing that, with learning-rate adjustment (adjust-lr), Adam-max achieves better performance; a sketch of the AdaMax update is given after this list.

2. Finished the code for multitask training with speech rate, but some problems remain unsolved; for example, training diverges when the weight of the speech-rate task is increased.
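
As a reference for item 1, here is a minimal sketch of the AdaMax update rule (the infinity-norm variant of Adam from Kingma & Ba); the NumPy setting, the hyperparameter defaults, and the function name are illustrative assumptions, not the actual CSLT training code.

import numpy as np

def adamax_update(param, grad, m, u, t, lr=0.002, beta1=0.9, beta2=0.999, eps=1e-8):
    # m: first-moment estimate; u: exponentially weighted infinity norm; t: 1-based step count
    m = beta1 * m + (1.0 - beta1) * grad        # update biased first moment
    u = np.maximum(beta2 * u, np.abs(grad))     # update infinity-norm term
    step = (lr / (1.0 - beta1 ** t)) * m / (u + eps)  # eps only guards against division by zero
    return param - step, m, u

In this form, adjust-lr simply corresponds to changing the lr argument between epochs or stages; the step size is otherwise scaled per parameter by the infinity-norm term u.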

This week:

1. Run supplementary experiments on Adam-max sequence training: add noisy data to the clean data and check whether Adam-max can adapt on its own.

2. Continue implementing multitask training with speech rate; a sketch of the weighted multitask loss follows below.
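
To clarify the multitask item, below is a hedged sketch of how the main ASR (senone) cross-entropy and the auxiliary speech-rate cross-entropy can be combined with a tunable weight; the function names, the NumPy setting, and the default weight of 0.1 are assumptions for illustration, not the report's actual implementation.

import numpy as np

def cross_entropy(logits, targets):
    # softmax cross-entropy averaged over frames; targets are integer class indices
    logits = logits - logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(targets)), targets].mean()

def multitask_loss(asr_logits, rate_logits, asr_targets, rate_targets, rate_weight=0.1):
    # The report notes divergence when the speech-rate weight is raised,
    # so the auxiliary term is scaled down relative to the main ASR loss.
    return cross_entropy(asr_logits, asr_targets) + rate_weight * cross_entropy(rate_logits, rate_targets)

Keeping rate_weight small is one way to work around the divergence noted above; gradually annealing the weight during training is another option worth trying.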