Xiangyu Zeng 2015-11-09

From cslt Wiki

last week:

1. Ran additional sequence-training experiments with Adam-max. The results show that Adam-max is good at helping training jump out of local minima when new data are introduced.
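For reference, the AdaMax update (the infinity-norm variant of Adam from Kingma & Ba) can be sketched as below; this is a generic sketch of the optimizer itself, not the specific sequence-training setup used in the experiments, and the hyperparameter values are the paper's defaults, not ones confirmed by this report:

```python
import numpy as np

def adamax_step(theta, grad, m, u, t, alpha=0.002, beta1=0.9, beta2=0.999):
    """One AdaMax update step.

    theta: parameters        m: EMA of gradients (first moment)
    grad: current gradient   u: exponentially weighted infinity norm
    t: 1-based step counter (for the bias correction on m)
    """
    m = beta1 * m + (1.0 - beta1) * grad        # first-moment estimate
    u = np.maximum(beta2 * u, np.abs(grad))     # infinity-norm estimate
    # bias-corrected step; small epsilon guards the first iterations
    theta = theta - (alpha / (1.0 - beta1 ** t)) * m / (u + 1e-8)
    return theta, m, u

# toy usage: minimize f(x) = x^2, whose gradient is 2x
theta = np.array([1.0])
m = np.zeros_like(theta)
u = np.zeros_like(theta)
for t in range(1, 2001):
    theta, m, u = adamax_step(theta, 2.0 * theta, m, u, t)
```

Because the per-coordinate step is bounded by roughly alpha, AdaMax takes controlled steps even on steep or noisy gradients, which is one intuition for why it can keep moving usefully when new data reshape the loss surface.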

2. Ran some multitask experiments on speech rate. Found that the speech-rate (sr) learning method is useful, especially when the rate is extreme, but the multitask setup did not give good results. More details are in cvss.

next week:

1. Complete the remaining Adam-max sequence-training experiments.

2. Prepare my own application.