Zhiyuan Tang 2015-08-31
Last week:
1. got some results from the WSJ experiments (bidirectional, more layers); it seemed that adding more layers did not help, nor did the basic bidirectional setup, while the pre-trained bidirectional one looked better;
2. got a first glimpse of the capability of end-to-end ASR with B-LSTM on bigger data (1000+ hours); more results are still to come;
3. revised the Chinese paper on Pronunciation Vector (following Language Vector).
This week:
1. get the result of the fine-tuned bidirectional net on WSJ with dark knowledge, then conclude the experiments (see the sketch after this list);
2. get more results of end-to-end ASR with B-LSTM on bigger data (1000+ hours);
3. some document/paper work.
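
Below is a minimal NumPy sketch of what the "dark knowledge" objective amounts to: the student net is trained against the teacher's temperature-softened posteriors mixed with the usual hard-label cross-entropy. The function names, temperature, and mixing weight are illustrative assumptions, not the actual recipe used in these experiments.

import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def dark_knowledge_loss(student_logits, teacher_logits, hard_labels,
                        temperature=2.0, soft_weight=0.5):
    """Mix cross-entropy on hard labels with cross-entropy against the
    teacher's temperature-softened posteriors (the 'dark knowledge')."""
    n = student_logits.shape[0]
    # Soft targets from the teacher network, e.g. a pre-trained bidirectional net.
    soft_targets = softmax(teacher_logits, temperature)
    student_soft = softmax(student_logits, temperature)
    soft_ce = -np.mean(np.sum(soft_targets * np.log(student_soft + 1e-12), axis=-1))
    # Standard cross-entropy on the ground-truth state labels.
    student_hard = softmax(student_logits)
    hard_ce = -np.mean(np.log(student_hard[np.arange(n), hard_labels] + 1e-12))
    return soft_weight * soft_ce + (1.0 - soft_weight) * hard_ce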