Difference between revisions of "Zhiyong Zhang"

From cslt Wiki
Revision as of 07:08, 29 October 2015


Papers To Read

  • Learned-norm pooling for deep feedforward and recurrent neural networks
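The pooling operation studied in that paper replaces fixed max/average pooling with an Lp norm whose order p can be learned. A minimal NumPy sketch of the operation itself (the learning of p is omitted; `lp_pool` is an illustrative name, not from the paper's code):

```python
import numpy as np

def lp_pool(x, p):
    """Lp ("learned-norm") pooling over the last axis:
    y = (mean(|x_i|^p))^(1/p).
    p = 1 recovers average pooling of magnitudes; as p grows,
    the result approaches max pooling."""
    return np.mean(np.abs(x) ** p, axis=-1) ** (1.0 / p)

x = np.array([1.0, 2.0, 3.0, 4.0])
print(lp_pool(x, 1.0))    # average of |x|: 2.5
print(lp_pool(x, 100.0))  # large p: close to max(|x|) = 4.0
```

In the paper, p is a trainable parameter per pooling unit, so each unit can interpolate between average-like and max-like behavior.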


Task schedules

Summary

   --------------------------------------------------------------------------------------------------------
    Priority | Task name                     |      Status          |     Notes
   --------------------------------------------------------------------------------------------------------    
        1    | Bi-Softmax                    | ■■■□□□□□□□ | 1400h am training and problem fixing
   --------------------------------------------------------------------------------------------------------
        2    | RNN+DAE                       | □□□□□□□□□□ |
   --------------------------------------------------------------------------------------------------------

Speech Recognition

Multi-lingual AM training

Bi-Softmax

  • Using two distinct softmax layers, one for the English and one for the Chinese data.
  • Tested on 100h-Ch + 100h-En; better performance observed.
  • Now testing the source code on the 1400h_8k data, but strange decoding results were obtained; needs further investigation.
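The Bi-Softmax idea above (shared hidden layers, a separate softmax output head per language, with each utterance routed to the head of its own language) can be sketched as follows. This is a minimal NumPy illustration under assumed dimensions, not the actual Kaldi-based implementation; `forward`, `W_shared`, and the head names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # numerically stable softmax over the last axis
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Assumed toy dimensions: 8-dim input, shared hidden layer,
# and one softmax head per language (distinct target sets).
HIDDEN, N_CH, N_EN = 16, 5, 4
W_shared = rng.standard_normal((8, HIDDEN))
heads = {
    "ch": rng.standard_normal((HIDDEN, N_CH)),  # Chinese output head
    "en": rng.standard_normal((HIDDEN, N_EN)),  # English output head
}

def forward(feat, lang):
    """Shared hidden layer feeds only the softmax head matching the
    utterance language; during training, only that head would
    receive gradient from this utterance."""
    h = np.tanh(feat @ W_shared)
    return softmax(h @ heads[lang])

feat = rng.standard_normal((1, 8))
print(forward(feat, "ch").shape)  # (1, 5)
print(forward(feat, "en").shape)  # (1, 4)
```

The shared layers learn language-independent acoustic representations from both corpora, while the per-language heads keep the two senone inventories separate.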

Reading Lists

  • Zhiyong Zhang 2014-12-28 APSIPA paper reading