ASR Status Report 2016-11-21

Revision as of 00:45, 21 November 2016

Date People Last Week This Week


2016.11.21 Hang Luo
Ying Shi
  • Struggling with the Kazakh speech recognition system: because of the huge size of HCLG.fst, the decoding jobs keep bringing the server down.

I have tried several methods:

  • Changed the size of the word list and corpus; this did not work very well.
  • Pruned the LM: with a pruning threshold of 2e-7, the LM size is reduced from 290M to 60M, but the resulting WER is very poor (a minimal sketch of this step follows this list).
  • I have uploaded the results of several experiments to CVSS (I ought to put a link here, but I am editing this page from my dormitory and cannot access CVSS).
  • Too many personal matters came up, so last week's visualization work has been delayed; I will try my best to finish it this week.
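
For reference, a minimal sketch of the pruning step above, assuming SRILM's ngram tool is installed and on the PATH; lm.arpa and lm_pruned.arpa are placeholder file names, not the actual paths used in the experiment:

  import subprocess

  # Entropy-based pruning of an ARPA language model with SRILM's ngram tool.
  # The 2e-7 threshold is the value reported above; both file names are
  # placeholders for the real Kazakh LM files.
  subprocess.run(
      [
          "ngram",
          "-lm", "lm.arpa",               # input ARPA LM (~290M before pruning)
          "-prune", "2e-7",               # pruning threshold from the report
          "-write-lm", "lm_pruned.arpa",  # pruned LM (~60M after pruning)
      ],
      check=True,
  )

The pruned LM would then be recompiled into a smaller G.fst and HCLG.fst before decoding.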



Yixiang Chen
Lantian Li
Zhiyuan Tang




Date People Last Week This Week


2016.11.14 Hang Luo
  • Read papers about highway connections and multi-task learning.
  • Explore the language recognition model with speech+language joint training; find out how to use the language information.
  • Finish the ML-book.
Ying Shi
  • Kazakh recognition baseline finished here.
  • Figures for the ML-book; read papers on NN visualization.
Yixiang Chen
  • Modify the ML-book and read papers.
  • Prepare the replay detection baseline. [1]
  • Complete the replay baseline and attempt to modify MFCC calculation.
Lantian Li
  • Complete the Joint-training on TASLP (speaker parts). [2]
  • Joint-training on SRE and LRE (still over-fitting!). [3]
  • Read some papers and download four database. [4]
  • CSLT-Replay detection database is OK! [/work4/lilt/Replay]
  • Joint-training on SRE and LRE.
  • Baseline system on replay detection.
Zhiyuan Tang
  • Finished the additional experiments on joint learning (speech & speaker) for TASLP (multi-task, i-vector as part of the input; see the sketch after this report). [5]
  • Prepare a brief review of Interspeech16.
  • Report for Weekly Reading (a brief review of Interspeech16).
  • Joint training for bilingual ASR: language scores as a decoding mask; explore the best info receiver by studying single tasks with extra info.
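
Sketch referenced from the joint-learning item above: a toy illustration, assuming PyTorch, of one way a multi-task network with an i-vector appended to the input could be wired up. The class name, layer sizes and the two heads (phone and speaker) are illustrative assumptions, not the actual TASLP model.

  import torch
  import torch.nn as nn

  class JointNet(nn.Module):
      # Toy multi-task net: a shared trunk feeding a phone head (speech task)
      # and a speaker head (speaker task). All dimensions are placeholders.
      def __init__(self, feat_dim=40, ivec_dim=100, hidden=512,
                   num_phones=3000, num_spks=500):
          super().__init__()
          self.trunk = nn.Sequential(
              nn.Linear(feat_dim + ivec_dim, hidden), nn.ReLU(),
              nn.Linear(hidden, hidden), nn.ReLU(),
          )
          self.phone_head = nn.Linear(hidden, num_phones)
          self.spk_head = nn.Linear(hidden, num_spks)

      def forward(self, feats, ivector):
          # The utterance-level i-vector is appended to every frame's features.
          x = torch.cat([feats, ivector.expand(feats.size(0), -1)], dim=1)
          h = self.trunk(x)
          return self.phone_head(h), self.spk_head(h)

  # Example: 10 frames of 40-dim features plus one 100-dim i-vector.
  net = JointNet()
  phone_logits, spk_logits = net(torch.randn(10, 40), torch.randn(1, 100))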