ASR Status Report 2016-11-14

Latest revision as of 01:14, 28 November 2016

{| class="wikitable"
!Date!!People !! Last Week !! This Week
|-
| rowspan="5"|2016.11.14
|Hang Luo
||
* Read papers about highway connection and multi-task.
||
* Explore the language recognition model on speech+language joint training; find out how to use the language information (see the joint-training sketch below the table).
* Finish the ML-book.
|-
|Ying Shi
||
* Kazakh recognition baseline finished [http://192.168.0.51:5555/cgi-bin/cvss/cvss_request.pl?account=shiying&step=view_request&cvssid=576 here].
||
* Figures for the ML-book; read a paper on NN visualization.
|-
|Yixiang Chen
||
* Modify the ML-book and read papers.
* Prepare the replay detection baseline. [http://192.168.0.51:5555/cgi-bin/cvss/cvss_request.pl?account=tangzy&step=view_request&cvssid=571]
||
* Complete the replay baseline and attempt to modify the MFCC calculation (see the MFCC sketch below).
|-
|Lantian Li
||
* Completed the joint training on TASLP (speaker parts). [http://192.168.0.51:5555/cgi-bin/cvss/cvss_request.pl?account=tangzy&step=view_request&cvssid=573]
* Joint training on SRE and LRE (still over-fitting!). [http://192.168.0.51:5555/cgi-bin/cvss/cvss_request.pl?account=tangzy&step=view_request&cvssid=574]
* Read some papers and downloaded four databases. [http://cslt.riit.tsinghua.edu.cn/mediawiki/index.php/Data_resources]
* The CSLT-Replay detection database is ready. [/work4/lilt/Replay]
||
* Joint training on SRE and LRE.
* Baseline system on replay detection.
|-
|Zhiyuan Tang
||
* Finished the additional experiments of joint learning (speech & speaker) for TASLP (multi-task, i-vector as part of input). [5]
* Prepared a brief review of Interspeech16.
||
* Report for Weekly Reading (a brief review of Interspeech16).
* Joint training for bilingual: language scores as a decoding mask; explore the best info receiver by studying single tasks with extra info (see the masking sketch at the end).
|}
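Several items above (Hang Luo's speech+language joint training, Lantian Li's joint training on SRE and LRE, Zhiyuan Tang's multi-task experiments) share one structure: a shared acoustic encoder feeding one output head per task. Below is a minimal PyTorch sketch of that structure only; the actual experiments used the group's own recipes, and every layer choice, dimension, and loss weight here is an assumption for illustration.

<pre>
# Hedged sketch of multi-task joint training: shared encoder, two heads.
# All dimensions and the 0.5 loss weight are hypothetical.
import torch
import torch.nn as nn

class JointNet(nn.Module):
    def __init__(self, feat_dim=40, hidden=512, n_phones=218, n_langs=2):
        super().__init__()
        # Shared acoustic encoder over frame-level features.
        self.encoder = nn.LSTM(feat_dim, hidden, num_layers=2, batch_first=True)
        self.phone_head = nn.Linear(hidden, n_phones)  # speech (ASR) task
        self.lang_head = nn.Linear(hidden, n_langs)    # language/speaker task

    def forward(self, x):
        h, _ = self.encoder(x)                         # (B, T, hidden)
        # Frame-level phone logits; utterance-level language logits
        # via mean pooling over time.
        return self.phone_head(h), self.lang_head(h.mean(dim=1))

model = JointNet()
x = torch.randn(8, 100, 40)                            # 8 utts, 100 frames
phone_logits, lang_logits = model(x)
phone_tgt = torch.randint(0, 218, (8, 100))
lang_tgt = torch.randint(0, 2, (8,))
loss = nn.functional.cross_entropy(
    phone_logits.reshape(-1, 218), phone_tgt.reshape(-1)
) + 0.5 * nn.functional.cross_entropy(lang_logits, lang_tgt)
loss.backward()
</pre>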




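Yixiang Chen's row mentions modifying the MFCC calculation for replay detection, but not which modification. The sketch below only shows where such a change would plug in, using librosa; the widened filterbank band (fmin/fmax) is purely an assumption, motivated by the common observation that replay artifacts sit near the band edges, and the file name is hypothetical.

<pre>
# Hedged sketch: a standard MFCC front-end and one plausible modification.
import librosa

y, sr = librosa.load("utt.wav", sr=16000)              # hypothetical file

# Baseline MFCCs: 25 ms window, 10 ms shift at 16 kHz.
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13, n_fft=400, hop_length=160)

# Assumed modification: widen the mel filterbank band so the band edges,
# where replay artifacts often live, contribute to the coefficients.
mfcc_wide = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13, n_fft=400,
                                 hop_length=160, fmin=20, fmax=8000)
</pre>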
{| class="wikitable"
!Date!!People !! Last Week !! This Week
|-
| rowspan="5"|2016.11.07
|Hang Luo
||
* Made a report on two Interspeech papers.
* Ran joint training experiments; the results are at [6].
||
* Read papers about highway connection and multi-task.
|-
|Ying Shi
||
* Paper reading; Kazakh speech recognition data preparation.
||
* Baseline of Kazakh speech recognition.
|-
|Yixiang Chen
||
||
|-
|Lantian Li
||
||
|-
|Zhiyuan Tang
||
* Almost finished the additional experiments of joint learning (speech & speaker) for TASLP (multi-task, i-vector as part of input). [7]
||
* Report for Weekly Reading.
* Joint training for bilingual.
|}
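Zhiyuan Tang's bilingual item in the 2016.11.14 table describes using language scores as a decoding mask. One way to read that, sketched below in NumPy: scale each phone posterior by the posterior of the language that phone belongs to, then renormalise per frame before decoding. The report does not give the exact scheme; all names, shapes, and the renormalisation step are assumptions.

<pre>
# Hedged sketch of "language scores as decoding mask".
import numpy as np

def mask_posteriors(phone_post, lang_post, lang_of_phone):
    """phone_post: (T, P) frame-level phone posteriors from a bilingual net;
    lang_post: (T, L) frame-level language posteriors;
    lang_of_phone: (P,) language index of each phone unit.
    Each phone score is scaled by its language's posterior, then the
    frame is renormalised to sum to one."""
    masked = phone_post * lang_post[:, lang_of_phone]  # (T, P)
    return masked / masked.sum(axis=1, keepdims=True)

# Toy example with random but valid posteriors.
T, P, L = 100, 60, 2
phone_post = np.random.dirichlet(np.ones(P), size=T)
lang_post = np.random.dirichlet(np.ones(L), size=T)
lang_of_phone = np.random.randint(0, L, size=P)
masked = mask_posteriors(phone_post, lang_post, lang_of_phone)
</pre>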