2019-02-20

Latest revision as of 04:51, 20 February 2019

{| class="wikitable"
! People !! Last Week !! This Week !! Task Tracking (Deadline)
|-
|Yibo Liu
||
* Reconstructed the model in PyTorch.
||
* Tune the parameters and train a model with good results.
* Add the planning and polishing procedure.
||
|-
|Xiuqi Jiang
||
* Reproduced the planning part and provided a simple framework for the planning procedure.
||
* Post-polishing still remains to be finished.
||
|-
|Jiayao Wu
||
* Finished several experiments on node-sparseness.
||
* Keep on running experiments.
||
|-
|Zhaodi Qi
||
* Finished the x-vector system.
||
* Improve the system, because the current results are poor.
||
|-
|Jiawei Yu
||
* Got familiar with the code of the attention experiment.
* Modified the code to implement phonetic attention.
||
* Finish the phonetic attention experiment.
||
|-
|Yunqi Cai
||
||
* Use the ref.txt of the ASR test_data set to test the original BERT, and then compare it with the fine-tuned BERT.
* Compare the differences on hyp.text, which comes from the ASR test results, and investigate how to mask the errors in the sentences of the ASR results (a masked-LM sketch follows the table).
* Set a rule over all the results (hyp.text) to find the errors to be masked.
||
|-
|Dan He
||
||
* Continue to study the time complexity of TT-decomposition (a toy TT sketch follows the table).
||
|-
|Yang Zhang
||
* Learned TensorFlow.
||
* Did some experiments on the MNIST dataset (a tf.keras sketch follows the table).
||
|-
|Wenwei Dong
||
* Compared ivector+fbank features with fbank-only features in GOP (a feature-concatenation sketch follows the table).
||
* Read papers to find some new speaker adaptation methods.
||
|}
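
Below is a minimal sketch of the masking idea from Yunqi Cai's row, assuming the Hugging Face transformers fill-mask pipeline; the model name and the example hypothesis sentence are placeholders, not the code actually used in the experiments.

<syntaxhighlight lang="python">
# Minimal sketch: mask a suspected error in an ASR hypothesis and let a
# (pretrained or fine-tuned) BERT propose replacements.
# Assumes the Hugging Face `transformers` package; illustrative only.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="bert-base-uncased")  # illustrative model

hyp = "the whether is nice today"        # hypothetical line from hyp.text
masked = "the [MASK] is nice today"      # suspected error replaced by [MASK]

# Print BERT's top candidates for the masked position with their scores.
for cand in unmasker(masked, top_k=3):
    print(cand["token_str"], round(cand["score"], 3))
</syntaxhighlight>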
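For the TT-decomposition item in Dan He's row, a toy sketch of the tensor-train format is given below, only to make the complexity concrete: one entry A(i_1,...,i_d) is a chain of d small matrix products, so it costs O(d·r^2) operations, and the whole representation stores at most d·n·r^2 numbers instead of n^d. The shapes and random cores are made up for illustration.

<syntaxhighlight lang="python">
# Toy sketch of the TT (tensor-train) format:
#   A(i1,...,id) = G1[i1] @ G2[i2] @ ... @ Gd[id],
# where Gk[ik] is an r_{k-1} x r_k matrix and r_0 = r_d = 1.
import numpy as np

d, n, r = 4, 5, 3                                # order, mode size, TT-rank (toy values)
ranks = [1, r, r, r, 1]                          # r_0, r_1, ..., r_d
cores = [np.random.randn(ranks[k], n, ranks[k + 1]) for k in range(d)]

def tt_entry(cores, idx):
    """Evaluate one tensor entry: d small matrix products, O(d * r^2) per entry."""
    v = cores[0][:, idx[0], :]                   # shape (1, r_1)
    for k in range(1, len(cores)):
        v = v @ cores[k][:, idx[k], :]           # (1, r_k) @ (r_k, r_{k+1})
    return float(v[0, 0])

print(tt_entry(cores, (0, 1, 2, 3)))
# Storage: sum_k n * r_{k-1} * r_k <= d * n * r^2 numbers, versus n^d for the full tensor.
</syntaxhighlight>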
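For the TensorFlow/MNIST items in Yang Zhang's row, the following is a minimal sketch of that kind of experiment, assuming the tf.keras API; the network, optimizer, and epoch count are placeholders, not the settings actually used.

<syntaxhighlight lang="python">
# Minimal tf.keras MNIST experiment (TF 2.x); hyperparameters are illustrative only.
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0   # scale pixels to [0, 1]

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=2, validation_data=(x_test, y_test))
</syntaxhighlight>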
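For the ivector+fbank item in Wenwei Dong's row, a rough sketch of the feature combination is shown below, assuming the utterance-level i-vector is simply tiled across frames and concatenated with the fbank matrix before GOP scoring; the dimensions and random arrays are placeholders.

<syntaxhighlight lang="python">
# Rough sketch: append a fixed per-utterance i-vector to every fbank frame.
import numpy as np

num_frames, fbank_dim, ivector_dim = 200, 40, 100
fbank = np.random.randn(num_frames, fbank_dim)       # frame-level fbank features
ivector = np.random.randn(ivector_dim)               # one i-vector per utterance

# Tile the i-vector across frames and concatenate along the feature axis.
combined = np.hstack([fbank, np.tile(ivector, (num_frames, 1))])
print(combined.shape)   # (200, 140); the fbank-only baseline would be (200, 40)
</syntaxhighlight>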