Difference between revisions of "2019-01-23"

From cslt Wiki
 
 
(14 intermediate revisions by 8 users not shown)

Latest revision as of 02:14, 24 January 2019 (Thu)

{| class="wikitable"
!People !! Last Week !! This Week !! Task Tracking (<font color="red">DeadLine</font>)
|-
|Yibo Liu
||
* Started to reconstruct the vivi code with a better structure.
||
* Build proper models, especially for planning and post-processing.
||
|-
|Xiuqi Jiang
||
* Designed a better code structure for further experiments.
* Improved vivi2.0 and made some adjustments to the .sh script.
||
* Build code under the new structure.
||
|-
|Jiayao Wu
||
* Ran experiments on node_sparseness and updated the results on cvss.
* Re-labeled some data.
||
* Keep running pruning experiments.
* Get familiar with PyTorch.
||
|-
|Zhaodi Qi
||
* Reduced the LID model and tested the results.
* Tested the test sets of different channels.
* Wrote a model based on asr(tdnn-f)-lid(tdnn) (similar to PTN) to address channel inconsistency.
||
* Complete the asr-lid model.
||
|-
|Jiawei Yu
||
* Wrote a TensorFlow learning document (not yet completed).
* Read some papers about attention and found some attention code on GitHub.
||
* Try to run the attention code and figure out how it works.
||
|-
|Yunqi Cai
||
* Figured out how the BERT model creates its pretraining data and does the pretraining.
* Tried to use BERT for error correction of text sentences.
* Re-labeled some ASR data.
* Tested the vivi2.0 model.
||
* Construct a text-sentence error-correction model.
||
|-
|Dan He
||
* Ran experiments comparing test time and updated the results on cvss.
* Read the experiment code carefully.
||
* Directly decompose the trained parameters and put them into the network for retraining.
||
|-
|Yang Zhang
||
* Re-modified the nginx configuration and changed the server networking structure.
* Tried to learn VAE and ran a [https://github.com/hwalsuklee/tensorflow-mnist-VAE test] on the wolf server.
||
* Continue to learn and test VAE.
||
|-
|Wenwei Dong
||
||
||
|}
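
The pruning experiments in Jiayao Wu's row typically start from magnitude pruning: zero out the smallest-magnitude weights of a trained layer and retrain. A minimal NumPy sketch of that step (the function name and the 50% sparsity target are illustrative, not the team's actual setup):

```python
import numpy as np

def prune_by_magnitude(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Return a copy of `weights` with the smallest-magnitude entries zeroed
    so that at least `sparsity` fraction of the entries are zero."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)          # number of entries to zero
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]   # k-th smallest magnitude
    mask = np.abs(weights) > threshold             # keep only larger weights
    return weights * mask

rng = np.random.default_rng(0)
w = rng.normal(size=(8, 8))
pruned = prune_by_magnitude(w, 0.5)
print(float(np.mean(pruned == 0)))   # fraction of zeroed weights
```

In a real pruning pipeline the surviving mask is kept fixed and the network is fine-tuned for a few more epochs to recover accuracy.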
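
Dan He's plan to "directly decompose the trained parameters and put them into the network for retraining" is commonly done with a truncated SVD: a trained weight matrix W is replaced by two smaller factors A·B, which are then fine-tuned. A sketch under that assumption (names are illustrative):

```python
import numpy as np

def low_rank_factors(W: np.ndarray, rank: int):
    """Factor W (m x n) into A (m x r) and B (r x n) via truncated SVD,
    so the layer y = W x can be replaced by y = A (B x)."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * s[:rank]   # absorb singular values into A
    B = Vt[:rank, :]
    return A, B

# Demo on a matrix of exact rank 8: the factorization is (numerically) lossless.
rng = np.random.default_rng(1)
W = rng.normal(size=(64, 8)) @ rng.normal(size=(8, 32))
A, B = low_rank_factors(W, 8)
err = np.linalg.norm(W - A @ B) / np.linalg.norm(W)
print(A.shape, B.shape, err)
```

For a full-rank trained matrix the truncation loses some accuracy, which is exactly why the factors are put back into the network and retrained.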
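
The VAE that Yang Zhang is studying rests on two pieces that the linked MNIST example also uses: the reparameterization trick (sampling z differentiably from the encoder's Gaussian) and the KL term of the loss. A minimal NumPy sketch of just those two pieces, leaving out the encoder/decoder networks:

```python
import numpy as np

def reparameterize(mu: np.ndarray, log_var: np.ndarray, rng) -> np.ndarray:
    """Sample z ~ N(mu, sigma^2) as z = mu + sigma * eps, eps ~ N(0, I),
    so gradients can flow through mu and log_var."""
    eps = rng.normal(size=mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_divergence(mu: np.ndarray, log_var: np.ndarray) -> float:
    """KL( N(mu, sigma^2) || N(0, I) ), summed over latent dimensions."""
    return float(-0.5 * np.sum(1 + log_var - mu**2 - np.exp(log_var)))

rng = np.random.default_rng(2)
mu, log_var = np.zeros(4), np.zeros(4)
z = reparameterize(mu, log_var, rng)
print(kl_divergence(mu, log_var))   # 0.0: posterior equals the standard-normal prior
```

The full training objective adds a reconstruction term (e.g. Bernoulli log-likelihood on MNIST pixels) to this KL penalty.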