Difference between revisions of "2025-03-10"

From cslt Wiki
Revision as of 10:33, 10 March 2025 (Mon)

People | This Week | Next Week | Task Tracking (Deadline)
Dong Wang
Lantian Li
Ying Shi
  • Compare Ascend and Nvidia
    • Performance: clean ASR task, 20 epochs, WER 6.91% vs. 7.02% (Ascend vs. Nvidia)
    • Speed: Nvidia is about twice as fast as Ascend
  • Start thinking about my thesis
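For reference, the WER figures in the Ascend/Nvidia comparison above follow the standard edit-distance definition. A minimal sketch (an illustration only, not the team's actual scoring script):

```python
def wer(reference: str, hypothesis: str) -> float:
    """WER = (substitutions + deletions + insertions) / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # Word-level Levenshtein distance via dynamic programming.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution or match
    return d[len(ref)][len(hyp)] / len(ref)

# Two deleted words out of six reference words -> WER = 2/6
print(wer("the cat sat on the mat", "the cat sat mat"))
```

The same function applied to character sequences instead of word sequences gives the CER used elsewhere in this report.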
Zhenghai You
Junming Yuan
  • Pretraining work:
    • MT-HuBERT & Cocktail-HuBERT will be finished next week.
    • Get a set of comparable finetuning results (15/5/3-shot) for each pretrained model at the 400K training step.[1]
  • Check and add references for the AI junior high school handbook (1/2). (Done)
Xiaolou Li
Zehua Liu
Pengqi Li
  • Prepare the AI course for Tsinghua University Junior High School.
  • Add references to the handbook (junior high school version 1/2). (Done)
Wan Lin
Tianhao Wang
Xiaoxue Luo
Zhenyu Zhou
Junhui Chen
Jiaying Wang
Yu Zhang
Wenqiang Du
  • Check Primary handbook V3.0 (Done)
    • Add references (80%)
Yang Wei
Turi
Yue Gu
  • A 0.4% CER reduction has been achieved for one speaker, but no improvement was observed for the other speakers; I'm still running some experiments.
  • Restart the synthetic-data experiments and try to close the gap between synthetic and real data in the model's output distribution.
Qi Qu