2024-03-18

People | This Week | Next Week | Task Tracking (Deadline)
Dong Wang
Lantian Li
Ying Shi
Zhenghai You
Junming Yuan
  • Make the plan for the large-vocabulary pretraining task
    • Focus on the experimental details of the few-shot paper from Google.
    • Try to address three questions:
      • How to change the MT pretraining model structure?
      • How to train three strictly comparable pretraining models based on MT, HuBERT, and wav2vec?
      • Why does HuBERT+MT perform significantly better?
Chen Chen
Xiaolou Li
Zehua Liu
Pengqi Li
Wan Lin
Tianhao Wang
Zhenyu Zhou
Junhui Chen
Jiaying Wang
Yu Zhang
Wenqiang Du
Yang Wei
Lily