Difference between revisions of "2024-02-26"

From cslt Wiki
Revision as of 10:49, 26 February 2024 (Monday)

People | This Week | Next Week | Task Tracking (Deadline)
Dong Wang
  • Uyghur database paper, draft done.
  • ICME review, almost done.
  • MicroMagnetic paper, before final check.
Lantian Li
  • GPU status [1]
  • ASIP-BUPT (CohortTSE, SE-Adapter, SpeakerAug, NeuralScoring)
  • Huawei project (Phase 1)
Ying Shi
Zhenghai You
Junming Yuan
  • MT-pretraining double-check experiments + extended experiments [2]
    • Identified the influence of the BN layer in the 10-shot/5-shot experiments.
    • Extended with a new pretrained model (trained on clean data with BCE loss).
    • Will report performance differences when fixing different layers during fine-tuning (after the group meeting).
  • is24 paper writing
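The "fixing different layers during fine-tuning" comparison above can be sketched minimally as selective parameter updates. This is an illustrative toy, not the group's actual training code: the layer names ("cnn", "bn", "head"), learning rate, and plain-SGD update are all assumptions.

```python
import numpy as np

def finetune_step(params, grads, frozen, lr=0.1):
    """One SGD step that skips parameters whose layer name is frozen.

    params, grads: dict mapping layer name -> np.ndarray.
    frozen: set of layer names to keep fixed during fine-tuning.
    Returns a new params dict; frozen entries are unchanged.
    """
    return {
        name: (p if name in frozen else p - lr * grads[name])
        for name, p in params.items()
    }

# Toy model with three "layers"; freeze the CNN and BN, tune only the head.
params = {"cnn": np.ones(3), "bn": np.ones(3), "head": np.ones(3)}
grads = {name: np.full(3, 2.0) for name in params}
updated = finetune_step(params, grads, frozen={"cnn", "bn"})
```

Comparing runs with different `frozen` sets is then a matter of sweeping that one argument.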
Chen Chen
  • reproduce robustness experiments [3]
  • is24 paper
Xiaolou Li
  • robustness experiments of AVSR system
  • is24 paper writing
Zehua Liu
Pengqi Li
  • [4] Attention-supervised learning with Liuhuan
    • Confirmed the code for the training step
    • Performance is not better than without supervision
    • Forming hypotheses and analysis
  • Jinfu and Xueying summarized previous work
Wan Lin
  • Neural Scoring [5]
Tianhao Wang
  • SE Adapter assumption verification experiments [6]
    • assumption: entire fine-tuning = CNN refinement + SE adaptation
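For background on the "SE adaptation" term in the assumption above, a standard squeeze-and-excitation block can be sketched as below. This is the generic SE operation (squeeze by pooling, excitation by a bottleneck MLP, then channel re-scaling), not the project's adapter; shapes and the reduction ratio are assumptions.

```python
import numpy as np

def se_block(x, w1, b1, w2, b2):
    """Squeeze-and-Excitation over a (channels, time) feature map.

    Squeeze:    global average pool over time -> (C,)
    Excitation: reduction FC + ReLU, expansion FC + sigmoid -> gates in (0, 1)
    Scale:      reweight each channel of x by its gate.
    """
    z = x.mean(axis=1)                        # squeeze: (C,)
    h = np.maximum(w1 @ z + b1, 0.0)          # reduction FC + ReLU: (C/r,)
    g = 1.0 / (1.0 + np.exp(-(w2 @ h + b2)))  # expansion FC + sigmoid: (C,)
    return x * g[:, None]                     # per-channel scaling

# Toy example: C=4 channels, reduction ratio r=2.
rng = np.random.default_rng(0)
C, r = 4, 2
x = rng.standard_normal((C, 10))
w1, b1 = rng.standard_normal((C // r, C)), np.zeros(C // r)
w2, b2 = rng.standard_normal((C, C // r)), np.zeros(C)
y = se_block(x, w1, b1, w2, b2)
```

Under the stated assumption, adapting only such gate parameters would approximate the SE-adaptation half of full fine-tuning.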
Zhenyu Zhou
  • Extensive Speaker Perturbation [7]:
    • VTLP results on cn1 & vox1
    • VTLP+Speed results on cn1 & vox1
    • Future experiment design
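The VTLP entries above report only results; for context, here is a minimal sketch of the usual piecewise-linear VTLP frequency warp (frequencies below a boundary are scaled by a factor alpha, and the remainder is mapped linearly so the Nyquist frequency stays fixed). The sample rate, boundary `f_hi`, and warp factor are illustrative assumptions, not values from the report; the "Speed" augmentation mentioned alongside it is plain resampling and is not shown.

```python
import numpy as np

def vtlp_warp(freqs, alpha, f_hi, sr=16000):
    """Piecewise-linear VTLP frequency mapping.

    freqs: array of frequencies in Hz (e.g. mel filter center frequencies).
    alpha: warp factor (typically drawn from roughly [0.9, 1.1]).
    f_hi:  boundary frequency controlling where the second segment starts.
    """
    nyq = sr / 2.0
    boundary = f_hi * min(alpha, 1.0) / alpha
    return np.where(
        freqs <= boundary,
        freqs * alpha,                                  # scaled segment
        nyq - (nyq - f_hi * min(alpha, 1.0))            # linear segment that
        / (nyq - boundary) * (nyq - freqs),             # keeps nyq fixed
    )

# Warp a few frequencies with alpha = 1.1 (simulating a shorter vocal tract).
freqs = np.linspace(0.0, 8000.0, 9)
warped = vtlp_warp(freqs, alpha=1.1, f_hi=4800)
```

Applying this warp to the filterbank center frequencies when extracting features yields one perturbed "pseudo-speaker" per alpha.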
Junhui Chen
  • Neural scoring code debug
Jiaying Wang
  • Experiments on cohort PIT [8]
    • Result comparison with other cohort choices on the train-100 training set
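The report does not describe how the cohort enters the objective, so only the generic PIT (permutation invariant training) part can be illustrated: the loss is the minimum, over all assignments of estimated sources to references, of a per-source distance. A minimal utterance-level sketch with MSE (function name and shapes are assumptions):

```python
import numpy as np
from itertools import permutations

def pit_mse(est, ref):
    """Utterance-level PIT loss: minimum MSE over source permutations.

    est, ref: arrays of shape (n_sources, n_samples).
    Returns (best_loss, best_permutation).
    """
    n = est.shape[0]
    best_loss, best_perm = None, None
    for perm in permutations(range(n)):
        loss = np.mean((est[list(perm)] - ref) ** 2)  # MSE under this pairing
        if best_loss is None or loss < best_loss:
            best_loss, best_perm = loss, perm
    return best_loss, best_perm

# Toy check: estimates that are the references with sources swapped
# should be matched back via the permutation (1, 0) at near-zero loss.
t = np.linspace(0, 6, 100)
ref = np.stack([np.sin(t), np.cos(t)])
est = ref[::-1].copy()
loss, perm = pit_mse(est, ref)
```

The brute-force search over permutations is exponential in the number of sources, which is acceptable for the two- or three-speaker settings PIT is typically used in.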
Yu Zhang
  • financial-pipeline
    • portfolio analysis code
    • write doc
  • Fix some bugs found during self-checking
  • Check the entire process with Jun Wang
Wenqiang Du
  • Project coordination and related file archiving
  • Closing of the DiTing project
Yang Wei
  • Review some FreeNeb release directories for reference
  • Concurrency performance problem for Huilan ASR
Lily
  • Interspeech2024[9]
  • Journal paper draft preparation[10]