Difference between revisions of "2024-03-04"

Revision as of 11:00, 4 March 2024 (Monday)

People | This Week | Next Week | Task Tracking (Deadline)
Dong Wang
  • Interspeech papers
  • ICME review done
  • XXX project management
Lantian Li
  • GPU status [1]
  • ASIP-BUPT (CohortTSE, SE-Adapter, SpeakerAug, NeuralScoring)
  • INTERSPEECH 2024
Ying Shi
  • INTERSPEECH Paper here
Zhenghai You
Junming Yuan
  • INTERSPEECH 2024 Papers
    • the results still have problems in the clean scenario (triple-check tonight)
  • continue to work on INTERSPEECH 2024 papers
    • run additional experiments
Chen Chen
  • INTERSPEECH 2024 Papers
  • System Fusion for DeepFake Detection [2] GOOD RESULT!
  • continue to work on INTERSPEECH 2024 papers
Xiaolou Li
Zehua Liu
  • IS24 paper writing
Pengqi Li
  • Team Work[3]
    • Synchronize experiments and hypotheses with Liuhuan on VAD-free SID
    • JinFu: update the linear transform for channel mismatch according to the last discussion
    • A finding from XueYing: pretraining the model on long-duration utterances and then fine-tuning on shorter durations works!
Wan Lin
  • neural scoring [4]
Tianhao Wang
  • INTERSPEECH 2024 Paper
    • Close-set limited-data SID testing ...
    • paper refinement in progress ...
Zhenyu Zhou
  • INTERSPEECH 2024 Papers
  • Detailed comparisons and analyses of SP & VTLP[5]
Junhui Chen
  • Try to modify model structure for Neural Scoring
Jiaying Wang
  • gave a report last Friday
  • continue to verify the influence of speaker encoder and randomness of cohort
Yu Zhang
  • financial-pipeline
    • check logic with Jun Wang and debug
    • factor analysis visualization
  • portfolio analysis visualization
  • finish debug
Wenqiang Du
  • Aibabel
    • Collection of real-scene noise (toilet, dormitory)
    • KWS model training for Chinese and Uyghur, and joint Chinese-Uyghur model training and testing (not completed)[6]
Yang Wei
  • Rearrange some customer project directories and prepare the release directory.
Lily
  • Paper reading
  • Prepare for overview paper
Turi