2013-11-15

From cslt Wiki
Revision as of 02:32, 15 November 2013 (Fri) by Cslt (talk | contribs)


Data sharing

  • LM count files still undelivered!
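The pending deliverable above is a set of LM count files. As a minimal sketch of what such files contain (assuming plain n-gram occurrence counts; the actual delivery format is not specified here), n-gram counting can be done with the standard library:

```python
from collections import Counter

def ngram_counts(tokens, n):
    """Count all n-grams (as tuples) in a token sequence."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

# Hypothetical toy corpus, for illustration only.
sentence = "the cat sat on the mat".split()
bigrams = ngram_counts(sentence, 2)
```

In practice a toolkit such as SRILM's ngram-count would produce these files; the function above only illustrates the underlying statistic.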

AM development

Sparse DNN

  • Optimal Brain Damage (OBD).
  • Online OBD.
  • Tried 3 batch-size configurations for computing the pruning statistics: 256, 13000 (10 prunings), and the whole data set. Current results show the performance order is: whole data > 256 > 13000.
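The OBD criterion referenced above ranks weights by the second-order saliency s_k = ½ h_kk w_k², where h_kk is the diagonal Hessian term, and removes the least salient ones. A minimal sketch (assuming the diagonal Hessian has already been estimated on some batch; the pruning fraction and array shapes are illustrative):

```python
import numpy as np

def obd_prune(weights, hessian_diag, prune_fraction):
    """Zero out the weights with the lowest OBD saliency s_k = 0.5 * h_kk * w_k**2."""
    saliency = 0.5 * hessian_diag * weights ** 2
    k = int(prune_fraction * weights.size)
    pruned = weights.copy()
    if k > 0:
        # Indices of the k least salient weights (flattened view).
        idx = np.argsort(saliency.ravel())[:k]
        pruned.ravel()[idx] = 0.0
    return pruned
```

The batch-size comparison in the bullet above amounts to estimating `hessian_diag` on 256 frames, 13000 frames, or the whole data set before each pruning step.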


Noisy training

  • Simulated annealing training.
  • Rejected with small noise levels; with clean training, rejected after annealing.
  • Noise-concentrated training.
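One common way to realize annealed noisy training is to inject Gaussian noise into the features with a magnitude that decays over epochs. A minimal sketch, assuming an exponential decay schedule (the actual schedule and noise type used in these experiments are not stated here):

```python
import random

def annealed_noise_std(epoch, sigma0=0.1, decay=0.5):
    """Exponentially annealed noise level: sigma_t = sigma0 * decay**epoch (hypothetical schedule)."""
    return sigma0 * decay ** epoch

def add_noise(features, sigma, seed=0):
    """Corrupt a feature vector with zero-mean Gaussian noise of std sigma."""
    rng = random.Random(seed)
    return [x + rng.gauss(0.0, sigma) for x in features]
```

Under such a schedule, later epochs see nearly clean data, which matches the idea of finishing with clean (or near-clean) training after annealing.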


Tencent exps

N/A


LM development

NN LM

QA LM

  1. Tencent word segmentation system ready.
  2. Collecting data for Q-LM training.
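Preparing Q-LM training data depends on the word segmentation step above. The Tencent system's internals are not described here, but a classic baseline it can be contrasted with is forward maximum matching: greedily take the longest dictionary word at each position. A minimal sketch (vocabulary and maximum word length are illustrative):

```python
def fmm_segment(text, vocab, max_len=4):
    """Forward maximum matching: at each position, emit the longest vocabulary
    word starting there, falling back to a single character."""
    out, i = [], 0
    while i < len(text):
        for length in range(min(max_len, len(text) - i), 0, -1):
            word = text[i:i + length]
            if length == 1 or word in vocab:
                out.append(word)
                i += length
                break
    return out
```

For example, with a vocabulary containing both "北京" and "北京大学", the longer match wins.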