Difference between revisions of "2013-10-18"
From cslt Wiki
Revision as of 09:25, 18 October 2013 (Fri)
Data sharing
- LM count files have still not been delivered!
DNN progress
Sparse DNN
- Optimal Brain Damage (OBD). The initial test shows worse results for OBD-based weight cutting than for simple magnitude-based weight cutting.
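The OBD-vs-magnitude comparison above can be sketched as follows. This is a minimal NumPy sketch, not the actual experiment: the Hessian diagonal here is a random stand-in rather than one computed from a trained network, and all names are hypothetical.

```python
import numpy as np

def magnitude_saliency(w):
    """Baseline: rank weights by absolute value; prune the smallest first."""
    return np.abs(w)

def obd_saliency(w, h_diag):
    """OBD saliency: s_i = H_ii * w_i^2 / 2, using a diagonal
    approximation of the loss Hessian."""
    return 0.5 * h_diag * w ** 2

def prune(w, saliency, fraction):
    """Zero out the given fraction of weights with the lowest saliency."""
    k = int(fraction * w.size)
    idx = np.argsort(saliency.ravel())[:k]
    pruned = w.ravel().copy()
    pruned[idx] = 0.0
    return pruned.reshape(w.shape)

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
h_diag = rng.uniform(0.1, 1.0, size=(4, 4))  # stand-in for the Hessian diagonal
w_mag = prune(w, magnitude_saliency(w), 0.5)  # magnitude-based cutting
w_obd = prune(w, obd_saliency(w, h_diag), 0.5)  # OBD-based cutting
```

The two methods generally zero out different weights: OBD keeps a small weight if the loss is very sensitive to it (large H_ii).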
Tencent exps
N/A
Noisy training
1. On the 863 clean test set, training with car and white noise added at various levels obtained a significant performance improvement.
- car noise test
2. The test set corrupted with both car and white noise also benefits from noisy training.
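"Adding noise at various levels" usually means mixing noise into the clean speech at a target signal-to-noise ratio. A minimal sketch of such mixing (the function name and the random stand-in signals are hypothetical):

```python
import numpy as np

def mix_at_snr(speech, noise, snr_db):
    """Scale `noise` so the mixture has the requested SNR (in dB),
    then add it to the clean speech."""
    # Tile and trim the noise to match the speech length.
    reps = int(np.ceil(len(speech) / len(noise)))
    noise = np.tile(noise, reps)[: len(speech)]
    p_speech = np.mean(speech ** 2)
    p_noise = np.mean(noise ** 2)
    # Noise power needed so that 10*log10(p_speech / p_noise') = snr_db.
    target_p_noise = p_speech / (10 ** (snr_db / 10))
    scale = np.sqrt(target_p_noise / p_noise)
    return speech + scale * noise

rng = np.random.default_rng(1)
clean = rng.normal(size=16000)   # stand-in for a clean utterance
noise = rng.normal(size=8000)    # stand-in for a car/white noise recording
noisy = mix_at_snr(clean, noise, snr_db=10.0)
```

Sweeping `snr_db` over several values produces the "various levels" of corrupted training data.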
Continuous LM
1. The lattice rescoring toolkit is ready. 2. Rescoring is slow on some dense lattices.
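A common rescoring scheme keeps each hypothesis's acoustic score and swaps in the score from the new LM. A minimal sketch on an n-best list (all names hypothetical; the actual toolkit works on lattices, where states are expanded per LM history, which is one reason dense lattices rescore slowly):

```python
def rescore_nbest(nbest, lm_score, lm_weight=1.0):
    """Re-rank hypotheses: keep the acoustic score, replace the old
    LM score with one from the new (continuous) LM.
    `nbest` is a list of (words, am_score, old_lm_score) triples;
    `lm_score` maps a word sequence to a log-probability score."""
    rescored = []
    for words, am, _old_lm in nbest:
        total = am + lm_weight * lm_score(words)
        rescored.append((total, words))
    rescored.sort(reverse=True)  # best (highest) score first
    return rescored

# Toy example: two hypotheses, a dummy LM that penalizes length.
nbest = [(("a", "b"), -10.0, -5.0), (("c",), -12.0, -1.0)]
dummy_lm = lambda words: -1.0 * len(words)
best_score, best_words = rescore_nbest(nbest, dummy_lm)[0]
```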
QA LM
1. Use the QA word segmentation system. 2. Train the Q LM and the QA ASR system.
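Training the Q LM starts from question text that has already passed through the word segmentation system. A minimal n-gram count-collection sketch (the function name is hypothetical; a real setup would feed such counts into an LM toolkit):

```python
from collections import Counter

def bigram_counts(segmented_questions):
    """Collect bigram counts from word-segmented questions
    (each question is a list of words), with sentence markers."""
    counts = Counter()
    for words in segmented_questions:
        for pair in zip(["<s>"] + words, words + ["</s>"]):
            counts[pair] += 1
    return counts

counts = bigram_counts([["how", "are", "you"], ["how", "old"]])
```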