Sinovoice-2016-5-26
Latest revision as of 06:30, 26 May 2016 (Thu)
==Data==
* 16K LingYun
:* 2000h data ready
:* 4300h real-env data to label
* YueYu
:* Total 250h (190h YueYu + 60h English)
:* Add 60h YueYu
:* CER: 75% -> 76%
* WeiYu
:* 8k more data
:* 50h for training
:* 120h labeled data ready
* PingAn
:* 100h User data done
==Model Training==
===Deletion Error Problem===
* Add one noise phone to alleviate the silence over-training; looks OK.
* Omit sil accuracy in discriminative training
* H-smoothing of XEnt and MPE: no significant effect.
* Add one silence arc from start-state to end-state
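The bullets above all attack the same symptom, so deletion errors have to be tracked separately from insertions and substitutions. As a minimal sketch of that breakdown (a toy re-implementation for illustration, not Kaldi's compute-wer or the team's scoring tool):

```python
# Toy scoring helper: align reference and hypothesis with Levenshtein DP
# while counting substitutions, deletions and insertions separately.
def error_breakdown(ref, hyp):
    n, m = len(ref), len(hyp)
    # cost[i][j] = (total, sub, del, ins) for ref[:i] vs hyp[:j]
    cost = [[None] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = (0, 0, 0, 0)
    for i in range(1, n + 1):
        t = cost[i - 1][0]
        cost[i][0] = (t[0] + 1, t[1], t[2] + 1, t[3])   # all deletions
    for j in range(1, m + 1):
        t = cost[0][j - 1]
        cost[0][j] = (t[0] + 1, t[1], t[2], t[3] + 1)   # all insertions
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if ref[i - 1] == hyp[j - 1]:
                cost[i][j] = cost[i - 1][j - 1]
                continue
            s, d, a = cost[i - 1][j - 1], cost[i - 1][j], cost[i][j - 1]
            cost[i][j] = min((s[0] + 1, s[1] + 1, s[2], s[3]),   # substitute
                             (d[0] + 1, d[1], d[2] + 1, d[3]),   # delete
                             (a[0] + 1, a[1], a[2], a[3] + 1))   # insert
    total, sub, dele, ins = cost[n][m]
    return {"sub": sub, "del": dele, "ins": ins,
            "wer": 100.0 * total / max(n, 1)}

print(error_breakdown("a b c d".split(), "a c d".split()))
# one deletion out of four reference words -> WER 25.0
```

A silence-prone acoustic model shows up here as a rising "del" count even when the overall WER barely moves.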
===Big-Model Training===
====16k====
* Done!
====8k====
=====Project=====
* PingAn
=========================================================================================
| AM / config                         | all   | KeHu wer || KeHu no-ins |
-----------------------------------------------------------------------------------------
| tdnn 7-2048 xEnt                    | 16.45 | 36.49    || 25.18       |
| tdnn 7-2048 MPE                     | 15.22 | 32.77    || 23.48       |
| tdnn 7-2048 MPE adapt-PABX          | 14.67 | 31.33    || 22.76       |
-----------------------------------------------------------------------------------------
| tdnn 7-1024 xEnt                    | 16.60 | 35.91    || 25.58       |
| tdnn 7-1024 MPE 2e-6                | 15.67 | 32.77    || 26.09       |
| tdnn 7-1024 MPE 2e-5 1.mdl          | 15.54 | 32.77    || 26.29       |
| tdnn 7-1024 MPE 1e-5 4.mdl          | 15.76 | 33.55    || 27.20       |
| tdnn 7-1024 MPE adapt-PABX          | 14.80 | 30.48    || 22.56       |
-----------------------------------------------------------------------------------------
| spn 7-1024 xEnt                     | 16.49 | 36.23    || 24.59       |
| spn 7-1024 xEnt xEnt-PA_user 101.mdl| 16.19 | 33.22    || 22.69       |
| spn 7-1024 xEnt xEnt-PA_user mpe    | 15.24 | 32.77    || 21.65       |
| spn 7-1024 MPE-1000H 23.mdl         | 15.29 | 33.09    || 21.65       |
| spn 7-1024 MPE adapt-PA_all 29.mdl  | 15.11 | 33.42    || 21.84       |
| spn 7-1024 MPE adapt-PA_user 2e-5   | 15.31 | 31.79    || 20.14       |
| spn 7-1024 MPE adapt-PA_user Hs 2e-5| 15.32 | 32.24    || 20.93       |
=========================================================================================
=====================================================================================================
| LM / config                                | KeHu  | KeHu check_zxm_recheck | KeHu final |
-----------------------------------------------------------------------------------------------------
| baseline                                   | 20.14 | 19.40                  | 18.26      |
-----------------------------------------------------------------------------------------------------
| bank+baoxian.chart+word.w0.9               | -     | 19.27                  | 17.88      |
| bank+baoxian+guojiadianwang.chart+word.w0.9| -     | 19.02                  | 17.94      |
| bank+baoxian+guojiadianwang_w0.9           | -     | 19.02                  | -          |
| bank+baoxian_w0.9                          | -     | 19.20                  | -          |
| baoxian+bank_w0.9                          | -     | 19.02                  | -          |
| baoxian+user200h.chart+word.w0.9           | -     | 19.20                  | -          |
| baoxian+user200h_w0.8                      | -     | 19.02                  | -          |
| baoxian+user200h_w0.9                      | -     | 19.08                  | 18.01      |
| baoxian+user200h.w0.9w0.9                  | -     | 19.08                  | -          |
| baoxian+user200h.chart1e-7.w0.9w0.1        | -     | -                      | 23.82      |
| baoxian+user200h.chart1e-7.w0.9w0.9        | -     | -                      | 18.26      |
=====================================================================================================
* LiaoNingYiDong
=========================================================================
| AM / config                      | LNYD  | LNYD re-tag |
-------------------------------------------------------------------------
| tdnn 7-2048 xEnt                 | 21.51 |             |
| tdnn 7-2048 MPE                  | 20.09 |             |
| tdnn 7-2048 MPE adapt-LNYD       | 17.92 | 16.29       |
-------------------------------------------------------------------------
| tdnn 7-1024 xEnt                 | 21.72 |             |
| tdnn 7-1024 MPE                  | 20.99 |             |
| cnn 7-1024 xEnt 600.mdl          | 21.03 |             |
| cnn 7-1024 MPE 12.mdl            | 19.80 |             |
| cnn 7-1024 MPE adapt-LNYD 41.mdl | 17.96 | 15.93       |
-------------------------------------------------------------------------
| spn 7-1024 xEnt                  | 21.70 |             |
| spn 7-1024 MPE-1000H 23.mdl      | 19.97 |             |
| spn 7-1024 MPE adapt-LNYD        | 18.67 |             |
| spn cnn 7-1024 xEnt 300.mdl      | 22.26 |             |
=========================================================================
==Embedding==
* The nnet1 AM is 6.4M (3M after decomposition), so we need to keep the AM size within 10M.
* 5*500-2400 TDNN no-svd / svd-100 models: MPE training done
LM=1e-5, beam=9, max-active=5000
=============================================================================================================
| AM / testset                   | test_1000ju | test_2000ju | test_8000ju | test_10000ju |
-------------------------------------------------------------------------------------------------------------
| nnet1 4*600+800 xEnt (6.4M)    | 25.30       | 40.48       |             |              |
| nnet1 4*600+800 mpe (6.4M)     | 20.75       | 35.33       |             |              |
-------------------------------------------------------------------------------------------------------------
| nnet3 5*500 mpe (13M)          | 16.18       | 29.53       |             |              |
| nnet3 5*500 svd-100 mpe (9.5M) | 17.69       | 30.11       |             |              |
=============================================================================================================
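The svd-100 variant above shrinks the 13M nnet3 model to 9.5M by factoring each affine weight matrix into two low-rank factors. A minimal NumPy sketch of that decomposition; the 2400x500 shape merely echoes the 5*500-2400 layer sizes and is an assumption about which matrices get factored:

```python
import numpy as np

def svd_compress(W, rank):
    """Replace W (out_dim x in_dim) with factors A @ B of the given rank,
    cutting parameters from out*in to (out + in) * rank."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * s[:rank]   # out_dim x rank, singular values folded in
    B = Vt[:rank, :]             # rank x in_dim
    return A, B

rng = np.random.default_rng(0)
W = rng.standard_normal((2400, 500))
A, B = svd_compress(W, 100)
orig = W.size                 # 1,200,000 weights
compressed = A.size + B.size  # (2400 + 500) * 100 = 290,000 weights
```

In practice the factorized model is then retrained (here, further MPE training) to recover the accuracy lost to the low-rank approximation, which is consistent with the small WER gap between the mpe and svd-100 rows.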
==Character LM==
* Except for Sogou-2T, the 9-gram has been done.
* Adding word-boundary tags to Character-LM training: done.
:* 9-gram
::* Except Weibo & Sogou-2T
::* 1e-7 (13M): WER 17.91, compared with 13.4 for 1e-7 (no boundary, 71M)
::* 1e-8 (54M): WER 17.54
* Prepare specific-domain vocabularies
:* Dianxin / Baoxian / Dianli
* DT LM training
:* ReFr
* Merge Character-LM & word-LM
:* Union
:* Compose: success.
:* 2-step decoding: first, character-based LM; then, word-based LM.
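The w0.8/w0.9 suffixes in the LM results table above are presumably interpolation weights between the component LMs. A toy sketch of linear LM interpolation, using hypothetical unigram tables (real systems mix full n-grams, e.g. with SRILM's -mix-lm):

```python
def mix_lms(lm_a, lm_b, w=0.9):
    """Linear interpolation of two probability tables:
    p(x) = w * p_a(x) + (1 - w) * p_b(x).
    Toy unigram stand-in for n-gram LM mixing."""
    vocab = set(lm_a) | set(lm_b)
    return {x: w * lm_a.get(x, 0.0) + (1 - w) * lm_b.get(x, 0.0)
            for x in vocab}

# Hypothetical mini-LMs; the real components are the baoxian/user200h n-grams.
baoxian = {"hello": 0.6, "claim": 0.4}
user200h = {"hello": 0.2, "account": 0.8}
mixed = mix_lms(baoxian, user200h, w=0.9)
# a convex combination of two distributions still sums to 1
```

With w=0.9 the first (in-domain) LM dominates, which matches the pattern in the table: most configurations put the heavy weight on the domain-specific component.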
===Problems===
* PingAn & YueYu show too many deletion errors.
:* TDNN deletion error rate > DNN deletion error rate
:* TDNN silence scale is too sensitive across different test cases.
* CMVN causes a performance reduction.
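On the CMVN point: per-utterance cepstral mean and variance normalization is just a per-dimension standardization of the feature matrix, sketched below. Whether the experiment used mean-only or mean+variance normalization is not stated in the notes.

```python
import numpy as np

def cmvn(feats, norm_var=True):
    """Per-utterance CMVN: zero-mean (and optionally unit-variance)
    each feature dimension. feats: (num_frames, feat_dim) array."""
    out = feats - feats.mean(axis=0)
    if norm_var:
        out = out / np.maximum(feats.std(axis=0), 1e-10)  # guard flat dims
    return out
```

Per-utterance statistics are noisy on short utterances, which is one plausible reason such normalization can hurt on telephone-style test sets.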
==SiaSun Robot==
* Beam-forming algorithm test
* NN-model-based beam-forming
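The notes do not say which beam-forming algorithm is under test; as a reference point, the classical delay-and-sum beamformer can be sketched in a few lines (integer-sample delays only, for simplicity):

```python
import numpy as np

def delay_and_sum(channels, delays, sample_rate=16000):
    """Delay-and-sum beamformer sketch: undo each microphone's steering
    delay (in seconds) with an integer-sample shift, then average.
    np.roll wraps at the edges, which is acceptable for a toy example."""
    n = min(len(c) for c in channels)
    out = np.zeros(n)
    for ch, d in zip(channels, delays):
        shift = int(round(d * sample_rate))
        out += np.roll(np.asarray(ch[:n], dtype=float), -shift)
    return out / len(channels)
```

Averaging the time-aligned channels reinforces the signal from the steered direction while averaging down uncorrelated noise; an NN-based front end would instead learn the per-channel filters.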
==SID==
===Digit===
* Engine Package