Chao Xing

Last Week:

Solution:

 1. The DNN program had some bugs; after fixing them, the DNN reaches the same performance as the linear model.
    The problem was in the training process: the number of training samples was set equal to the number of speakers, so the model only computed about 50% of the training samples (see the sketch below).
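
A minimal sketch of the kind of loop fix described in item 1, assuming hypothetical names (run_epoch, train_pairs, model.update); the report does not show the actual training code.

 # Hypothetical sketch of the training-loop fix (Python).
 def run_epoch(model, train_pairs, speakers, learning_rate=0.01):
     # Buggy version: the loop was bounded by the number of speakers, so only
     # len(speakers) of the len(train_pairs) samples were used per epoch
     # (roughly 50% in this setup):
     #     for i in range(len(speakers)):
     #         model.update(train_pairs[i], learning_rate)
     # Fixed version: iterate over every training sample.
     for sample in train_pairs:
         model.update(sample, learning_rate)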

Plan to do:

 (Hold)

Problem:

 1. The DNN is still not better than the linear model.
 


This Week:

Solution:
 1. Train the DNN with the dropout method, and add SAA to reduce the randomness of the results (a sketch of a DNN with dropout is given below).
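
A minimal sketch of a feed-forward DNN with a dropout layer, for item 1 above; the layer sizes, activation, and dropout rate are assumptions, and the SAA step is not shown because the report does not specify it.

 # Hypothetical sketch of a small feed-forward DNN with dropout (PyTorch).
 import torch.nn as nn

 class TransformDNN(nn.Module):
     def __init__(self, dim_in=100, dim_hidden=256, dim_out=100, p_drop=0.5):
         super().__init__()
         self.net = nn.Sequential(
             nn.Linear(dim_in, dim_hidden),
             nn.Tanh(),
             nn.Dropout(p_drop),   # randomly zeroes hidden units during training
             nn.Linear(dim_hidden, dim_out),
         )

     def forward(self, x):
         return self.net(x)
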
Plan to do:
 1. Run the experiments with a small vocabulary size, such as 5000.
 2. Run and test some ideas.