ASR:2015-01-26
From cslt Wiki
Latest revision as of 02:16, 30 January 2015
Speech Processing
AM development
Environment
- The gpu760 of grid-14 is being repaired.
- grid-11 often shuts down automatically, and its computation speed is too slow.
RNN AM
- details at http://liuc.cslt.org/pages/rnnam.html
Dropout & Maxout & rectifier
- Need to solve the problem of the learning rate being too small.
- 20h small-scale sparse DNN with rectifier. --Chao Liu
- 20h small-scale sparse DNN with Maxout/rectifier based on weight-magnitude pruning (see the sketch below). --Mengyuan Zhao
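For reference on the weight-magnitude pruning and rectifier/Maxout activations named above, a minimal NumPy sketch; the layer sizes, sparsity level and group size are illustrative assumptions, not values from these experiments.
<pre>
import numpy as np

def magnitude_prune(weights, sparsity=0.7):
    """Zero out the smallest-magnitude weights until `sparsity` of them are removed."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

def rectifier(x):
    """Rectifier (ReLU) activation."""
    return np.maximum(x, 0.0)

def maxout(x, groups=2):
    """Maxout: take the maximum over `groups` consecutive hidden units."""
    return x.reshape(x.shape[0], -1, groups).max(axis=2)

# Toy forward pass through one pruned hidden layer (40 inputs, 64 hidden units).
W = magnitude_prune(np.random.randn(40, 64))
a = np.random.randn(8, 40) @ W
h_relu = rectifier(a)        # shape (8, 64)
h_max = maxout(a, groups=2)  # shape (8, 32)
</pre>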
Convolutive network
- Convolutive network (DAE)
- http://cslt.riit.tsinghua.edu.cn/cgi-bin/cvss/cvss_request.pl?account=zhangzy&step=view_request&cvssid=311
- Technical report to be drafted by Mian Wang, Yiye Lin, Shi Yin and Mengyuan Zhao.
DNN-DAE (Deep Auto-Encoder DNN)
- Technical report to be drafted by Xiangyu Zeng, Shi Yin, Mengyuan Zhao and Zhiyong Zhang (see the DAE sketch below).
- http://cslt.riit.tsinghua.edu.cn/cgi-bin/cvss/cvss_request.pl?account=zhangzy&step=view_request&cvssid=318
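As background for the DAE items above (convolutive DAE and DNN-DAE), a minimal NumPy sketch of one tied-weight denoising-autoencoder layer trained with masking noise; the dimensions, noise level and learning rate are illustrative assumptions.
<pre>
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def dae_step(x, W, b_h, b_v, noise=0.3, lr=0.01):
    """One gradient step of a tied-weight denoising autoencoder on batch x."""
    x_noisy = x * (rng.random(x.shape) > noise)   # masking noise on the input
    h = sigmoid(x_noisy @ W + b_h)                # encode
    x_rec = sigmoid(h @ W.T + b_v)                # decode with tied weights
    err = x_rec - x                               # reconstruct the clean input
    d_rec = err * x_rec * (1.0 - x_rec)           # backprop through decoder sigmoid
    d_h = (d_rec @ W) * h * (1.0 - h)             # backprop through encoder sigmoid
    W -= lr * (x_noisy.T @ d_h + d_rec.T @ h)     # tied-weight gradient
    b_h -= lr * d_h.sum(axis=0)
    b_v -= lr * d_rec.sum(axis=0)
    return float((err ** 2).mean())

# Toy usage: 40-dim features, 64 hidden units.
W = 0.1 * rng.standard_normal((40, 64))
b_h, b_v = np.zeros(64), np.zeros(40)
loss = dae_step(rng.random((16, 40)), W, b_h, b_v)
</pre>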
RNN-DAE (Deep Auto-Encoder RNN)
- http://cslt.riit.tsinghua.edu.cn/cgi-bin/cvss/cvss_request.pl?account=zhangzy&step=view_request&cvssid=261
- HOLD
VAD
- DAE
- Technical report --Shi Yin
Speech rate training
- http://cslt.riit.tsinghua.edu.cn/cgi-bin/cvss/cvss_request.pl?account=zhangzy&step=view_request&cvssid=268
- Technical report to be drafted by Shi Yin.
- Prepare for ChinaSIP
Confidence
- Reproduce the experiments on the Fisher dataset.
- Use the Fisher DNN model to decode the all-WSJ dataset.
- Prepare scoring for the puqiang data.
- HOLD
Neural network visualization
Speaker ID
Language ID
- GMM-based language ID is ready.
- Delivered to Jietong
- Prepare the test-case
- http://cslt.riit.tsinghua.edu.cn/cgi-bin/cvss/cvss_request.pl?account=zhangzy&step=view_request&cvssid=328
Voice Conversion
- Yiye is reading materials
- HOLD
Text Processing
LM development
Domain specific LM
- LM2.1
- Mixing of the sougou2T-lm with KN discounting continues.
- Train a large LM using the 25w dict (hanzhenglong/wxx).
- Add more data, including POI and document information.
- Add the v1.0 vocab and filter out useless words (see the filtering sketch below).
- Set up the test set.
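For the vocabulary step above, a minimal Python sketch of filtering the LM training text against a fixed vocabulary, mapping out-of-vocabulary words to <unk>; the file names and the one-word-per-line vocab format are assumptions for illustration.
<pre>
def load_vocab(path):
    """Read a vocabulary file (assumed one word per line) into a set."""
    with open(path, encoding="utf-8") as f:
        return {line.split()[0] for line in f if line.strip()}

def filter_corpus(in_path, out_path, vocab, unk="<unk>"):
    """Rewrite the training text, mapping out-of-vocabulary words to <unk>."""
    with open(in_path, encoding="utf-8") as fin, \
         open(out_path, "w", encoding="utf-8") as fout:
        for line in fin:
            words = [w if w in vocab else unk for w in line.split()]
            fout.write(" ".join(words) + "\n")

# Example: vocab = load_vocab("vocab_v1.0.txt"); filter_corpus("train.txt", "train.filtered.txt", vocab)
</pre>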
tag LM
- Tag LM
- The tag probability weighting should be tested (hanzhenglong) and handed over to hanzhenglong ("this month").
- Similar-word extension in the FST.
- Write a draft of a paper.
- Result: 16.32 -> 10.23.
RNN LM
- RNN
- Test the WER of the RNNLM on Chinese data from jietong-data.
- Generate an n-gram model from the RNNLM and test the PPL with different-sized text (see the perplexity sketch below).
- LSTM+RNN
- Check the lstm-rnnlm code for how to initialize and update the learning rate. (hold)
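For the PPL test mentioned above, a minimal sketch of how perplexity is computed from per-word probabilities; the logprob(word, history) interface is an assumption standing in for whichever LM (n-gram or RNN) is being evaluated.
<pre>
def perplexity(sentences, logprob):
    """sentences: lists of tokens; logprob(word, history) returns log10 P(word | history)."""
    total_logp, n_words = 0.0, 0
    for sent in sentences:
        history = ["<s>"]
        for word in sent + ["</s>"]:
            total_logp += logprob(word, tuple(history))
            n_words += 1
            history.append(word)
    return 10.0 ** (-total_logp / n_words)
</pre>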
Word2Vector
W2V based doc classification
- Data preparation (a classification sketch follows below).
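A minimal sketch of W2V-based document classification: average the word vectors of each document and train a linear classifier; the pretrained-vector dictionary and dimensionality are assumptions for illustration.
<pre>
import numpy as np
from sklearn.linear_model import LogisticRegression

def doc_vector(tokens, w2v, dim=100):
    """Average the vectors of in-vocabulary tokens; zero vector if none are covered."""
    vecs = [w2v[t] for t in tokens if t in w2v]
    return np.mean(vecs, axis=0) if vecs else np.zeros(dim)

def train_doc_classifier(docs, labels, w2v, dim=100):
    """docs: list of token lists; w2v: dict word -> vector (assumed pretrained); labels: class ids."""
    X = np.vstack([doc_vector(d, w2v, dim) for d in docs])
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X, labels)
    return clf
</pre>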
Knowledge vector
- Knowledge vector
- Run on the big data set.
- Prepare the paper.
- Result
Character to word
- Character-to-word conversion (hold)
Translation
- v5.0 demo released
- Cut the dict and use the new segmentation tool.
Sparse NN in NLP
- Write a technical report.
QA
improve fuzzy match
- Add synonym similarity using the MERT-4 method. (hold)
improve lucene search
- Add more features to improve search.
- POS, NER, TF, IDF, etc.
- Extract more lexical, syntactic and semantic features to improve re-ranking performance (see the re-ranking sketch below).
- Use sentence vectors.
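For the re-ranking item above, a minimal sketch of re-scoring Lucene candidates with a weighted combination of extra features; the feature names and weights are illustrative assumptions.
<pre>
def rerank(candidates, features, weights):
    """candidates: list of ids; features: dict id -> dict {feature name: value}."""
    def score(cid):
        return sum(weights.get(name, 0.0) * value
                   for name, value in features[cid].items())
    return sorted(candidates, key=score, reverse=True)

# Toy usage with assumed feature names.
weights = {"bm25": 1.0, "tf": 0.3, "idf": 0.3, "ner_overlap": 0.5, "pos_match": 0.2}
features = {
    "doc1": {"bm25": 2.1, "tf": 0.4, "ner_overlap": 1.0},
    "doc2": {"bm25": 2.4, "tf": 0.1, "ner_overlap": 0.0},
}
print(rerank(["doc1", "doc2"], features, weights))
</pre>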
context framework
- Code for organization.
- Change to a knowledge graph.
query normalization
- Use NER to normalize words.
- The new intern will install SEMPRE.