ASR:2015-04-20

Revision as of 04:47, 20 April 2015

Speech Processing

AM development

Environment

  • grid-11 often shuts down automatically; computation speed is too slow.
  • add a server (760)

RNN AM


Mic-Array

  • investigate the alpha parameter in the time domain and the frequency domain
  • ALPHA >= 0, using data generated by the reverberation toolkit
  • consider theta


Convolutive network

  • HOLD
  • CNN + DNN feature fusion
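
A minimal sketch of one common reading of "CNN + DNN feature fusion": the hidden representations of a CNN front-end and a DNN front-end are concatenated and fed to a shared output layer. The layer sizes and matrix names below are assumptions, not the group's actual architecture.

    import numpy as np

    def fused_forward(cnn_feat, dnn_feat, W_cnn, W_dnn, W_out):
        # Concatenate CNN- and DNN-side hidden layers before a shared
        # output layer (sketch of feature fusion by concatenation).
        h_cnn = np.maximum(0.0, W_cnn @ cnn_feat)
        h_dnn = np.maximum(0.0, W_dnn @ dnn_feat)
        h = np.concatenate([h_cnn, h_dnn])           # feature fusion
        return W_out @ h                             # shared output logits

    # toy usage with hypothetical dimensions
    cnn_feat, dnn_feat = np.random.randn(256), np.random.randn(440)
    W_cnn = np.random.randn(512, 256) * 0.01
    W_dnn = np.random.randn(512, 440) * 0.01
    W_out = np.random.randn(1000, 1024) * 0.01
    logits = fused_forward(cnn_feat, dnn_feat, W_cnn, W_dnn, W_out)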

RNN-DAE (RNN-based Deep Auto-Encoder)


Speaker ID

i-vector based ASR

  • hold

Dark knowledge

bilingual recognition

Text Processing

tag LM

  • similar word extension in FST
  • will check the formula using Bayes' rule and run experiments (formula sketched below)
  • add a similarity weight
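
A hedged sketch of the Bayes decomposition this presumably refers to, assuming the tag LM scores a word through its tag and that similar-word extension in the FST reuses a known word's probability scaled by a normalized similarity weight; the notation below is an assumption, not the group's exact formula:

    P(w \mid h) \;\approx\; P(\mathrm{tag}(w) \mid h)\, P(w \mid \mathrm{tag}(w))

    P(w' \mid h) \;\approx\; \frac{\mathrm{sim}(w, w')}{\sum_{v} \mathrm{sim}(w, v)}\; P(w \mid h)

where h is the history, tag(w) the word's tag, and w' a similar word added to the FST.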

RNN LM

  • rnn
  • test the PPL and code the character LM
  • lstm+rnn
  • check the LSTM-RNNLM code on how to initialize and update the learning rate (hold); see the sketch below
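
A minimal sketch of the rnnlm-style learning-rate schedule the last bullet likely refers to: keep the initial rate until the validation perplexity stops improving by a minimum factor, then halve it every epoch. The thresholds, the 0.1 starting rate, and the function name are assumptions.

    def update_learning_rate(lr, prev_valid_ppl, valid_ppl,
                             min_improvement=1.003, decay=0.5, halving=False):
        # rnnlm-style schedule (sketch): once validation PPL improves by less
        # than `min_improvement`, start halving the rate every epoch.
        if prev_valid_ppl is not None and prev_valid_ppl / valid_ppl < min_improvement:
            halving = True
        if halving:
            lr *= decay
        return lr, halving

    # usage sketch with hypothetical validation perplexities per epoch
    lr, halving, prev = 0.1, False, None
    for ppl in [220.0, 180.0, 165.0, 163.5, 163.4]:
        lr, halving = update_learning_rate(lr, prev, ppl, halving=halving)
        prev = ppl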

W2V based document classification

  • results for the norm model [1] (see the averaging sketch below)
  • try CNN model
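
A minimal sketch of w2v-based document classification as it is commonly done, assuming "norm model" refers to length-normalizing the word vectors before averaging: the averaged vector is then fed to any standard classifier. The toy embedding table and dimension are illustrative.

    import numpy as np

    def doc_vector(tokens, embeddings, dim=100, normalize=True):
        # Average word2vec vectors over the document; unknown words are skipped.
        # normalize=True L2-normalizes each word vector first (one plausible
        # reading of the "norm model" above).
        vecs = []
        for tok in tokens:
            if tok in embeddings:
                v = embeddings[tok]
                if normalize:
                    v = v / (np.linalg.norm(v) + 1e-8)
                vecs.append(v)
        return np.mean(vecs, axis=0) if vecs else np.zeros(dim)

    # toy usage with a hypothetical embedding table; the document vector would
    # then be fed to a linear classifier (logistic regression, SVM, ...)
    emb = {w: np.random.randn(100) for w in "this movie is great".split()}
    x = doc_vector("this movie is great".split(), emb)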

Translation

  • v5.0 demo released
  • cut down the dictionary and use the new segmentation tool

Sparse NN in NLP

  • test the order feature; need some results
  • large dimension result: http://cslt.riit.tsinghua.edu.cn/cgi-bin/cvss/cvss_request.pl?account=lr&step=view_request&cvssid=344
      • sparse-NN with 1000 dimensions (1e-6, 0.705236) is better than with 200 dimensions (1e-12, 0.694678); see the sketch below
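
A minimal sketch of one reading of the numbers above, assuming the first value in each pair is an L1 sparsity penalty on the hidden activations and the second is the evaluation score; the loss below adds that penalty so a wide (e.g. 1000-dimensional) hidden layer stays mostly inactive. Layer sizes, the placement of the penalty, and variable names are assumptions.

    import numpy as np

    def sparse_hidden_loss(x, y_true, W1, W2, l1_penalty=1e-6):
        # Single-hidden-layer net with an L1 sparsity term on the hidden
        # activations (sketch; 1e-6 vs 1e-12 above are read as this penalty).
        h = np.maximum(0.0, W1 @ x)                 # ReLU hidden layer
        logits = W2 @ h
        p = np.exp(logits - logits.max())
        p /= p.sum()                                # softmax
        ce = -np.log(p[y_true] + 1e-12)             # cross-entropy
        return ce + l1_penalty * np.abs(h).sum()    # sparsity penalty

    # hypothetical 1000-dimensional hidden layer on a 50-dimensional input
    x = np.random.randn(50)
    W1 = np.random.randn(1000, 50) * 0.01
    W2 = np.random.randn(10, 1000) * 0.01
    loss = sparse_hidden_loss(x, 3, W1, W2, l1_penalty=1e-6)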

online learning

  • modified the ListNet SGD (see the sketch below)
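
A minimal sketch of a ListNet SGD step with a linear scorer, presumably the kind of update that was modified: the loss is the cross entropy between the top-one probabilities induced by the relevance labels and by the model scores, and its gradient with respect to the scores is softmax(scores) - softmax(labels). The linear model and variable names are assumptions.

    import numpy as np

    def softmax(z):
        e = np.exp(z - z.max())
        return e / e.sum()

    def listnet_sgd_step(W, X, y, lr=0.01):
        # One ListNet SGD step for a single query (sketch).
        # X: (n_docs, n_feats) features of the candidate documents.
        # y: (n_docs,) relevance labels.
        scores = X @ W
        grad_scores = softmax(scores) - softmax(y)   # d(cross-entropy)/d(scores)
        grad_W = X.T @ grad_scores
        return W - lr * grad_W

    # toy usage: one query, 4 candidate documents, 5 features
    X = np.random.randn(4, 5)
    y = np.array([2.0, 0.0, 1.0, 0.0])
    W = np.zeros(5)
    for _ in range(100):
        W = listnet_sgd_step(W, X, y)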

relation classifier

  • check the CNN code and contact the paper's author
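
A minimal sketch of the CNN typically used for relation classification (convolution over word embeddings plus max-pooling over time), roughly what the code being checked presumably implements; window size, dimensions, and names are assumptions.

    import numpy as np

    def cnn_sentence_features(emb_seq, filters, bias):
        # Convolve filters over the word-embedding sequence and max-pool
        # over time (sketch of the usual relation-classification CNN).
        # emb_seq: (seq_len, emb_dim); filters: (n_filters, window, emb_dim).
        n_filters, window, emb_dim = filters.shape
        seq_len = emb_seq.shape[0]
        conv = np.zeros((seq_len - window + 1, n_filters))
        for t in range(seq_len - window + 1):
            patch = emb_seq[t:t + window].ravel()
            conv[t] = filters.reshape(n_filters, -1) @ patch + bias
        return np.tanh(conv).max(axis=0)             # max-pool over time

    # toy usage: 10-word sentence, 50-dim embeddings, 100 filters of width 3;
    # the pooled features would feed a softmax over relation types
    emb_seq = np.random.randn(10, 50)
    filters = np.random.randn(100, 3, 50) * 0.01
    feat = cnn_sentence_features(emb_seq, filters, np.zeros(100))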