Difference between revisions of "Weekly reading"

From cslt Wiki
  
 
*[[媒体文件:2015 Transitive Transfer Learning.pdf|汤志远 2015-8-26 Transitive Transfer Learning]]
 
*[[媒体文件:2015 Supervised Transfer Sparse Coding.pdf|汤志远 2015-8-26 Supervised Transfer Sparse Coding]]
  
 
*[[媒体文件:2015_EESEN-End-to-end speech recognition using deep rnn models and WFST-based decoding.pdf|张之勇 2015-8-26 EESEN: End-to-end speech recognition using deep RNN models and WFST-based decoding]]
 
*[[媒体文件:2015_Dither is better than dropout for regularising deep neural networks.pdf|张之勇 2015-8-26 Dither is better than dropout for regularising deep neural networks]]
 

Revision as of 06:58, 26 August 2015 (Wed)