2018-12-12

From cslt Wiki
Latest revision as of 03:50, 12 December 2018

People | Last Week | This Week | Task Tracking (Deadline)
Yibo Liu
  Last week:
  • Read how the code of poem_acl2017 differs from the code of NMT.
  • Checked book references.
  This week:
  • Finish modifying the memory-module code of poem_acl2017.
Xiuqi Jiang
  Last week:
  • Studied the TensorFlow documentation on the seq2seq model and adjusted vivi's parameters.
  • Had some problems with encoder-memory and decoder-memory, and was trying to figure out the difference.
  This week:
  • Make adjustments on vivi.
  • Reproduce the code for Song Iambics generation.
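The encoder-memory vs. decoder-memory question above can be sketched minimally. This is a hypothetical illustration (function names and shapes are assumptions, not vivi's actual code): in a seq2seq model, "encoder memory" usually means attending over the encoder's output states, while "decoder memory" means attending over the decoder's own previously generated states; the read operation is identical and only the memory being read differs.

```python
# Minimal dot-product attention sketch; only the memory argument distinguishes
# encoder-memory from decoder-memory reads.
import numpy as np

def attention_read(query, memory):
    """Dot-product attention: query (d,), memory (T, d) -> context vector (d,)."""
    scores = memory @ query                 # one score per memory slot, shape (T,)
    weights = np.exp(scores - scores.max()) # softmax over the T slots
    weights /= weights.sum()
    return weights @ memory                 # weighted sum of memory rows

d = 4
rng = np.random.default_rng(1)
encoder_states = rng.standard_normal((10, d))  # "encoder memory": source states
decoder_history = rng.standard_normal((3, d))  # "decoder memory": past outputs
q = np.ones(d)                                 # current decoder query

ctx_enc = attention_read(q, encoder_states)    # attend to the source sentence
ctx_dec = attention_read(q, decoder_history)   # attend to what was generated so far
```

The design point: both reads share one mechanism, so a model can maintain the two memories side by side and concatenate the resulting contexts.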
Jiayao Wu
  Last week:
  • Ran the AISHELL2 baseline.
  • Read a few papers about low-rank matrix factorization for neural networks.
  This week:
  • Organize the fragmentary notes into a systematic picture of model compression as soon as possible.
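As a reference point for the compression reading above, here is a minimal sketch (my illustration, not from the papers cited) of the core idea of low-rank matrix factorization: a dense layer's weight matrix W of shape (m, n) is approximated by a product U @ V of rank r, shrinking the parameter count from m*n to r*(m + n).

```python
# Truncated-SVD factorization of a weight matrix, the standard low-rank recipe.
import numpy as np

def low_rank_factorize(W, r):
    """Return U (m, r) and V (r, n) such that U @ V is the best rank-r fit to W."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    U_r = U[:, :r] * s[:r]   # fold the singular values into the left factor
    V_r = Vt[:r, :]
    return U_r, V_r

rng = np.random.default_rng(0)
W = rng.standard_normal((256, 512))
U_r, V_r = low_rank_factorize(W, r=32)

params_before = W.size               # 256 * 512 = 131072
params_after = U_r.size + V_r.size   # 32 * (256 + 512) = 24576
rel_err = np.linalg.norm(W - U_r @ V_r) / np.linalg.norm(W)
```

In a network, the single dense layer is then replaced by two thinner layers (n→r and r→m), typically followed by fine-tuning to recover accuracy.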
Zhaodi Qi
  Last week:
  • Attended an academic conference.
  This week:
  • Continue research on DID.
Jiawei Yu
  Last week:
  • Read the attention recipe in Kaldi and tried to establish a baseline using thchs30.
  • Read "Phonetic-Attention Scoring for Deep Speaker Features in Speaker Verification" and "Gaussian-Constrained Training for Speaker Verification".
  This week:
  • Finish the attention baseline.
Yunqi Cai
  Last week:
  • Read "Introduction to modern machine learning technology".
  • Checked the references.
  • Did some preparation on how to use Kaldi.
  This week:
  • Run the thchs30 demo.
Dan He
  Last week:
  • Read papers about tensor decompositions.
  • Prepared a paper-sharing session on tensorizing neural networks.
  This week:
  • Run the code for the experimental part of the paper.
  • Learn more about the details.
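For the tensor-decomposition reading above, a toy sketch may help fix the idea (my illustration under assumed names, not the paper's code): "tensorizing" a layer means storing its weights in a factorized tensor format built from unfoldings and SVDs. Here is the smallest possible version, splitting a 3-way tensor into two tensor-train-style cores with a single SVD.

```python
# Two-core tensor-train split of a 3-way tensor via one SVD on its first unfolding.
import numpy as np

def tt_two_cores(T, rank):
    """Split T (a, b, c) into cores G1 (a, rank) and G2 (rank, b, c)."""
    a, b, c = T.shape
    M = T.reshape(a, b * c)                      # unfold along the first mode
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    G1 = U[:, :rank] * s[:rank]                  # absorb singular values into G1
    G2 = Vt[:rank, :].reshape(rank, b, c)
    return G1, G2

T = np.random.default_rng(3).standard_normal((4, 3, 5))
G1, G2 = tt_two_cores(T, rank=4)                 # rank 4 = full rank here: exact
T_rec = np.einsum('ar,rbc->abc', G1, G2)         # contract the cores back
```

Choosing rank below the full rank trades reconstruction error for fewer stored parameters, which is the same trade-off the tensorized layers exploit at scale.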
Yang Zhang
  Last week:
  • Refactored my code and accelerated VPR speed.
  • Fixed some reference errors in the Machine Learning book.
  This week:
  • Try to release a stable app as soon as possible.