Huilian-cslt-week-2016-1-14

From cslt Wiki

Revision as of 01:56, 14 January 2016

Environment

RNN Poem Processing

  • attention training from Renmin newspaper done --Qixin Wang
  • change corpus done
  • contact specialists of Song poems done
  • write a patent done
  • submit a patent done
  • demo release (unified model) done
  • add information to the language model
  • contact specialists
  • train and check
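The attention training noted above can be sketched in a few lines; this is a minimal dot-product attention example in NumPy (all names are illustrative, not the project's actual code):

```python
import numpy as np

def attention_context(decoder_state, encoder_states):
    """Dot-product attention: score each encoder state against the
    current decoder state, softmax the scores over time steps, and
    return the weighted sum as the context vector for generation."""
    scores = encoder_states @ decoder_state        # shape (T,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                       # softmax over T steps
    return weights @ encoder_states                # shape (H,) context

# Toy example: 4 encoder states with hidden size 3.
enc = np.array([[1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0],
                [0.0, 0.0, 1.0],
                [1.0, 1.0, 0.0]])
ctx = attention_context(np.array([1.0, 0.0, 0.0]), enc)
```

In a poem generator, `encoder_states` would come from the input context and the context vector would condition the next-character prediction.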

RNN Writing Processing

  • write code done -- Chao Xing
  • generate Xing style using a variational RNN
  • use MNIST dataset done
  • train using Chinese dataset done
  • train with reference to the paper on Bayesian probabilistic programs for image generation
  • source code released
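The variational RNN line above rests on the reparameterization trick used by variational models; a minimal sketch of the sampling step and the KL term (hypothetical names, not the project's code):

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var):
    """Draw z = mu + sigma * eps with eps ~ N(0, I), keeping the
    sampling step differentiable w.r.t. mu and log_var."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_to_standard_normal(mu, log_var):
    """KL(q(z|x) || N(0, I)) for a diagonal Gaussian posterior;
    added to the reconstruction loss during training."""
    return -0.5 * np.sum(1.0 + log_var - mu**2 - np.exp(log_var))

# Sample a 4-dimensional latent code from the standard prior.
z = reparameterize(np.zeros(4), np.zeros(4))
```

In a variational RNN, `mu` and `log_var` are produced at each step by the recurrent network, and `z` conditions the output distribution.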

RNN QA

  • RNN rescore
  • RNN rescore should be used as a feature for learning to rank
  • similar question --Bingdong
    • Huilan test -- Bingdong
      • train corpus of word vectors --Huilan
      • add traditional learning-to-rank method --Tianyi
  • memory machine (about three months) --Tianyi
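Using the RNN rescore as one signal among traditional learning-to-rank features could look like this linear-scoring sketch (feature names and weights are hypothetical; in practice the weights are learned by the ranking model):

```python
import numpy as np

def rank_candidates(features, weights):
    """Score each candidate answer as a weighted sum of its features
    and return candidate indices sorted best-first, plus the scores."""
    scores = features @ weights
    return np.argsort(-scores), scores

# Hypothetical features per candidate: [rnn_rescore, bm25, term_overlap]
feats = np.array([[0.9, 0.2, 0.1],
                  [0.4, 0.8, 0.6],
                  [0.1, 0.1, 0.9]])
w = np.array([0.5, 0.3, 0.2])   # learned weights in a real LTR setup
order, scores = rank_candidates(feats, w)
```

This keeps the RNN score as one column of the feature matrix, so it combines with the traditional retrieval features rather than replacing them.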

Couplet

  • attention training done --Tianyi Luo
  • focused attention training
  • revise training corpus