Tianyi Luo 2015-12-28


Plan to do next week

  • To finish making the lab's demo.
  • To try a new kernel function to model candidate similarity more efficiently.
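As one way to read the second item: a kernel function maps a pair of candidate representations to a similarity score. A minimal sketch, assuming an RBF (Gaussian) kernel over candidate embedding vectors; the function name, `gamma` value, and toy vectors are all illustrative, not the report's actual method:

```python
import numpy as np

def rbf_kernel(x, y, gamma=0.5):
    """RBF (Gaussian) kernel: similarity decays with squared distance."""
    diff = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.exp(-gamma * np.dot(diff, diff)))

# Toy candidate embeddings (hypothetical 3-d vectors).
a = [1.0, 0.0, 1.0]
b = [1.0, 0.1, 0.9]  # close to a
c = [0.0, 1.0, 0.0]  # far from a

print(rbf_kernel(a, b))  # near 1: similar candidates
print(rbf_kernel(a, c))  # near 0: dissimilar candidates
```

An identical pair always scores 1.0, and the score falls off smoothly with distance, which is what makes such kernels convenient for ranking candidates.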

Work done in this week

  • Finished part of the work on the lab's demo.
  • Finished the local attention-based Chinese couplet generation model.

开 业 大 吉:

同 行 增 劲 旅:

training corpus: 同 行 增 劲 旅 / 商 界 跃 新 军 /

test result:

Non-local attention-based:

[ 0.15731922  0.15440576  0.154654    0.13509884  0.13408586  0.13055836  0.13387793]
[ 0.15748511  0.15446058  0.15466693  0.13504058  0.1340386   0.13050689  0.13380134]
[ 0.15726063  0.15442531  0.15467082  0.13510644  0.13408728  0.13055961  0.1338899 ]
[ 0.15715003  0.15439823  0.15466341  0.13514642  0.13413033  0.13059665  0.13391495]
[ 0.15717115  0.15440425  0.15468264  0.13513321  0.13412073  0.13058177  0.13390623]

同 行 增 劲 旅 / 春 风 送 四 季 /

Local attention-based:

同 行 增 劲 旅 / 人 情 安 四 春 /
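The non-local attention weights above are spread almost uniformly over the seven source characters, which suggests the model is barely discriminating between positions. Local attention instead concentrates the distribution in a window around a predicted alignment point. A minimal sketch of that reweighting, assuming Luong-style local attention with a Gaussian window; all names and values here are illustrative, not the model's actual code:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def local_attention(scores, p, window=2):
    """Reweight softmaxed alignment scores by a Gaussian centered
    at the predicted alignment position p (Luong-style local-p)."""
    positions = np.arange(len(scores))
    sigma = window / 2.0
    gauss = np.exp(-((positions - p) ** 2) / (2 * sigma ** 2))
    weights = softmax(np.asarray(scores, dtype=float)) * gauss
    return weights / weights.sum()  # renormalize to a distribution

# Toy alignment scores for 5 source characters.
scores = [0.2, 0.1, 0.1, 0.0, 0.0]
print(local_attention(scores, p=1))  # mass concentrates near position 1
```

With near-uniform input scores like those logged above, the Gaussian window is what forces the weight mass onto a few nearby characters instead of the whole line.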

Plan to do next week

  • To finish making the lab's demo.
  • To finish the SMT-based implementation of poem generation.
  • To fix the problem in the attention-based program.
  • To implement the reading comprehension QA system.
  • To extract SMT features to enhance poem generation and Songci generation.

Interested papers

  • Cascading Bandits: Learning to Rank in the Cascade Model (ICML 2015) [pdf]
  • Neural Machine Translation by Jointly Learning to Align and Translate (ICLR 2015) [pdf]