Tianyi Luo 2015-12-21

Plan to do next week

  • Enhance the couplet generation functionality.
  • Conduct the experiments needed for a journal submission.
  • Try a new kernel function to model candidate similarity more efficiently (a toy kernel sketch follows this list).
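
The kernel for candidate similarity is not specified in this report; as a purely illustrative sketch (the RBF form, the feature vectors and the bandwidth gamma are all assumptions, not the method actually tried), one simple choice is a Gaussian/RBF kernel over candidate feature vectors:

  import numpy as np

  def rbf_kernel(x, y, gamma=0.5):
      """Gaussian (RBF) kernel exp(-gamma * ||x - y||^2) between the
      feature vectors of two candidate lines. gamma is a hypothetical
      bandwidth; the kernel used in the experiments may differ."""
      diff = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
      return float(np.exp(-gamma * np.dot(diff, diff)))

  # Toy usage with two 4-dimensional candidate representations.
  cand_a = [0.2, 0.1, 0.7, 0.3]
  cand_b = [0.3, 0.1, 0.6, 0.2]
  print(rbf_kernel(cand_a, cand_b))   # close to 1.0, i.e. very similar candidates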

Work done in this week

  • Finished part of the work on the lab's demo.
  • Finished part of the work on the SMT method for poem and couplet generation.
  • Finished the work on local attention-based Chinese couplet generation.

开 业 大 吉:

Non-local attention-based:

启 步 肇 昌 隆 / 花 荣 名 畅 三 /

Local attention-based:

妙 墨 系 春 秋 / 名 来 昌 畅 花 /


同 行 增 劲 旅:

Training corpus: 同 行 增 劲 旅 / 商 界 跃 新 军 / ; 上 沃 群 芳 艳 / 国 宁 百 艺 生 /

Test result:

Non-local attention-based:

attention of 商 [ 0.00851025, 0.05046642, 0.20085089, 0.52851975, 0.12252463, 0.06678692, 0.02234111]

attention of 军 [ 0.00760446, 0.04773411, 0.20061702, 0.54270059, 0.11813291, 0.06291854, 0.02029242]

attention of 胜 [ 0.00872168, 0.05112754, 0.20125151, 0.52559483, 0.12306171, 0.06754488, 0.02269783]

attention of 旧 [ 0.00775181, 0.04831868, 0.20060426, 0.54085833, 0.11861438, 0.06334573, 0.02050681]

attention of 来 [ 0.0080967, 0.04938305, 0.20097148, 0.53481424, 0.12035656, 0.06504875, 0.02132925]

同 行 增 劲 旅 / 商 军 胜 旧 来 /
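
The five attention vectors above are nearly identical: every generated character puts its largest weight (about 0.53) on the same source position, and the per-position spread across steps is below 0.02. The small NumPy check below, using the numbers copied verbatim from the log, makes this explicit; reading this near-constant alignment as the cause of the degenerate non-local output is my interpretation, not a claim made in the report.

  import numpy as np

  # Attention weights of the non-local (global) model, one row per
  # generated character (商, 军, 胜, 旧, 来), copied from the log above.
  att = np.array([
      [0.00851025, 0.05046642, 0.20085089, 0.52851975, 0.12252463, 0.06678692, 0.02234111],
      [0.00760446, 0.04773411, 0.20061702, 0.54270059, 0.11813291, 0.06291854, 0.02029242],
      [0.00872168, 0.05112754, 0.20125151, 0.52559483, 0.12306171, 0.06754488, 0.02269783],
      [0.00775181, 0.04831868, 0.20060426, 0.54085833, 0.11861438, 0.06334573, 0.02050681],
      [0.0080967,  0.04938305, 0.20097148, 0.53481424, 0.12035656, 0.06504875, 0.02132925],
  ])

  print(att.argmax(axis=1))                 # [3 3 3 3 3]: every step attends to the same position
  print(att.max(axis=0) - att.min(axis=0))  # per-position spread is tiny (largest ~0.017)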

Local attention-based:

同 行 增 劲 旅 / 商 宁 百 艺 生 /
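
Couplets align character by character: the i-th character of the second line answers the i-th character of the first line. I assume "local attention-based" above means a windowed attention centred on the current output position (in the spirit of Luong et al.'s local attention); the sketch below contrasts that with the non-local (global) softmax and is illustrative only, not the model used in these experiments.

  import numpy as np

  def softmax(x):
      e = np.exp(x - x.max())
      return e / e.sum()

  def global_attention(scores):
      """Non-local attention: a softmax over every source position."""
      return softmax(scores)

  def local_attention(scores, t, window=1):
      """Local attention: only source positions within `window` of the
      current output step t get weight; the rest are masked out.
      Monotonic alignment is assumed, which matches the
      character-to-character structure of couplets; the actual model
      may predict the window centre instead."""
      mask = np.full_like(scores, -np.inf)
      lo, hi = max(0, t - window), min(len(scores), t + window + 1)
      mask[lo:hi] = 0.0
      return softmax(scores + mask)

  # Toy alignment scores over a 5-character first line, at output step t = 2.
  scores = np.array([0.1, 0.4, 2.0, 0.3, 0.2])
  print(global_attention(scores))      # weight spread over the whole line
  print(local_attention(scores, t=2))  # weight confined to positions 1..3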

Plan to do next week

  • Finish the work on the lab's demo.
  • Finish the work on the SMT method for poem and couplet generation (a toy sketch of the SMT decoding follows this list).
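
One common SMT-style recipe for couplet generation treats the second line as a monotone, character-by-character translation of the first line, scored by a translation model and a language model. The translation table, bigram LM and beam width below are toy values invented for illustration; this is a sketch of the general idea, not the system referred to above.

  import math

  # Toy character translation table p(t_char | s_char) and bigram LM;
  # all probabilities are invented for illustration only.
  trans = {
      "同": {"共": 0.6, "合": 0.4},
      "行": {"业": 0.7, "进": 0.3},
      "增": {"添": 0.5, "加": 0.5},
  }
  bigram = {("<s>", "共"): 0.5, ("<s>", "合"): 0.5,
            ("共", "业"): 0.6, ("共", "进"): 0.4,
            ("合", "业"): 0.3, ("合", "进"): 0.7,
            ("业", "添"): 0.5, ("业", "加"): 0.5,
            ("进", "添"): 0.6, ("进", "加"): 0.4}

  def decode(first_line, beam_size=2):
      """Monotone beam search: position i of the output translates
      character i of the input, scored by log p(t|s) + log p(t_i|t_{i-1})."""
      beams = [(0.0, ["<s>"])]
      for s_char in first_line:
          new_beams = []
          for score, hyp in beams:
              for t_char, p_t in trans.get(s_char, {}).items():
                  p_lm = bigram.get((hyp[-1], t_char), 1e-6)
                  new_beams.append((score + math.log(p_t) + math.log(p_lm),
                                    hyp + [t_char]))
          beams = sorted(new_beams, reverse=True)[:beam_size]
      return "".join(beams[0][1][1:])

  print(decode("同行增"))   # -> "共业添" with these toy numbers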

Papers of interest:

  • Cascading Bandits: Learning to Rank in the Cascade Model (ICML 2015) [pdf: http://zheng-wen.com/Cascading_Bandit_Paper.pdf] (a minimal sketch of the algorithm follows).
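
In the cascade model the user scans the recommended list from the top and clicks the first attractive item. As I understand the paper, its CascadeUCB1 algorithm keeps a UCB estimate of each item's attraction probability, recommends the K items with the highest UCBs, and updates every item examined up to and including the clicked one. The simulation below is a minimal sketch of that idea; the item set, attraction probabilities and horizon are made up.

  import math
  import random

  def cascade_ucb1(attract_probs, K=2, T=2000, seed=0):
      """Toy CascadeUCB1 simulation under the cascade click model.
      attract_probs holds the true (hidden) attraction probability of each
      item; the learner only sees which recommended item, if any, is clicked."""
      rng = random.Random(seed)
      L = len(attract_probs)
      pulls = [1] * L                                        # pretend one initial pull per item
      wins = [1.0 if rng.random() < p else 0.0 for p in attract_probs]

      for t in range(1, T + 1):
          ucb = [wins[i] / pulls[i] + math.sqrt(1.5 * math.log(t) / pulls[i])
                 for i in range(L)]
          ranked = sorted(range(L), key=lambda i: ucb[i], reverse=True)[:K]

          # Cascade feedback: the user clicks the first attractive item, if any.
          click = next((pos for pos, i in enumerate(ranked)
                        if rng.random() < attract_probs[i]), None)
          examined = ranked if click is None else ranked[:click + 1]
          for pos, i in enumerate(examined):
              pulls[i] += 1
              wins[i] += 1.0 if pos == click else 0.0

      # Empirical top-K after T rounds; it should match the truly most
      # attractive items (0 and 2 for the toy probabilities below).
      return sorted(range(L), key=lambda i: wins[i] / pulls[i], reverse=True)[:K]

  print(cascade_ucb1([0.7, 0.1, 0.5, 0.2, 0.05]))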