Tianyi Luo 2015-08-17
 
Latest revision as of 01:08, 17 August 2015 (Mon)

Plan for last week

  • To submit the final version of the EMNLP 2015 paper.
  • To release the RNN sentence vectors trained on Baidu's similar-question corpus.
  • To use the vectors from UV decomposition as dark knowledge for the DNN.
  • To implement other methods for obtaining latent user and item vectors as dark knowledge for the NN.

Work done this week

  • Prepared and gave a report to Huilian on 2015-08-13.
  • Submitted the final version of the EMNLP 2015 paper on 2015-08-15.
  • Conducted UV decomposition (100 iterations, d = 1000; 170 thousand users, 30 thousand items, and 20 million ratings) on giant-1; it ran for 94 hours without finishing. I think I should consider a sparse decomposition approach (see the sketch below).
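
A possible direction for the sparse solution: with 170 thousand users and 30 thousand items the full rating matrix has about 5.1 billion cells, but only 20 million (roughly 0.4%) are observed, so a factorization that touches only the observed ratings should be far cheaper than a dense UV decomposition. Below is a minimal sketch of SGD-based matrix factorization over observed ratings only; the function name, the (user, item, rating) triple format, and the learning-rate and regularization values are illustrative assumptions, not details from the report.

 import numpy as np

 def sparse_uv_sgd(triples, n_users, n_items, d=1000, iters=100,
                   lr=0.005, reg=0.02, seed=0):
     # SGD matrix factorization that visits only the observed
     # (user, item, rating) triples, never the full n_users x n_items matrix.
     rng = np.random.default_rng(seed)
     U = rng.normal(scale=0.01, size=(n_users, d))   # latent user vectors
     V = rng.normal(scale=0.01, size=(n_items, d))   # latent item vectors
     for _ in range(iters):
         for u, i, r in triples:
             err = r - U[u] @ V[i]                   # error on this rating only
             u_row = U[u].copy()                     # keep old U[u] for V's update
             U[u] += lr * (err * V[i] - reg * U[u])
             V[i] += lr * (err * u_row - reg * V[i])
     return U, V

 # Hypothetical usage with the sizes from the experiment above:
 # U, V = sparse_uv_sgd(ratings, n_users=170_000, n_items=30_000, d=1000, iters=100)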

Plan to do next week

  • To release the RNN sentence vectors trained on Baidu's similar-question corpus.
  • To use the vectors from UV decomposition as dark knowledge for the DNN (see the sketch after this list).
  • To implement other methods for obtaining latent user and item vectors as dark knowledge for the NN.
  • To think about how to modify "Learning from LDA using Deep Neural Networks".
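
For the dark-knowledge items, one possible reading (by analogy with "Learning from LDA using Deep Neural Networks") is to train a neural network to regress the latent item vectors produced by the UV decomposition, so the network absorbs the factorization's knowledge as soft targets. The sketch below assumes each item also has a content feature vector (e.g., bag-of-words); that feature representation, the network size, and the training setup are illustrative assumptions rather than details from this report.

 import numpy as np

 def train_dark_knowledge_mlp(X, V_targets, hidden=512, lr=0.01, epochs=50, seed=0):
     # One-hidden-layer MLP trained with mean squared error to predict the
     # UV latent item vectors (the "dark knowledge") from item features X.
     rng = np.random.default_rng(seed)
     n, f = X.shape
     d = V_targets.shape[1]
     W1 = rng.normal(scale=0.01, size=(f, hidden)); b1 = np.zeros(hidden)
     W2 = rng.normal(scale=0.01, size=(hidden, d)); b2 = np.zeros(d)
     for _ in range(epochs):
         H = np.maximum(0.0, X @ W1 + b1)        # ReLU hidden layer
         P = H @ W2 + b2                         # predicted latent vectors
         G = 2.0 * (P - V_targets) / n           # gradient of the MSE loss w.r.t. P
         dW2 = H.T @ G; db2 = G.sum(axis=0)
         dH = G @ W2.T; dH[H <= 0] = 0.0         # backprop through ReLU
         dW1 = X.T @ dH; db1 = dH.sum(axis=0)
         W1 -= lr * dW1; b1 -= lr * db1
         W2 -= lr * dW2; b2 -= lr * db2
     return W1, b1, W2, b2

 # Hypothetical usage: X_items holds assumed item content features,
 # V comes from the UV decomposition sketched above.
 # W1, b1, W2, b2 = train_dark_knowledge_mlp(X_items, V, hidden=512, epochs=50)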