Chao Xing 2016-02-01
From cslt Wiki
Revision as of 02:20, 1 February 2016 (Mon)
LAST WEEK
Done:
- Performed multi-task transfer learning during the transfer-step pre-training.
- Found some strategies for the hyper-parameters:
- The number of steps determines the information content of the training parameters: larger steps require more iterations and a larger learning rate.
- The learning rate should not decay in this setup.
- A larger learning rate may yield better performance.
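The interaction between a constant (non-decaying) learning rate and the step budget can be illustrated with a minimal sketch. This is not the report's actual training code; it is plain SGD on a toy quadratic objective, with all names illustrative, showing why a larger constant rate can pair with a fixed iteration budget:

```python
import numpy as np

def sgd_train(w, grad_fn, lr, steps):
    """Plain SGD with a constant learning rate (no decay),
    matching the observation that decay is not used here."""
    for _ in range(steps):
        w = w - lr * grad_fn(w)
    return w

# Toy objective: minimize ||w - 1||^2; its gradient is 2*(w - 1).
grad = lambda w: 2.0 * (w - 1.0)

# Under the same step budget, a larger constant learning rate
# makes much more progress than a small one.
w_small = sgd_train(np.zeros(3), grad, lr=1e-3, steps=100)
w_large = sgd_train(np.zeros(3), grad, lr=1e-1, steps=100)
```

On this toy problem `w_large` ends far closer to the optimum than `w_small`, which is consistent with the note that larger steps call for a larger learning rate or more iterations.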
THIS WEEK
Want to do:
- In my experiments:
- The step counts must be consistent across source-image pre-training, target-image pre-training, transfer pre-training, and fine-tuning.
- In transfer pre-training, a larger learning rate allows a smaller iteration count; based on the last two days' experiments, I chose 1e-2 as the transfer learning rate.
- In the fine-tuning step, the learning rate should be as small as possible but not too small; in this experiment I chose 1e-3.
- So this week I will focus on this experiment.
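The two-phase schedule described above (transfer pre-training at 1e-2, then fine-tuning at 1e-3) can be sketched as a simple phase table driving a constant-rate SGD loop. This is a hypothetical illustration, not the actual experiment code; the phase names, step counts, and the toy objective are assumptions, while the two learning rates come from the plan:

```python
import numpy as np

# Hypothetical two-phase schedule: a larger constant rate for
# transfer pre-training, then a smaller one for fine-tuning.
# Only the learning rates (1e-2, 1e-3) come from the report;
# the step counts are placeholders.
PHASES = [
    ("transfer_pretrain", 1e-2, 2000),  # larger lr, fewer iterations
    ("fine_tune",         1e-3, 5000),  # small, but not too small
]

def run_schedule(w, grad_fn, phases):
    """Run plain SGD through each (name, lr, steps) phase in order."""
    for name, lr, steps in phases:
        for _ in range(steps):
            w = w - lr * grad_fn(w)
    return w

# Toy quadratic objective: minimize ||w - 1||^2.
grad = lambda w: 2.0 * (w - 1.0)
w_final = run_schedule(np.zeros(4), grad, PHASES)
```

The point of the sketch is the structure: the coarse phase does most of the movement at the large rate, and the fine-tuning phase makes small corrections without undoing it.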