RNN TEST
==140901==
=== wsj_data ===

1. parameter
 rand_seed=1
 nwords=10000 # This is how many words we're putting in the vocab of the RNNLM.
 hidden=30
 class=200 # Num-classes... should be somewhat larger than sqrt of nwords.
 direct=1000 # Number of weights that are used for "direct" connections, in millions.
 rnnlm_ver=rnnlm-0.3e # version of RNNLM to use
 threads=1 # for RNNLM-HS
 bptt=2 # length of BPTT unfolding in RNNLM
 bptt_block=20 # length of BPTT unfolding in RNNLM
2. time: 3 hours
3. data_size: 200M
4. data: np_data
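
For reference, a rough sketch of how the variables above map onto a training command for the rnnlm-0.3e binary (Mikolov's RNNLM toolkit). train.txt, valid.txt and the model name are hypothetical placeholders; only the flag values come from the parameter list, and threads is omitted since, per its comment, it applies only to RNNLM-HS:

 #!/usr/bin/env bash
 # Training sketch, assuming Mikolov's rnnlm-0.3e binary; data and model
 # paths are placeholders, not taken from this page.
 rand_seed=1
 hidden=30
 class=200     # somewhat larger than sqrt(nwords=10000) = 100
 direct=1000   # hash size for direct (maxent) connections, in millions
 bptt=2
 bptt_block=20
 
 ./rnnlm-0.3e/rnnlm \
   -train train.txt \
   -valid valid.txt \
   -rnnlm wsj.rnnlm \
   -hidden "$hidden" \
   -class "$class" \
   -direct "$direct" \
   -bptt "$bptt" \
   -bptt-block "$bptt_block" \
   -rand-seed "$rand_seed"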
+ | |||
+ | 1.parameter | ||
+ | rand_seed=1 | ||
+ | nwords=10000 # This is how many words we're putting in the vocab of the RNNLM. | ||
+ | hidden=320 | ||
+ | class=300 # Num-classes... should be somewhat larger than sqrt of nwords. | ||
+ | direct=2000 # Number of weights that are used for "direct" connections, in millions. | ||
+ | rnnlm_ver=rnnlm-0.3e # version of RNNLM to use | ||
+ | threads=1 # for RNNLM-HS | ||
+ | bptt=2 # length of BPTT unfolding in RNNLM | ||
+ | bptt_block=20 # length of BPTT unfolding in RNNLM | ||
+ | |||

== daily work ==
[[ 140902 ]]